So lately I have noticed that what dramatically reduces my productivity is the time I spend trying to decide whether a certain action belongs on the backend or the frontend, how to model my databases, and so on. Please keep in mind that I am trying to balance being productive with keeping quality, and it seems to me that I am falling into the "premature optimization is the root of all evil" trap, which in my case reduces my productivity dramatically.
I am trying to figure out a framework, let's say, that will make these decisions easier.
For example, right now I am building a simple feature, and I'll be specific because that makes it easier to discuss. The feature allows the user to add a skill to the MongoDB collection of skills. The problem comes from the requirement that skills must be unique: only one skill named "JavaScript" may exist in the database, so a check must be performed. The question is: where and how do I perform that check?
There are of course many possibilities, but the two main ones I have in mind are:
1.
- The user fills in the form with the skill name
- The data is sent to the backend
- The backend attempts to create the skill
- MongoDB returns an error because the skill already exists
- The error is sent back to the frontend and displayed to the user
This approach works just fine; the drawback, in my opinion, is that it takes time and does not let the user see the mistake in "real time".
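The first approach can be sketched roughly as follows. This is a minimal sketch, assuming a `skills` collection with a unique index on `name` (created once, e.g. `db.skills.createIndex({ name: 1 }, { unique: true })`); the route and collection names are hypothetical. MongoDB signals a duplicate key with error code 11000, which we translate into a friendly 409 response instead of a generic 500:

```javascript
// Map a MongoDB write error to an HTTP-style response.
// Duplicate-key violations carry error code 11000.
function duplicateToResponse(err) {
  if (err && err.code === 11000) {
    return { status: 409, body: { error: "Skill already exists" } };
  }
  return { status: 500, body: { error: "Internal server error" } };
}

// Hypothetical Express handler using the helper:
// app.post("/skills", async (req, res) => {
//   try {
//     await skills.insertOne({ name: req.body.name.trim() });
//     res.status(201).end();
//   } catch (err) {
//     const { status, body } = duplicateToResponse(err);
//     res.status(status).json(body);
//   }
// });
```

One nice property of this version: the unique index makes the database itself the single source of truth, so the invariant holds even if two users submit the same skill at the same moment.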
2.
- When the form opens, a GET request is sent to the server to fetch all currently existing skills
- The user fills in the form
- As the user types, the check is performed in real time on every keystroke, so the user gets instant feedback if the skill already exists
This would also work just fine, with the drawback that if the collection of skills is huge, a search must be performed over it on every keystroke; processing time will then depend on the user's device and the size of the collection, which could lead to a bad user experience.
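The per-keystroke cost in the second approach can be made effectively constant by indexing the downloaded names in a `Set` instead of scanning an array each time. A sketch, assuming the server returns a plain array of skill names (the `/api/skills` endpoint name is hypothetical):

```javascript
// Normalize so "JavaScript" and " javascript " count as the same skill.
const normalize = (name) => name.trim().toLowerCase();

// Build the lookup index once, when the form opens.
function buildSkillIndex(names) {
  return new Set(names.map(normalize));
}

// O(1) membership check, cheap enough to run on every keystroke.
function skillExists(index, candidate) {
  return index.has(normalize(candidate));
}

// Usage sketch in the form:
// const names = await fetch("/api/skills").then((r) => r.json());
// const index = buildSkillIndex(names);
// input.addEventListener("input", (e) => {
//   warning.hidden = !skillExists(index, e.target.value);
// });
```

With this, the real cost moves to the initial download of all names, not the per-key check; the device-speed concern mostly disappears, but the payload-size concern remains.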
So does this dilemma boil down to the expected amount of data to check against: if the collection is never expected to grow large enough to cause a problem, go with 2; if not, go with 1?
Do I start with the second approach and later move to the first if performance suffers at some point?
Do I go with 2 and use some more suitable data structure that allows faster lookups, to mitigate the potential performance issue?
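There is also a middle ground between the two (purely a suggestion, not something from the options above): keep the check on the server, but debounce the per-keystroke requests so only the "settled" input triggers a query. A sketch; the `/api/skills/exists` endpoint is hypothetical:

```javascript
// Generic debounce: fn runs only after the caller has been quiet
// for delayMs milliseconds; earlier pending calls are cancelled.
function debounce(fn, delayMs) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Usage sketch: ask the server only once typing pauses.
// const checkSkill = debounce(async (name) => {
//   const res = await fetch(`/api/skills/exists?name=${encodeURIComponent(name)}`);
//   const { exists } = await res.json();
//   warning.hidden = !exists;
// }, 300);
// input.addEventListener("input", (e) => checkSkill(e.target.value));
```

This keeps near-realtime feedback without downloading the whole collection, at the cost of one small request per pause in typing.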
Please keep in mind, this is just one example; the same decision can be extrapolated to many other similar scenarios, and that's what's tripping me up.