
The question to ask is whether building a cache of normalized data will be more efficient than addressing the complexities of non-normalized data, e.g. duplication, renaming, and integrity problems.
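To make the tradeoff concrete, here is a minimal sketch using Python's sqlite3 and hypothetical table names (`orders_flat`, `customers`, `orders` are illustrative, not from the thread): in the denormalized table a rename must touch every duplicated row, while in the normalized schema it is a single UPDATE.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: the customer's name is duplicated on every order, so
# renaming "Acme" means updating many rows, and missing one of them
# leaves exactly the integrity problem described above.
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, customer_name TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?)",
                [(1, "Acme"), (2, "Acme"), (3, "Acme")])

# Normalized: the name lives in exactly one row; a rename is one UPDATE
# and every order sees it through the join.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER, "
            "customer_id INTEGER REFERENCES customers(id))")
cur.execute("INSERT INTO customers VALUES (1, 'Acme')")
cur.executemany("INSERT INTO orders VALUES (?, 1)", [(1,), (2,), (3,)])

cur.execute("UPDATE customers SET name = 'Acme Corp' WHERE id = 1")
names = cur.execute(
    "SELECT DISTINCT c.name FROM orders o "
    "JOIN customers c ON o.customer_id = c.id").fetchall()
print(names)  # every order now reports the single renamed customer
```

The cost of the normalized version is the join on the read path, which is what the proposed cache would be paying for.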


I've seen the downside of this: having to keep an on-call staff of DBAs hand-grooming fragile databases and running deduplication processes that sometimes fall so far behind that it becomes mathematically impossible to catch up with the new data coming in. (Never mind what happens to your maintenance schedule when entire teams are focused on "fixing this problem" for weeks at a time.)

If it were up to me (and it almost never is, because I'm not a DB expert), I would ALWAYS normalize as much as possible when designing a database. I really can't even wrap my brain around why large (and old) databases ever get into this state. But I've seen it at two different employers, and it's very painful (and costly) to deal with. My take was that these databases were probably originally set up by people who had no idea what they were doing, locking the company into a shitty implementation that still crippled it 15 years later. But that's just me.


Totally. All I mean to say is that "everything should always be normalized" doesn't necessarily make sense. You need to consider your situation.


Premature optimization is the root of all evil. It's unlikely that normalized data will cause a performance problem you can't solve with caching.
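As a sketch of what "solve it via caching" might look like (the function and table names are hypothetical, not from the thread), you can memoize the join over the normalized schema so repeated reads skip the query entirely:

```python
import functools
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'Acme');
    INSERT INTO orders VALUES (1, 1), (2, 1);
""")

@functools.lru_cache(maxsize=1024)
def orders_for_customer(customer_id):
    # The join runs at most once per customer id; later lookups are
    # served from the cache, so the normalized schema costs little
    # on the hot read path.
    rows = conn.execute(
        "SELECT o.order_id, c.name FROM orders o "
        "JOIN customers c ON o.customer_id = c.id "
        "WHERE c.id = ?", (customer_id,)).fetchall()
    return tuple(rows)

first = orders_for_customer(1)   # executes the join
second = orders_for_customer(1)  # cache hit, no second query
```

Of course a cache like this needs invalidation when the underlying rows change, which is the usual tradeoff, but it is a far smaller one than denormalizing the schema itself.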


I hate how this quote has been perverted. "Premature optimization" was meant to apply to people making their programs an unreadable/unmaintainable mess to save a couple of CPU instructions. Figuring out how to organize your data is not premature optimization; it's literally one of the first things you have to tackle when starting a project, because trying to change it down the line is going to be miserable, especially if you can't afford any downtime.


Yes, and I think the parent was pointing out that denormalization is often a premature optimization that makes the data schema an unreadable/unmaintainable mess to save a few CPU instructions. The normalized schema is answering the question of "how to organize your data". Denormalizing that schema is an attempt at optimization.




