• TechNom (nobody)@programming.dev · 1 month ago

    While I understand your point, there’s a mistake I see far too often in the industry: using relational DBs where the data model is better suited to other sorts of DBs. For example, JSON documents are better stored in document DBs like Mongo. I realize that your use case doesn’t involve querying JSON - in which case it can simply be stored as text. Similar mistakes are made with time-series data, key-value data, and directory-type data.

    I’m not particularly angry at such (ab)uses of RDBs, but you’ll probably get better results with NoSQL DBs. Even in cases that involve multiple data models, you could combine multiple DB systems to achieve the best results. Or better yet, there are adapters for RDBMSs that make them behave like different DB types at the same time - for example, FerretDB makes Postgres behave like MongoDB, PostGIS turns it into a geographic DB, etc.

    • CeeBee_Eh@lemmy.world · 1 month ago (edited)

      Using Relational DBs where the data model is better suited to other sorts of DBs.

      This is true when most or all of your data fits that model. But when you only have a few bits of such data here and there, it’s still better to keep it in the RDB.

      For example, in a surveillance system (think Blue Iris, Zone Minder, or Shinobi) you want to use an RDB, but you’re going to have to store JSON data from alerts as well as other objects within the frame when alerts come in. Something like this:

      {
        "detection": {
          "object": "person",
          "time": "2024-07-29 11:12:50.123",
          "camera": "LemmyCam",
          "coords": {
            "x": 23,
            "y": 100,
            "w": 50,
            "h": 75
          }
        },
        "other_objects": [
          <repeat above format multiple times>
        ]
      }
      

      While it’s possible to store this flattened into table columns, the question is: why would you want to? Postgres’ JSONB datatype will store the data as efficiently as anything else, while also making it queryable. This gives you the advantage of not having to rework the table structure if the detection software later expands the data points it emits.
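      A quick sketch of the "queryable without dedicated columns" idea. Postgres JSONB has its own operators (`->`, `->>`, `@>`); since that needs a running server, this uses SQLite's JSON1 functions (available in Python's bundled sqlite3 on recent versions) to show the same pattern. The table and column names (`alerts`, `payload`) are invented for illustration.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alerts (id INTEGER PRIMARY KEY, payload TEXT)")

# Store the alert document as-is, no per-field columns.
alert = {
    "detection": {
        "object": "person",
        "time": "2024-07-29 11:12:50.123",
        "camera": "LemmyCam",
        "coords": {"x": 23, "y": 100, "w": 50, "h": 75},
    }
}
conn.execute("INSERT INTO alerts (payload) VALUES (?)", (json.dumps(alert),))

# Query straight into the nested document.
row = conn.execute(
    "SELECT json_extract(payload, '$.detection.object'),"
    "       json_extract(payload, '$.detection.coords.x')"
    "  FROM alerts"
    " WHERE json_extract(payload, '$.detection.camera') = 'LemmyCam'"
).fetchone()
print(row)  # -> ('person', 23)
```

      If the detector later adds, say, a confidence score, the inserts and existing queries don't change at all - only queries that want the new field need to mention it.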

      It definitely isn’t a solution for most things, but it’s 100% valid to use.

      There’s also the consideration that you may just want to store JSON data exactly as it’s generated by whatever source, without translating it in any way - just store the actual data in its “raw” form. JSONB allows you to do that too.

      Edit: just to add to the example JSON, the other advantage is that it allows a variable number of objects within the array without having to accommodate them in the table. I can’t count how many times I’ve seen tables with “extra1, extra2, extra3, extra4, …” columns because they knew there would be extra data at some point, but had no idea what it would be.
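      The variable-length point can be sketched the same way: the array is unnested at query time, so no extra1/extra2/… columns are needed. This again uses SQLite's `json_each()` as a stand-in (Postgres has `jsonb_array_elements()`), with invented table/column names.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alerts (id INTEGER PRIMARY KEY, payload TEXT)")

# The detector may emit any number of secondary objects per alert.
payload = {
    "detection": {"object": "person", "camera": "LemmyCam"},
    "other_objects": [
        {"object": "car"},
        {"object": "dog"},
        {"object": "bicycle"},
    ],
}
conn.execute("INSERT INTO alerts (payload) VALUES (?)", (json.dumps(payload),))

# json_each() yields one row per array element, however many there are.
others = [
    r[0]
    for r in conn.execute(
        "SELECT json_extract(value, '$.object')"
        "  FROM alerts, json_each(payload, '$.other_objects')"
    )
]
print(others)  # -> ['car', 'dog', 'bicycle']
```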