Database Vendor Support: PostgreSQL #369
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
From @puco in directus/directus#1020 (comment)
So to give an update here, this is indeed the (second) most requested feature (behind GraphQL support, which is almost ready for beta). The reality is: we're a small open-source team and have to triage tickets based on priority. As of now that means that we are focusing on:
Essentially, we're trying to limit the breadth of what we need to test/support until things are in a good place. Adding Postgres will be awesome, and we're moving in that direction by continuing to identify, document, and abstract any MySQL-specific queries/features. That way we'll be able to hit the ground running when the time is right to add this support. It's looking like we'll be starting on this in Q3 2019, unless other priority issues arise (pushing it back) or a company "sponsors" its development (pulling it sooner). In the meantime, if you want to help out, try contributing to our public doc where we're consolidating info on what needs to happen to fully abstract our database layer, become SQL vendor agnostic, and support Postgres: https://docs.google.com/document/d/17ZUuUxb2qMotYduTzLuntvdj45ogfvU-d_RsDMMEhOA/edit?usp=sharing
To achieve better clarity/visibility, we are now tracking feature requests within the Feature Request project board. This issue being closed does not mean it's not being considered.
What's the current state of PostgreSQL support for Directus? The release board for v2.2.0 says it's DONE, but the code only seems to be merged into a separate branch. Is this actually done, and if so, how can I use it? I can't find any documentation (https://docs.directus.io/guides/cli.html#help, for example, still says otherwise).
I thought our PostgreSQL PR was merged in. @directus/api-team?
We're collecting community feedback on the Postgres implementation before releasing it to the public.
Until we have automated testing in place, we have to check the stability of the PostgreSQL support manually from time to time (mostly when we ship any DB-related fixes). @urvashithakar @BJGajjar
Are there any plans for when PostgreSQL support will be available to the general public? Any estimated time frame?
@matleh like @BJGajjar mentioned above, it's going to be very hard to know if it's stable enough to be "production ready", as it's going to rely on manually testing some use cases. We might merge it into master but not "officially support"* it for a little while. * Officially supporting something basically means pointing people to it in the docs. We'll add a page in there saying that Postgres support is experimental.
I've just seen your comment, @hemratna.
As far as I'm aware, my test suite will be a simple one. Ultimately, as long as the API works as expected, tests should pass, and I shouldn't have to write any separate tests for different databases.
You're right, those tests should work regardless of the database / tech stack used in the API.
Maybe the tests could add a test table to the database with known data before running, and remove it after? That way it works on any database, but still has the info it needs to perform the tests?
It definitely has to do that, as we should assume an empty installation of Directus for these tests (ping @shealavington).
I could write the test setup and tear-downs specifically for MySQL, using a Node MySQL client to handle the seeding and cleanup; however, that would then require anyone running the tests to be using MySQL specifically.
Alternatively, I was going to rely entirely on the API to do it all. The downside is that if the before or after step fails, the test fails because it was incorrectly configured, but this approach lets us run the suite against any instance (a rough sketch of this API-driven option follows after this comment).
But yeah, all in all, it has nothing to do with whether you're using MySQL, MariaDB, or Postgres; we're simply checking whether the API is performing as we expect it to.
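To make that trade-off concrete, here is a minimal sketch of the API-driven option, assuming a Jest + TypeScript test runner on Node 18+ (for the global fetch). The base URL, the `/items/...` endpoint shape, and the `test_articles` collection are placeholders, not confirmed Directus routes.

```ts
import { beforeAll, afterAll, describe, test, expect } from '@jest/globals';

// Placeholder base URL; point it at whichever Directus instance is under test.
const API = process.env.API_URL ?? 'http://localhost:8080';
const createdIds: number[] = [];

beforeAll(async () => {
  // Seed known data through the HTTP API instead of vendor-specific SQL,
  // so the same setup works against MySQL, MariaDB, or Postgres.
  for (const title of ['alpha', 'beta', 'gamma']) {
    const res = await fetch(`${API}/items/test_articles`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ title }),
    });
    expect(res.ok).toBe(true); // a failed seed fails the run, as noted above
    const body = (await res.json()) as { data: { id: number } };
    createdIds.push(body.data.id);
  }
});

afterAll(async () => {
  // Tear the seed data back down so the instance is left as we found it.
  for (const id of createdIds) {
    await fetch(`${API}/items/test_articles/${id}`, { method: 'DELETE' });
  }
});

describe('reading items', () => {
  test('returns the seeded items regardless of database vendor', async () => {
    const res = await fetch(`${API}/items/test_articles`);
    expect(res.status).toBe(200);
    const body = (await res.json()) as { data: unknown[] };
    expect(body.data.length).toBeGreaterThanOrEqual(3);
  });
});
```

Because everything goes through the API, the same suite can be pointed at a MySQL-backed or Postgres-backed instance without any vendor-specific code.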
That makes sense. We'll definitely want to make sure that this is database-vendor agnostic. One more option to add into the mix: if we want to ensure the setup/teardown works every time (doesn't rely on the API), then we could also have specific commands for each vendor. Since this "dummy" data shouldn't really change often, that shouldn't be a crazy amount of work.

Question 1: Can we keep the dummy data as one pure SQL file? Or does it need to be individual commands to work properly?

Question 2: Would we have setup/teardown before/after each individual test, or before/after all tests?
I really wanted to K.I.S.S., so adding vendor-specific commands creates more complexity. Ideally we could use the API for the entire thing; that way, if you make a change, run the tests, and they fail anywhere, there's likely an issue with the code, even if the failure was in the setup (before) or teardown (after).

Q1: I believe we could have one SQL file that we use every time; we could run it before all tests run and drop it after all tests have run. Though we'd need to ensure that whenever someone adds a new test, we have any seed data that test needs, and it would need to work for each vendor, requiring someone to test the tests for each vendor.

Q2: The original thought was before/after each suite. For example, if I want to test that the API can delete items, I'll make a suite of tests to delete 1 item and 4 items, but before they run I'd need to insert 5 items for the tests to use (a rough sketch of this follows below). Maybe we just write all tests in a format that allows them to take care of themselves?
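For the "delete items" example above, a per-suite version might look like this sketch, under the same assumptions as before (Jest, Node 18+ fetch, placeholder `test_articles` collection and routes): the suite seeds the 5 items it needs and nothing else.

```ts
import { beforeAll, describe, test, expect } from '@jest/globals';

const API = process.env.API_URL ?? 'http://localhost:8080';

// Helper that creates one item via the API and returns its id.
async function createItem(title: string): Promise<number> {
  const res = await fetch(`${API}/items/test_articles`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ title }),
  });
  const body = (await res.json()) as { data: { id: number } };
  return body.data.id;
}

describe('deleting items', () => {
  let ids: number[] = [];

  beforeAll(async () => {
    // Insert the 5 items this suite needs before any of its tests run.
    ids = await Promise.all(['a', 'b', 'c', 'd', 'e'].map((title) => createItem(title)));
  });

  test('deletes a single item', async () => {
    const res = await fetch(`${API}/items/test_articles/${ids[0]}`, { method: 'DELETE' });
    expect(res.ok).toBe(true);
  });

  test('deletes the remaining four items', async () => {
    for (const id of ids.slice(1)) {
      const res = await fetch(`${API}/items/test_articles/${id}`, { method: 'DELETE' });
      expect(res.ok).toBe(true);
    }
  });
});
```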
Originally I was thinking we could use the Demo SQL file for tests, but we should be able to change that whenever we want without worrying about breaking the tests.

Ooooh, I like the CRUD tests testing themselves. If we use the API to create collections/fields/relationships, create items, then read them, update them, revert them, check revisions, delete them, etc., then we can just let the suite take care of itself.

Maybe the dummy data could even be language agnostic. So, a lot of numbers.

Also, I just realized all this useful info is on the PostgreSQL issue... whoops!
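A rough sketch of that "tests that take care of themselves" CRUD flow, under the same assumptions (Jest, Node 18+ fetch, placeholder `test_articles` collection and routes): the test creates its own item, exercises read/update/delete against it, and leaves nothing behind, so no seed file is needed at all.

```ts
import { describe, test, expect } from '@jest/globals';

const API = process.env.API_URL ?? 'http://localhost:8080';

describe('items CRUD (self-seeding)', () => {
  test('creates, reads, updates, and deletes its own item', async () => {
    // Create: the test makes its own data, so no vendor-specific seed is needed.
    const created = await fetch(`${API}/items/test_articles`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ title: 'crud check' }),
    });
    expect(created.ok).toBe(true);
    const { data } = (await created.json()) as { data: { id: number } };

    // Read it back.
    const read = await fetch(`${API}/items/test_articles/${data.id}`);
    expect(read.status).toBe(200);

    // Update it.
    const updated = await fetch(`${API}/items/test_articles/${data.id}`, {
      method: 'PATCH',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ title: 'crud check (edited)' }),
    });
    expect(updated.ok).toBe(true);

    // Delete it, leaving the instance as we found it.
    const deleted = await fetch(`${API}/items/test_articles/${data.id}`, { method: 'DELETE' });
    expect(deleted.ok).toBe(true);
  });
});
```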
I tried to test the code as well and can't seem to connect. Can someone post what the config file for Postgres should look like? Or how to turn on better logging? The only thing I see in the logs currently is quite unspecific.
Re connection: found the issue. I needed to install pdo_pgsql (the error reporting was not very helpful, as it does not log the original exception being thrown in Zend).
Hey @puco, I assume you're testing with the PostgreSQL branch?
@benhaynes yes I am. I can provide you with more feedback as I test along, if you are interested. But I don't think this is the right place for it.
Thank you! This is probably the best place for it, for now. Any feedback is very much appreciated! |
I'll give this branch a try because I really need PostgreSQL for one of my projects.
Yeah, let's get this up-to-date and tested... then we can finally look into getting it merged. The problem in the past has been the added time investment and complexity of maintaining a new database vendor, so the more help we get from the community, the better.
PR #1319 is initiated; I'll try to test on top of this branch.
🚨 MIGRATED FROM REQUESTS.GETDIRECTUS.COM 🚨
Database Support - PostgreSQL DB
👍 = 14
Created 1 year ago by @ricricucit
@rijkvanzanten (11 months ago)
It would be great to have some alternatives to just MySQL!