I've got a Django application that stores a large amount of data in its models. The problem is that whenever I deploy to Heroku, even for a small change, the remote database with the correct data gets overwritten by my local database of dummy data.
Scenario:
I have a database file, my_db, on the remote app. When deploying to Heroku, I run git add and git commit on only the changed files rather than the whole project, yet it somehow still overwrites the remote database with local data.
Is there a way to prevent this?
Disk-backed storage: if you use SQLite on Heroku, you will lose your entire database at least once every 24 hours, because dynos are restarted daily. Even if Heroku's disks were persistent, running SQLite would still not be a good fit: since SQLite does not run as a service, each dyno would hold its own separate copy of the database file.
In practice, this means Heroku does not support applications that use sqlite3 as their database.
Heroku does not provide a persistent filesystem.
Most Heroku applications that I have worked on use PostgreSQL for their database, so this isn't a problem. But a SQLite database is just a file sitting in a directory somewhere, so every time you deploy, your database will be lost.
The easiest solution is probably to migrate from SQLite to PostgreSQL, which is very well supported on Heroku (and in Django) and will not lose data every time you deploy.
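On Heroku, the Postgres add-on exposes its connection string through the DATABASE_URL environment variable. The common approach is the dj-database-url package, but as a minimal sketch using only the standard library, your settings.py could parse that variable by hand and fall back to SQLite for local development (the default URL and helper name here are illustrative, not part of any library):

```python
import os
from urllib.parse import urlparse

def database_config(default="sqlite:///db.sqlite3"):
    """Build a Django DATABASES entry from the DATABASE_URL env var.

    On Heroku, DATABASE_URL looks like postgres://user:pass@host:5432/dbname.
    Locally, where the variable is unset, we fall back to a SQLite file.
    """
    url = urlparse(os.environ.get("DATABASE_URL", default))
    if url.scheme.startswith("postgres"):
        return {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": url.path.lstrip("/"),
            "USER": url.username or "",
            "PASSWORD": url.password or "",
            "HOST": url.hostname or "",
            "PORT": str(url.port or 5432),
        }
    # Local development fallback: a SQLite file in the project directory
    return {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": url.path.lstrip("/"),
    }

# In settings.py:
# DATABASES = {"default": database_config()}
```

With this in place, deploys no longer touch your data at all: the database lives in the Heroku Postgres service, and `git push heroku` only ships code.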
If you're firmly committed to SQLite, you may have some other options, but none of them work well with Heroku's ephemeral filesystem.