Is there a limit on the number of rows in an SQLite table? My app throws the error "no such table x..." when that table has more than 6000 rows. Below 6000 rows there is no error.
Thank you.
This is the query:
db = (new DatabaseHelper(this)).getWritableDatabase();
cursor = db.rawQuery("SELECT continentes._id, continentes.ContinenteID, continentes.Continente"
        + " FROM continentes"
        + " WHERE continentes.Continente LIKE ?"
        + " GROUP BY continentes.ContinenteID, continentes.Continente ORDER BY continentes.Continente",
        new String[]{"%" + searchText.getText().toString() + "%"});
This is where I create the table:
@Override
public void onCreate(SQLiteDatabase db) {
    String s;
    try {
        // Toast.makeText expects a duration constant, not a millisecond value
        Toast.makeText(context, "Creating Database", Toast.LENGTH_LONG).show();
        InputStream in = context.getResources().openRawResource(R.raw.continentes);
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(in, null);
        NodeList statements = doc.getElementsByTagName("statement");
        for (int i = 0; i < statements.getLength(); i++) {
            s = statements.item(i).getChildNodes().item(0).getNodeValue();
            db.execSQL(s);
        }
    } catch (Throwable t) {
        Toast.makeText(context, t.toString(), Toast.LENGTH_LONG).show();
    }
}
SOLVED!!! I just found out why I got this problem: the files with the data I use to create and populate the SQL database can't be larger than 1.2 MB. So I split them, and the problem is gone. Thanks to all.
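As a sketch of that workaround: if the raw resource holding the SQL statements grows past the size limit, one option is to split the statement list into several smaller files. The helper below is a hypothetical illustration in plain Java (the class and method names are my own, not from the original app); it partitions statements into consecutive chunks whose combined UTF-8 size stays under a byte budget, so each chunk can be saved as its own raw resource.

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class StatementSplitter {

    /**
     * Partitions SQL statements into consecutive chunks whose combined
     * UTF-8 size stays at or below maxBytes. A single statement larger
     * than maxBytes still gets its own chunk rather than being dropped.
     */
    public static List<List<String>> split(List<String> statements, int maxBytes) {
        List<List<String>> chunks = new ArrayList<>();
        List<String> current = new ArrayList<>();
        int currentBytes = 0;
        for (String s : statements) {
            int size = s.getBytes(StandardCharsets.UTF_8).length;
            // Start a new chunk when adding this statement would exceed the budget
            if (!current.isEmpty() && currentBytes + size > maxBytes) {
                chunks.add(current);
                current = new ArrayList<>();
                currentBytes = 0;
            }
            current.add(s);
            currentBytes += size;
        }
        if (!current.isEmpty()) {
            chunks.add(current);
        }
        return chunks;
    }
}
```

Each resulting chunk could then be run through the same onCreate loop shown above, one resource file at a time.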
SQLite database files have a maximum size of about 140 TB. On a phone, the size of the storage (a few GB) will limit your database file size, while the memory size will limit how much data you can retrieve from a query. Furthermore, Android cursors have a limit of 1 MB for the results.
An unlikely requirement for an engine popular on Android and iOS. SQLite, which claims to be "used more than all other database engines combined", has been updated to version 3.33.0 with the maximum size increased to 281 TB, around twice the previous capacity of 140 TB.
An SQLite database can have at most 2147483646 pages, so the number of tables in a schema cannot exceed 2147483646. The maximum number of rows in a table is 2^64, and the maximum number of columns in a table is 32767.
Practically, SQLite is likely to work as long as storage is available. It works well with datasets larger than memory; it was originally created when memory was scarce, and that was an important design point from the start. There is absolutely no issue with storing 100 GB of data.
You can look up the standard information on the limits of SQLite, but that isn't what's causing your issue, as the row limit is actually very high (2 to the power of 64!).
More than likely your issue has to do with how your application is inserting the records and/or the types of records being inserted.
Can you post some code showing how the application handles the insert? I can revise my answer after reading through it.
As an added tip, make sure you've set:
Cursor myCursor = db.rawQuery("PRAGMA synchronous=OFF", null);
myCursor.close();
As this can really improve the performance of your larger bulk inserts.