I need to load a table with a large amount of test data, to be used for performance and scaling tests.
How can I easily create 100,000 rows of random/junk data for my database table?
Using cloud storage. Cloud storage is an excellent solution, but it requires the data to be easily shared between multiple servers in order to scale. NoSQL databases were created specifically for this workflow: develop and test the system on local hardware, then move it to the cloud, where it keeps working.
After a database has been created, there are two ways of populating its tables: either from existing data or through user applications developed for the database.
The “INSERT INTO” command is part of the Data Manipulation Language (DML), a sublanguage of SQL that enables modification and retrieval of information in database objects. This command lets us insert rows into tables, supplying values for either all columns or only selected columns.
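For instance, assuming a simple table with an auto-increment id column and a val column (the same shape as the example table defined below), the two forms look like this:

-- Supply a value for every column explicitly
INSERT INTO your_table (id, val) VALUES (1, 42);

-- Supply values for selected columns only; id is filled in by AUTO_INCREMENT
INSERT INTO your_table (val) VALUES (42);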
Populate one table using another table. You can populate a table with a SELECT statement over another table, provided the other table has the set of fields required to populate the first table.
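As a sketch, assuming a source table named other_table with a compatible val column (the name is hypothetical), the pattern is:

-- Copy rows from an existing table; the selected columns must match
-- the target column list in number and type
INSERT INTO your_table (val)
SELECT val FROM other_table;

A handy variant for bulk test data is to select from the target table itself, which doubles the row count each time it runs:

INSERT INTO your_table (val)
SELECT val FROM your_table;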
You could also use a stored procedure. Consider the following table as an example:
CREATE TABLE your_table (id INT NOT NULL PRIMARY KEY AUTO_INCREMENT, val INT);
Then you could add a stored procedure like this:
DELIMITER $$
CREATE PROCEDURE prepare_data()
BEGIN
  DECLARE i INT DEFAULT 0;
  -- Insert one row per iteration, 100,000 times in total
  WHILE i < 100000 DO
    INSERT INTO your_table (val) VALUES (i);
    SET i = i + 1;
  END WHILE;
END$$
DELIMITER ;
When you call it, you'll have 100k records:
CALL prepare_data();
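To verify the load, a quick count should report the expected number of rows:

SELECT COUNT(*) FROM your_table;  -- expect 100000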