
How to backup database with 100 rows in each table?

Tags:

sql-server

I want to back up a SQL Server database with all of its objects along with the data, but the data should be limited to 100 rows per table. I can do this in MySQL very easily, but I don't know how to do it in SQL Server.

Nitesh Kumar asked Jun 06 '12


1 Answer

You can't really use an explicit BACKUP DATABASE for this. However, you could do something like the following. Keep in mind that this data will be of limited use if you rely on any kind of referential integrity: the row selection is essentially arbitrary, so unless every relationship is 1:1 and every query happens to pick conveniently matching rows, you'll end up with a hodgepodge of data whose related rows don't line up:

CREATE DATABASE copy_of_original;
GO

USE original_db;
GO

DECLARE @sql NVARCHAR(MAX);
SET @sql = N'';

-- this assumes all tables are in `dbo` schema:
SELECT @sql = @sql + CHAR(13) + CHAR(10)
  + 'SELECT TOP (100) * INTO copy_of_original.dbo.' 
  + QUOTENAME(name) + ' FROM dbo.' + QUOTENAME(name) + ';'
FROM sys.tables
WHERE schema_id = 1;

PRINT @sql;                 -- review the generated statements first
-- EXEC sp_executesql @sql; -- then uncomment this line to run them
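
For reference, with two hypothetical tables dbo.Customers and dbo.Orders, the PRINTed batch would look roughly like this (QUOTENAME adds the square brackets):

SELECT TOP (100) * INTO copy_of_original.dbo.[Customers] FROM dbo.[Customers];
SELECT TOP (100) * INTO copy_of_original.dbo.[Orders] FROM dbo.[Orders];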

Once you've done this, you can back up copy_of_original. Keep in mind that it will not have any of the indexes or constraints present in the original database, and that TOP will have picked an arbitrary set of 100 rows from each table (or the whole table, for tables with fewer than 100 rows).
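
If you just need a .bak file of the trimmed copy, a plain full backup works. This is a minimal sketch; the target path C:\Backups\copy_of_original.bak is an assumption, so point it at any location the SQL Server service account can write to:

-- back up the trimmed copy (path is an example; adjust as needed)
BACKUP DATABASE copy_of_original
  TO DISK = N'C:\Backups\copy_of_original.bak'
  WITH INIT, STATS = 10;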

Aaron Bertrand answered Oct 12 '22