Newbie: I have an Excel file which has more than 100 different sheets. Each sheet contains several tables and charts.
I wish to save every sheet as a new Excel file.
I tried many Python snippets, but none of them worked.
Kindly help with this. Thanks!
Edit 1: In response to comments, this is what I tried:
import pandas as pd

inputFile = r'D:\Excel\Complete_data.xlsx'
path = "D:/Excel/All Files/"

# open the workbook once and get the sheet names
xls = pd.ExcelFile(inputFile)

# create a new Excel file for every sheet
for name in xls.sheet_names:
    # parse the sheet into a DataFrame (this reads cell values only)
    parsing = xls.parse(sheet_name=name)
    # write the data to the new Excel file
    parsing.to_excel(path + str(name) + ".xlsx", index=False)
To be precise, the problem is in copying the tables and charts.
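Since pandas writes only cell values, any charts and tables embedded in a sheet are lost with the approach above. One possible workaround is a minimal sketch like the following, assuming you are on Windows with Excel and the pywin32 package installed (the paths are taken from the question and are only examples): let Excel itself copy each sheet via COM automation, which keeps charts and tables intact.

import os
import win32com.client

input_file = r"D:\Excel\Complete_data.xlsx"   # assumed source workbook from the question
output_dir = r"D:\Excel\All Files"            # assumed output folder
os.makedirs(output_dir, exist_ok=True)

excel = win32com.client.Dispatch("Excel.Application")
excel.DisplayAlerts = False
try:
    wb = excel.Workbooks.Open(input_file)
    for sheet in wb.Worksheets:
        # Copy() with no arguments puts the copied sheet into a new workbook,
        # including its charts and tables
        sheet.Copy()
        new_wb = excel.ActiveWorkbook
        # FileFormat=51 is the .xlsx format
        new_wb.SaveAs(os.path.join(output_dir, sheet.Name + ".xlsx"), FileFormat=51)
        new_wb.Close(SaveChanges=False)
    wb.Close(SaveChanges=False)
finally:
    excel.Quit()

This is only a sketch; sheet names containing characters that are invalid in file names would need extra handling.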
I have just worked through this issue, so I will post my solution; I do not know how it will affect charts, etc.
import os
import xlrd
from xlutils.copy import copy

path = ""  # place the path where the files to split up are
targetdir = path + "New_Files/"  # where you want your new files

if not os.path.exists(targetdir):  # makes your new directory
    os.makedirs(targetdir)

for root, dirs, files in os.walk(path, topdown=False):  # all the files you want to split
    xlsfiles = [f for f in files]  # can add a selection condition here
    for f in xlsfiles:
        wb = xlrd.open_workbook(os.path.join(root, f), on_demand=True)
        for sheet in wb.sheets():  # cycles through each sheet in each workbook
            newwb = copy(wb)  # makes a temp copy of that book
            # brute force, but strips away all other sheets apart from the sheet being looked at
            newwb._Workbook__worksheets = [
                worksheet for worksheet in newwb._Workbook__worksheets
                if worksheet.name == sheet.name
            ]
            # saves each sheet as the original file name plus the sheet name
            newwb.save(targetdir + os.path.splitext(f)[0] + sheet.name + ".xls")
Not particularly elegant, but it worked well for me and gives easy functionality. Hopefully it is useful for someone.
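Note that xlwt (which xlutils uses under the hood) only writes the legacy .xls format, so for .xlsx input the same strip-away-the-other-sheets idea can be sketched with openpyxl instead. This is only a rough sketch under the assumption that cell data is what matters: openpyxl does not preserve charts when an existing workbook is loaded and saved again, so the COM approach above is likely safer when charts must survive. The paths are placeholders taken from the question.

import os
from openpyxl import load_workbook

input_file = "D:/Excel/Complete_data.xlsx"   # assumed path from the question
target_dir = "D:/Excel/New_Files/"
os.makedirs(target_dir, exist_ok=True)

# read the sheet names once without loading the whole file
sheet_names = load_workbook(input_file, read_only=True).sheetnames

for name in sheet_names:
    wb = load_workbook(input_file)           # fresh copy for every output file
    for other in [s for s in wb.sheetnames if s != name]:
        wb.remove(wb[other])                 # drop every sheet except `name`
    wb.save(os.path.join(target_dir, name + ".xlsx"))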