 

How would I parse a text file into a 3D array in Python?

Tags:

python

arrays

I have a text file containing 82355 data rows. I want to read this in as density(82355) and parse it into a three-dimensional array called rho(5,91,181), where

  • 5 is for the number of days,
  • 91 is for the number of latitudes,
  • 181 is for the number of longitudes.

This is an example of the text file showing the first 11 rows. There are 5 days, with latitudes from -90 to 90 in two-degree steps (91 values). For each latitude there are 181 rows, corresponding to longitudes from 0 to 360 in two-degree steps.

       Latitude  Longitude           rho
0         -90.0        0.0  3.396760e-12
1         -90.0        2.0  3.397140e-12
2         -90.0        4.0  3.397510e-12
3         -90.0        6.0  3.397870e-12
4         -90.0        8.0  3.398470e-12
5         -90.0       10.0  3.399060e-12
6         -90.0       12.0  3.399810e-12
7         -90.0       14.0  3.400560e-12
8         -90.0       16.0  3.401440e-12
9         -90.0       18.0  3.402310e-12
10        -90.0       20.0  3.403200e-12

I'm not sure how to start parsing this 82355-row text file into a 3D array called rho(5,91,181) in Python. Does anyone have any recommendations?

asked by Sjcarlso

1 Answer

The file itself is quite easy to parse, since it is plain whitespace-delimited text. You only need to specify how the day is calculated; then go over each row and put the rho value into the right place in the array:

import numpy as np
import pandas as pd

# Read the whitespace-delimited file; the unnamed leading column becomes the index.
df = pd.read_csv('path_to_data_file', sep=r'\s+')

rho = np.zeros((5, 91, 181))

rows_per_day = 91 * 181  # one day = 91 latitudes x 181 longitudes
for i, e in df.iterrows():
    day = i // rows_per_day                # assumes rows are stored day by day
    lat = int((e['Latitude'] + 90) / 2)    # -90..90 in 2-degree steps -> index 0..90
    lon = int(e['Longitude'] / 2)          # 0..360 in 2-degree steps -> index 0..180
    rho[day, lat, lon] = e['rho']
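
If the rows really are written in a fixed order (longitude fastest, then latitude, then day), you can skip the explicit loop and reshape the rho column directly. This is only a sketch under that ordering assumption, and 'path_to_data_file' is a placeholder for your actual file:

import numpy as np
import pandas as pd

df = pd.read_csv('path_to_data_file', sep=r'\s+')

# 5 days x 91 latitudes x 181 longitudes = 82355 rows, taken in file order
rho = df['rho'].to_numpy().reshape(5, 91, 181)

Either way, rho[d, i, j] then holds the density for day d, latitude -90 + 2*i, and longitude 2*j.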
answered by zwang

