I have a list (of length 3) made up of sublists (of differing lengths: 2, 2, and 3). I would like to store all of this as one big list (i.e., no sublists, just one list of length 7). I understand how to do it manually, but is there a function or command I can use?
I would like to be able to do this for lists and sublists of any length.
Here's an example of the list:
[[1]]
[[1]][[1]]
name n l_1 t t_3 t_4 t_5 cluster
12 563035 19 9.263158 0.2017045 0.06379453 0.075876830 0.095852895 1
14 563037 19 8.026316 0.2076503 0.05634675 0.098684211 -0.104566563 1
[[1]][[2]]
name n l_1 t t_3 t_4 t_5 cluster
13 563036 20 7.200000 0.1838450 -0.06428098 0.085681987 -0.011070830 2
17 563042 20 7.725000 0.2168285 0.15161037 0.117570045 -0.067102568 2
[[2]]
[[2]][[1]]
name n l_1 t t_3 t_4 t_5 cluster
1 561101 11 6.772727 0.19731544 0.029478458 -0.128117914 6.235828e-02 1
44 563080 11 7.545455 0.18554217 0.103896104 0.285714286 -2.164502e-02 1
[[2]][[2]]
name n l_1 t t_3 t_4 t_5 cluster
48 566017 33 10.400000 0.2037624 0.16432326 0.1166006937 -0.012830017 2
49 566018 22 9.218182 0.2113271 0.30646667 0.2502280702 0.189838207 2
50 566020 19 11.736842 0.3111609 0.51217445 0.5147883012 0.462723120 2
[[3]]
[[3]][[1]]
name n l_1 t t_3 t_4 t_5 cluster
158 568004 18 8.722222 0.1787186 -0.05083857 0.06498952 0.06918239 1
161 568046 19 11.794737 0.3646190 0.54582540 0.49747236 0.32255755 1
162 568047 18 12.916667 0.3366224 0.53523112 0.40464111 0.29960541 1
163 568048 20 11.590000 0.3918986 0.50007725 0.43039556 0.34299752 1
[[3]][[2]]
name n l_1 t t_3 t_4 t_5 cluster
165 568050 20 9.125000 0.2034607 0.29789747 0.31073776 0.09157738 2
167 568054 20 8.850000 0.1332144 0.09895833 0.18636204 0.04641544 2
[[3]][[3]]
name n l_1 t t_3 t_4 t_5 cluster
168 568058 20 8.675000 0.2012741 0.18161266 0.200319163 -0.009375416 3
170 568061 18 24.861111 0.7394676 0.91836281 0.928317483 0.905563950 3
Many thanks, Sylvia
For your specific question, the answer is simple:
unlist(mylist, recursive = FALSE)
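For instance, with a small made-up stand-in for your structure (three sublists holding 2, 2, and 3 data frames), recursive = FALSE strips exactly one level of nesting:
mylist <- list(list(data.frame(x = 1), data.frame(x = 2)),
               list(data.frame(x = 3), data.frame(x = 4)),
               list(data.frame(x = 5), data.frame(x = 6), data.frame(x = 7)))
flat <- unlist(mylist, recursive = FALSE)
length(flat)  # 7: one flat list containing all seven data frames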
However, you asked how to do this for a list with an arbitrary number of sublists at arbitrary depths. That is a bit more tricky. Fortunately, Akhil S Bhel has tackled that problem for us and created a function called LinearizeNestedList. His site is down at the moment, but I have put his function up as a GitHub Gist.
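If you save the Gist to a local file, you can load the function with source(); the file name below is just an assumption about where you saved it:
source("LinearizeNestedList.R")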
First, we'll create some sample data with nested lists within nested lists.
NList <- list(a = "a",                                      # Atom
              b = 1:5,                                      # Vector
              c = data.frame(x = runif(5), y = runif(5)),   # data.frame
              d = matrix(runif(4), nrow = 2),               # Matrix
              e = list(l = list("a", "b"),                  # Nested lists
                       m = list(1:5, 5:10),
                       n = list(list(1), list(2))))
The source list looks like this. Notice the multiple levels of nesting within list item "e".
NList
# $a
# [1] "a"
#
# $b
# [1] 1 2 3 4 5
#
# $c
# x y
# 1 0.7893562 0.47761962
# 2 0.0233312 0.86120948
# 3 0.4772301 0.43809711
# 4 0.7323137 0.24479728
# 5 0.6927316 0.07067905
#
# $d
# [,1] [,2]
# [1,] 0.09946616 0.5186343
# [2,] 0.31627171 0.6620051
#
# $e
# $e$l
# $e$l[[1]]
# [1] "a"
#
# $e$l[[2]]
# [1] "b"
#
#
# $e$m
# $e$m[[1]]
# [1] 1 2 3 4 5
#
# $e$m[[2]]
# [1] 5 6 7 8 9 10
#
#
# $e$n
# $e$n[[1]]
# $e$n[[1]][[1]]
# [1] 1
#
#
# $e$n[[2]]
# $e$n[[2]][[1]]
# [1] 2
You can see how LinearizeNestedList "flattens" all sublists so you end up with a single list.
LinearizeNestedList(NList)
# $a
# [1] "a"
#
# $b
# [1] 1 2 3 4 5
#
# $c
# x y
# 1 0.7893562 0.47761962
# 2 0.0233312 0.86120948
# 3 0.4772301 0.43809711
# 4 0.7323137 0.24479728
# 5 0.6927316 0.07067905
#
# $d
# [,1] [,2]
# [1,] 0.09946616 0.5186343
# [2,] 0.31627171 0.6620051
#
# $`e/l/1`
# [1] "a"
#
# $`e/l/2`
# [1] "b"
#
# $`e/m/1`
# [1] 1 2 3 4 5
#
# $`e/m/2`
# [1] 5 6 7 8 9 10
#
# $`e/n/1/1`
# [1] 1
#
# $`e/n/2/1`
# [1] 2
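If you would rather not depend on an external Gist, here is a minimal recursive flattener of my own (a sketch, not Akhil's function) that produces a similar flat list, though with "."-separated names (e.g. "e.l1") rather than "/":
# Recurse into plain lists, keep data.frames and other leaves intact,
# and splice the pieces back together into one flat list.
flattenList <- function(x) {
    # anything that is not a plain list (including a data.frame) is a leaf
    if (!is.list(x) || is.data.frame(x)) return(list(x))
    # flatten each element, then concatenate the results into one list
    do.call(c, lapply(x, flattenList))
}
flattenList(NList)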
By the way, I forgot to mention that you can flatten data.frames inside lists too (since a data.frame is a special type of list in R).
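You can see why from base R alone: as.list() turns a data.frame into a list of its columns.
df <- data.frame(x = 1:2, y = 3:4)
as.list(df)
# $x
# [1] 1 2
#
# $y
# [1] 3 4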
If you really want to flatten everything out (well, except arrays, since they are just vectors with a dim attribute), add LinearizeDataFrames = TRUE to your LinearizeNestedList call:
LinearizeNestedList(NList, LinearizeDataFrames=TRUE)