I'm wondering what the complexity of building up a Python 2.7 list with append() is. Is a Python list doubly linked, so that each append is constant time, or is it singly linked, so that appending is linear? If it is singly linked, how can I build up a list in linear time from an iteration that yields the values in order from beginning to end?
For example:
def holes_between(intervals):
    # Compute the holes between the intervals, for example:
    # given the table: ([ 8, 9] [14, 18] [19, 20] [23, 32] [34, 49])
    # compute the holes: ([10, 13] [21, 22] [33, 33])
    prec = intervals[0][1] + 1  # Bootstrap the iteration
    holes = []
    for low, high in intervals[1:]:
        if prec <= low - 1:
            holes.append((prec, low - 1))
        prec = high + 1
    return holes
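Calling it on the intervals from the comment gives the documented holes (just a sanity check using the example data above):

intervals = [(8, 9), (14, 18), (19, 20), (23, 32), (34, 49)]
print(holes_between(intervals))
# -> [(10, 13), (21, 22), (33, 33)]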
It's not a linked list at all. It's essentially what is formally called a dynamic array.
The time complexity of Python's list.append() is amortized O(1). See the Time Complexity page on the Python Wiki.
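If you want to check this empirically, a rough sketch like the following shows that n appends take total time roughly proportional to n (timings are machine-dependent; build is just an illustrative helper):

import timeit

def build(n):
    # Illustrative helper: append n items one by one. With amortized
    # O(1) appends the total time should grow roughly linearly with n.
    result = []
    for i in range(n):
        result.append(i)
    return result

for n in (10**4, 10**5, 10**6):
    t = timeit.timeit(lambda: build(n), number=10)
    print("%8d items: %.4f s" % (n, t))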
Internally, Python lists are vectors of pointers:
typedef struct {
    PyObject_VAR_HEAD
    /* Vector of pointers to list elements. list[0] is ob_item[0], etc. */
    PyObject **ob_item;

    /* ob_item contains space for 'allocated' elements. The number
     * currently in use is ob_size.
     * Invariants:
     *     0 <= ob_size <= allocated
     *     len(list) == ob_size
     *     ob_item == NULL implies ob_size == allocated == 0
     *     list.sort() temporarily sets allocated to -1 to detect mutations.
     *
     * Items must normally not be NULL, except during construction when
     * the list is not yet visible outside the function that builds it.
     */
    Py_ssize_t allocated;
} PyListObject;
The ob_item vector is resized as needed with overallocation to give an amortized O(1) cost for appends:
/* This over-allocates proportional to the list size, making room
* for additional growth. The over-allocation is mild, but is
* enough to give linear-time amortized behavior over a long
* sequence of appends() in the presence of a poorly-performing
* system realloc().
* The growth pattern is: 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
*/
new_allocated = (newsize >> 3) + (newsize < 9 ? 3 : 6);
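Note that list_resize then adds newsize itself to this over-allocation amount; the formula on its own would not produce the pattern in the comment. A minimal Python sketch (just the same arithmetic, not CPython itself, with an illustrative helper name) reproduces the documented growth pattern:

def growth_pattern(appends):
    # Illustrative helper: simulate repeated appends and record each new
    # 'allocated' value, using the same over-allocation arithmetic.
    allocated = 0
    sizes = [0]
    for newsize in range(1, appends + 1):
        if newsize > allocated:
            new_allocated = (newsize >> 3) + (3 if newsize < 9 else 6)
            new_allocated += newsize  # list_resize adds newsize as well
            allocated = new_allocated
            sizes.append(allocated)
    return sizes

print(growth_pattern(80))
# -> [0, 4, 8, 16, 25, 35, 46, 58, 72, 88]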
This makes Python lists dynamic arrays.
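You can also observe the over-allocation from Python itself: sys.getsizeof includes the space reserved in the ob_item vector, so the reported size grows in steps rather than on every append (the exact step sizes vary between CPython versions):

import sys

lst = []
last = sys.getsizeof(lst)
print("len=%d sizeof=%d" % (len(lst), last))
for i in range(20):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != last:
        # A jump here means ob_item was reallocated with spare capacity.
        print("len=%d sizeof=%d" % (len(lst), size))
        last = size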