Conversation
anselrognlie
left a comment
💫 Overall, a nice implementation, Vange. For the comprehension questions, we can also say that a heap is weakly ordered compared to a BST. And from your response I see that mixing iterative and recursive approaches for heap_up and heap_down was intentional, but be aware that this actually has a complexity impact. While your add (heap_up) is time O(log n) and space O(1), your remove (heap_down) is time O(log n) and space O(log n). In general, in Python an iterative approach will consume less space than a recursive approach.
I left some comments on your implementation below. Notice a tricky little issue in heap_down.
🟢
| Time Complexity: ? O(log n)
| Space Complexity: ? O(1)
✨ Great! In the worst case, the new value we're inserting is the new root of the heap, meaning it would need to move up the full height of the heap (which is log n levels deep). Your implementation of the heap_up helper is iterative, meaning that while we do have to loop log n times, we don't have any recursive stack overhead. So the space complexity is constant (O(1)).
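To make the O(1)-space point concrete, here's a minimal sketch of an iterative heap_up on a plain list of values (illustrative names only; the reviewed class stores heap nodes rather than bare values, so this is not the submitted implementation):

```python
def heap_up(store, index):
    """Bubble the value at `index` up toward the root of a min-heap.

    Loops at most log n times and uses no extra storage beyond a few
    locals, so space is O(1).
    """
    while index > 0:
        parent = (index - 1) // 2
        if store[index] >= store[parent]:
            break  # heap property holds; stop early
        store[index], store[parent] = store[parent], store[index]
        index = parent

store = [1, 3, 5, 7]
store.append(2)                 # add the new value at the end
heap_up(store, len(store) - 1)  # then bubble it up
print(store)  # [1, 2, 5, 7, 3]
```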
| ''' i tried to import MinHeap from the other file but it wasn't passing pytest tests
| this is what i was writing:
| from min_heap import MinHeap
You could either do
from .min_heap import MinHeap
or
from heaps.min_heap import MinHeap

| Space Complexity: ? O(1)
| """
| pass
| if value == None:
👀 Prefer using is to compare to None
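The reason is that == can be overridden by a class's __eq__, while `is` checks object identity and can never be fooled. A small illustration (the Sneaky class is hypothetical, just to show the hazard):

```python
class Sneaky:
    # A pathological __eq__ that claims equality with everything,
    # including None.
    def __eq__(self, other):
        return True

obj = Sneaky()
print(obj == None)  # True  -- misleading: obj is not actually None
print(obj is None)  # False -- identity check gives the right answer
```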
| # Act
| returned_items = ["Donuts", "Pizza", "Pasta", "Soup", "Cookies", "Cake"]
| # assert heap.store == returned_items
We can't make a check like this since the store is a list of heap nodes, not just the values. We also wouldn't generally expect the order of the items being returned to match the final retrieved order, since a heap is only loosely ordered. We only know that the smallest (or largest if a max heap) value is at the front, but we can't say with certainty the specific ordering of the remaining items.
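To see why asserting on the raw store doesn't work, here's the same idea with Python's stdlib heapq (a plain-list min-heap standing in for the MinHeap under review):

```python
import heapq

items = ["Donuts", "Pizza", "Pasta", "Soup", "Cookies", "Cake"]
heap = []
for item in items:
    heapq.heappush(heap, item)

# The internal list only guarantees heap[0] is the smallest element;
# the rest is in heap order, not sorted order.
print(heap[0])                 # "Cake"
print(heap == sorted(items))   # False -- the rest of the list is not sorted

# Popping repeatedly is what yields the fully sorted sequence.
popped = [heapq.heappop(heap) for _ in range(len(heap))]
print(popped == sorted(items))  # True
```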
| Time Complexity: ? O(log n)
| Space Complexity: ? O(1)
As the cost of add mostly comes from heap_up, the main cost of remove comes from heap_down. However, while your heap_up is implemented iteratively, achieving O(1) space complexity (and O(log n) time), your heap_down is implemented recursively. It still at worst must move the repositioned value from the top of the heap to the bottom (O(log n) operations), but for each move, it also makes a recursive call, adding stack space to the complexity. So this is O(log n) for both time and space complexity.
| if left_child_index <= len(self.store) - 1:
|     return True
| return False
👀 Performing a boolean check (if condition) then returning a boolean is an anti-pattern. We can return the value of the condition directly.
return left_child_index <= len(self.store) - 1

| time complexity O(log n)
| space complexity O(1)
👀 Nice of you to include the complexity here when we forgot to ask for it!
This is where the O(log n) time and space complexity in remove comes from. Since heap_down calls itself recursively, the space complexity will grow with the depth of the stack, which is the height of the heap, or O(log n).
| self.swap(index, left_child_index)
| self.heap_down(left_child_index)
Before deciding to swap here, we would also need to compare the left child to the right (if present). We only want to swap with the one child which is the smallest.
Currently, if the left child were smaller than the parent, but larger than the right, first we would continue swapping the parent down the left subtree, then potentially need to swap the new parent down the right subtree. Consider the arrangement [3, 2, 1]. 3 > 2, so it would swap with the left child → [2, 3, 1]. But notice the heap property is not re-established, since 2 > 1. This code will continue and swap 2 and 1 → [1, 3, 2]. But if we had compared the values of the children first, we could see that 1 < 2 and swap that with 3 directly → [1, 2, 3], which re-establishes the heap in a single swap.
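The fix can be sketched as a heap_down that picks the smaller of the two children before swapping (illustrative, written as a function over a plain list rather than the reviewed class; written iteratively, it also brings the space cost down to O(1)):

```python
def heap_down(store, index):
    """Sink the value at `index` down a min-heap stored in a plain list."""
    n = len(store)
    while True:
        left = 2 * index + 1
        right = 2 * index + 2
        smallest = index
        # Compare against BOTH children (when present) and pick the
        # smallest of the three, so we swap at most once per level.
        if left < n and store[left] < store[smallest]:
            smallest = left
        if right < n and store[right] < store[smallest]:
            smallest = right
        if smallest == index:
            break  # heap property restored
        store[index], store[smallest] = store[smallest], store[index]
        index = smallest

store = [3, 2, 1]
heap_down(store, 0)
print(store)  # [1, 2, 3] -- heap restored in a single swap
```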
| Time Complexity: ? O(n)
| Space Complexity: ? O(n)
👀 The theoretical best time complexity for a general sorting algorithm (like heap sort) is O(n log n). Here, we can see that we add n things to the heap (which each take O(log n)), then we remove n things from the heap, which again takes O(log n) each, for a total of O(2 n log n) → O(n log n). The space complexity is O(n) due to the internal store that MinHeap uses.
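That accounting can be sketched with a heap sort built on the stdlib heapq (standing in for the MinHeap under review, whose add/remove play the roles of heappush/heappop):

```python
import heapq

def heap_sort(values):
    # n pushes of O(log n) each, then n pops of O(log n) each:
    # O(2 n log n) → O(n log n) time. The heap's internal list
    # holds all n items, giving O(n) space.
    heap = []
    for v in values:
        heapq.heappush(heap, v)
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```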
| for num in list:
|     da_heap.add(num)
|
| while len(result) != len(list):
Consider using the empty helper
while not da_heap.empty():
Heaps Practice
Congratulations! You're submitting your assignment!
Comprehension Questions
Were the heap_up & heap_down methods useful? Why?
thanks for looking at this down to the wire assignment!!!