You can find a lot of good material just via Google, actually.
If you read the likes of The Art of Computer Programming for breakfast, you might like 'The Discrepancy Method: Randomness and Complexity', available at https://www.cs.princeton.edu/~chazelle/pubs/book.pdf. But it's not for the faint of heart.
I'm surprised and delighted to see someone recommend The Discrepancy Method by Chazelle. I love this book, but even beyond its heavy mathematical prerequisites, I'm not sure of its use to most software engineers. I think only the chapters on sampling, geometric algorithms, minimum spanning trees, and maybe linear programming would be useful references on algorithm design for most software engineers.
I second the recommendation of Probability and Computing by Mitzenmacher and Upfal. In addition to being much more mathematically self-contained, there is more focus on models and techniques that I imagine software engineers might actually use, e.g. hashing, load balancing.
In addition, Randomized Algorithms by Motwani and Raghavan is a fantastic book, and should be doable after completing Mitzenmacher and Upfal. It goes into further depth on techniques for designing randomized algorithms.
A bit orthogonal but still related to the idea of "what should a software engineer read for interesting algorithm ideas", I really like The Design of Approximation Algorithms by Williamson and Shmoys. There is some intersection with randomized algorithms there too.
Chazelle also came up with the ingenious soft heaps.
I was just giving the recommendations in the context of 'Do you know any good books or resources to read up more on random algorithms?'. Not with any regard to practicality.
I have to admit, I only managed to work through some of the chapters in the books I recommended. And for example, the main reason I know of Chazelle's book is that it's available for free online and I stumbled across it a while ago.
Btw, you seem knowledgeable. I have an algorithmic puzzle that has plagued me for some years now:
Starting with an empty heap, and a sequence of inserts and min-pops, can you compute what elements will be left in the heap at the end in linear (expected) time?
Obviously, you can't just run the instructions; that would take O(n log n) time.
I have an algorithm to verify a proposed solution in linear time. But I don't have anything solid for finding one in linear time. I have a hunch that soft-heaps might be useful. And there's probably an even simpler probabilistic approach.
Ideally, I'd like the solution in the comparison model, but if you have something that works on numbers only, that's also fine.
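To make the puzzle concrete, this is the naive O(n log n) simulation that serves as the baseline to beat (the op encoding and function name here are my own, just for illustration):

```python
import heapq

def surviving(ops):
    """Run the push/min-pop sequence directly and report, as a bitmap
    over the inserts, which elements remain in the heap at the end.
    This is the naive O(n log n) baseline, not the sought O(n) answer."""
    heap = []   # entries are (value, insert index)
    alive = []  # alive[i] = True if the i-th inserted element survives
    for op in ops:
        if op[0] == "push":
            alive.append(True)
            heapq.heappush(heap, (op[1], len(alive) - 1))
        else:  # ("pop",)
            _, i = heapq.heappop(heap)
            alive[i] = False
    return alive
```

For example, `surviving([("push", 3), ("push", 1), ("pop",), ("push", 2), ("pop",)])` pops 1 and then 2, leaving only the first insert alive.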
> I was just giving the recommendations in the context of 'Do you know any good books or resources to read up more on random algorithms?'. Not with any regard to practicality.
Fair enough! I still think it's a good recommendation, I was also adding on some thoughts on things that might be easier to digest :-)
> Chazelle also came up with the ingenious soft heaps.
Yes, soft heaps are very cool!
> I have an algorithmic puzzle...
Cool puzzle! Hmm, the solutions depend on exactly what you're asking.
The puzzle is substantially easier if the min-pop operation is not required to return anything. In this case, you are solving the much easier problem "return the top k elements of an array in unsorted order". You can insert into an unordered list and increment a counter every time min-pop is called. Then the last step can be done with a basic quickselect. See https://www.cs.cmu.edu/~avrim/451f11/lectures/lect0908.pdf page 21, the "QuickSelect" algorithm. You need to do some small modifications to give you the "top k" elements rather than just the "kth" element. This gives expected linear time, and the note describes a deterministic algorithm that makes this worst-case linear time.
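A minimal sketch of that quickselect idea, returning the k smallest elements in arbitrary order in expected linear time (the function name is mine; this is the randomized version, not the deterministic one from the notes):

```python
import random

def smallest_k(xs, k):
    """Return the k smallest elements of xs, in arbitrary order.
    Expected O(n): each level of recursion works on a geometrically
    shrinking fraction of the input, in expectation."""
    if k <= 0:
        return []
    if k >= len(xs):
        return list(xs)
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    if k <= len(less):
        return smallest_k(less, k)
    if k <= len(less) + len(equal):
        return less + equal[:k - len(less)]
    greater = [x for x in xs if x > pivot]
    return less + equal + smallest_k(greater, k - len(less) - len(equal))
```

With a total of p pops (and no pop on an empty heap), the surviving contents would then be everything except `smallest_k(values, p)`.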
You can implement deterministic quickselect using soft heaps, or you could instead do a radix sort and then slice out the popped elements.
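For the non-comparison route, the radix-sort variant might look like this (a sketch assuming non-negative 32-bit integer keys; the function name is mine):

```python
def radix_sort(xs, key_bits=32):
    """LSD radix sort on non-negative integers, 8 bits per pass.
    O(n) for fixed-width keys."""
    for shift in range(0, key_bits, 8):
        buckets = [[] for _ in range(256)]
        for x in xs:
            buckets[(x >> shift) & 0xFF].append(x)
        xs = [x for b in buckets for x in b]
    return xs

# With p total pops (all inserts before all pops), the p popped
# elements are the p smallest, so the survivors are:
#   survivors = radix_sort(values)[p:]
```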
If the min-pop operation is required to return the popped element, then I believe you run into the sorting lower bound, which prevents a deterministic O(n) solution. (Surprisingly, this indicates the hardness of the problem is not really in the last step; it's in implementing constant-time insert and pop.) I can't think of an immediate solution off the top of my head, but I don't think a soft heap provides the right guarantees here. I also don't know of a probabilistic data structure that provides both insert and min-pop in expected amortized constant time, and it seems that this could be an area of research. There are some better, but not quite linear, results outside the comparison-based model (https://cs.stackexchange.com/questions/6455/an-efficient-dat...)
The min-pop is not supposed to return anything, yes. We are only interested in the final contents of the heap, so no sorting is required. You can, e.g., give the result of the whole operation as a bitmap over the inserts, so you only need to return a linear number of bits.
Min-pops and inserts will in general be interleaved. The prototypical example has blocks of 2 inserts and 1 pop repeated n times. (All other interleaving patterns can be reduced to this one in linear time.)
QuickSelect doesn't work for this.
QuickSelect or median-of-medians approaches work if you only have a small constant number of interleaved blocks (of any arbitrary internal length). E.g., all the inserts first and then all the min-pops is equivalent to finding the k smallest elements.