If you want a foundation for computing, may I humbly suggest imperative algorithms in the integer RAM model. That gives correct answers to questions like "what's the time and space complexity of quicksort in the average and worst case?" Good luck answering that with lambda calculus, or Turing machines for that matter.
The RAM model is nice for simple space and runtime complexity analysis. Of course, it doesn't model e.g. caches or parallelism, either.
(Lambda calculus is an interesting foundation mostly for work on correctness of algorithms; analysing the cost of computations is indeed harder. Chris Okasaki's Purely Functional Data Structures is a good case study of analysing the runtimes of purely functional algorithms and data structures.)
If you are happy to prove correctness of tree-sort, which performs the same comparisons as quicksort, it's straightforward.
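For concreteness, here's a small sketch of what I mean (my own illustration; the names `treeSort`, `insert`, `flatten` are just picked for this example). Tree-sort builds a binary search tree and flattens it in order, and the comparisons made while inserting correspond to the ones quicksort makes while partitioning. Everything is structurally recursive, which is exactly what makes the correctness proof easy:

```haskell
-- A binary search tree of ordered elements.
data Tree a = Leaf | Node (Tree a) a (Tree a)

-- Insert an element, keeping the search-tree invariant.
-- Each comparison against a node's value mirrors a comparison
-- against a quicksort pivot.
insert :: Ord a => a -> Tree a -> Tree a
insert x Leaf = Node Leaf x Leaf
insert x (Node l y r)
  | x < y     = Node (insert x l) y r
  | otherwise = Node l y (insert x r)

-- In-order traversal yields the elements in sorted order.
flatten :: Tree a -> [a]
flatten Leaf         = []
flatten (Node l x r) = flatten l ++ [x] ++ flatten r

treeSort :: Ord a => [a] -> [a]
treeSort = flatten . foldr insert Leaf
```

The proof obligations fall out of the structure: insertion preserves the search-tree invariant, and in-order flattening of a search tree is sorted, both by straightforward structural induction.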
If you want to talk about the clever in-place quicksort, you'll want to talk about a slightly more complicated model.
Either implicitly, where you still model everything as lambdas, or explicitly: you can model state (like an array) and its manipulation inside a lambda calculus framework just fine. E.g. using linear typing, or something like Haskell's state monad (which has a pure description, but GHC is smart enough to optimise it to what you would expect when translating to machine code).
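To make that concrete, here's a sketch of my own (using GHC's standard ST monad and Data.Array.ST, one instance of the "pure description of state" I mean, rather than linear types): the clever in-place quicksort over a mutable array, wrapped in a pure interface. The names `quicksort` and `sortST` are just for this example.

```haskell
import Control.Monad (when)
import Control.Monad.ST (ST, runST)
import Data.Array.ST (STUArray, getElems, newListArray, readArray, writeArray)

-- Pure from the outside: runST guarantees the mutable array
-- cannot escape, so callers see an ordinary function on lists.
quicksort :: [Int] -> [Int]
quicksort xs = runST (sortST xs)

sortST :: [Int] -> ST s [Int]
sortST xs = do
  let n = length xs
  arr <- newListArray (0, n - 1) xs :: ST s (STUArray s Int Int)
  let swap i j = do
        a <- readArray arr i
        b <- readArray arr j
        writeArray arr i b
        writeArray arr j a
      -- Lomuto partition, last element as pivot.
      sort lo hi = when (lo < hi) $ do
        pivot <- readArray arr hi
        let scan i j
              | j >= hi   = pure i
              | otherwise = do
                  x <- readArray arr j
                  if x < pivot
                    then swap i j >> scan (i + 1) (j + 1)
                    else scan i (j + 1)
        i <- scan lo lo
        swap i hi
        sort lo (i - 1)
        sort (i + 1) hi
  sort 0 (n - 1)
  getElems arr
```

This is the usual imperative algorithm with the usual analysis; the lambda-calculus framing only changes how the state is typed, not what the code does.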
So that would be the traditional analysis of quicksort, mucked up with linear logic terminology so you can't see what's going on.
The sweet spot of lambda calculus is structurally recursive algorithms on trees. You can use it for other kinds of algorithms, but elsewhere it just makes things harder, and I've never seen it solve a problem that wasn't solved first by other means.