I guess I'm just a luddite who spends my life in a CLI or a text editor. Taking my hands off my keyboard to leave fingerprints on my screen just doesn't make sense to me.
I think people who do tasks where a touchscreen makes sense are probably doing most of their work on an iPhone or an iPad anyway.
Now, gesture control on VR/AR setups? Sure, that feels like a new human-computer interaction system that makes sense. Jabbing at my laptop screen with one hand still on my keyboard, not so much.
It’s not. I had a ThinkPad with a touchscreen, and while I used the touchscreen seldom, it was useful in some applications — notably for developing touch-based applications.
Since then I’ve had an M1 MacBook Pro with the Touch Bar. It’s crap. I remember the keynote where they introduced it: a DJ mixed music using it. It’s ridiculous that it got approved.
> Notably for developing touch-based applications.

Ok, actually you're right, that's a use case where I'll agree it's probably useful. If you're writing iOS applications, it might be nice to run them in the Simulator and perform gestures directly, without having to offload to a physical device for testing.