Think we'll have the first job-niche shock from AI that the wider public notices. A bit like the online translation market just vanishing... except big enough for people to go "oh". Probably transport or warehouse related.
US democracy fundamentally breaks (more).
Substantial global turmoil directly resulting from the US losing its way. It will take time for an alternative order to establish itself in that void.
People's minds will become even lazier due to prolonged daily use of LLMs. They literally won't be able to think for themselves without AI assistance (that's why OpenAI won't fall, btw). Attention spans will drop even lower, causing severe psychological problems. Think 'Digital Dementia 2.0.'
Later, LLMs will be portrayed as something evil, yet everyone will still use them. Parents will use them while telling their kids not to.
Leetcode is already standard for SWE interviews, but other industries will need to adopt similar tests to verify that an applicant's brain is functioning correctly and that they're capable of doing the job. Maybe a formal confirmation from a psychologist specializing in 'fried brains' will be required.
My knowledge and engineering skill have only gone up over time. I read significantly more, higher-quality technical information and debug significantly harder problems.
I think AI will in general make everyone a lot smarter. Maybe the people who use AI as a companion will melt? I'm sure there's some kind of repetitive addiction loop that could melt your brain just like anything else, though.
Have you considered that you are in a minority that leverages LLMs for your own personal gain? The parent is referring more to the general population, who are already doomscrolling away and would LOVE a service that generates prompts for them, given the hassle this represents for them.
I understand it. For example, with AI you don't need to remember stuff. Like there is a command in macOS (two actually) to flush the DNS cache. I used to memorize it because I needed it like twice a week. These days, I can't remember it. I just tell Copilot to flush the cache for me. It knows what to do.
And it's like that for many things. Complicated Git commands that I rarely need. I used to remember them at least 50% of the time, and if not, I looked them up. Now I just describe what I need to Copilot. But also APIs that I don't need daily. All that stuff that I used to know is gone, because I don't need to look it up anymore, I just tell Copilot or Claude what to do.
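For reference, I believe the two macOS commands in question are the usual pair below; here's a minimal sketch that wraps them in Python (assuming a reasonably recent macOS and sudo rights) so they live in one place I don't have to remember:

```python
# Minimal sketch, assuming a recent macOS and sudo rights: the two DNS-flush
# commands I keep forgetting, wrapped in a tiny script.
import subprocess

def flush_dns_cache() -> None:
    # Clear the directory services cache.
    subprocess.run(["sudo", "dscacheutil", "-flushcache"], check=True)
    # Signal mDNSResponder to reload its cache.
    subprocess.run(["sudo", "killall", "-HUP", "mDNSResponder"], check=True)

if __name__ == "__main__":
    flush_dns_cache()
```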
Is that really a bad thing? It's like saying Google Maps makes you lazier, because you don't have to learn navigation. And, heck, why stop there: cars are just insanely lazy! You lose all the exercise benefits of walking.
Why is losing the ability/interest in navigating through a paper map by hand bad, though?
Humanity has adopted and then discarded skills many times in its history. There were once many master archers; outside of one crazy Danish guy, nobody has mastered archery for hundreds of years. That isn't bad, nobody cares, nothing of value was lost.
Hm, perhaps a way to export all your chats from any AI provider you use, then send them back to an LLM to sum up all the commands you use into a text file you can reference? (Rough sketch below.)
Like, I've started using etherpad a lot recently, and although I have Proton Docs and similar, I just love etherpad for creating quick pads of information.
Or, to be honest, I search the internet and ddg's AI feature does give me a short answer (mostly to the point), but I think there are definitely ways to get our own knowledge base if any outage happens, basically.
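Something like this rough sketch, assuming a hypothetical export format (a JSON list of {"role": ..., "content": ...} messages; adjust to whatever your provider actually gives you), which pulls every fenced code block out of the assistant replies into one plain-text cheatsheet:

```python
# Sketch only: the export format below ({"role", "content"} messages in a JSON
# list) is an assumption -- real provider exports will differ.
import json
import re
from pathlib import Path

# Matches fenced code blocks and captures their body.
FENCED_BLOCK = re.compile(r"`{3}[^\n]*\n(.*?)`{3}", re.DOTALL)

def build_cheatsheet(export_path: str, out_path: str = "commands.txt") -> None:
    messages = json.loads(Path(export_path).read_text())
    snippets = []
    for message in messages:
        if message.get("role") != "assistant":
            continue
        # Collect every code block the assistant produced.
        snippets.extend(FENCED_BLOCK.findall(message.get("content", "")))
    Path(out_path).write_text("\n\n".join(s.strip() for s in snippets) + "\n")

if __name__ == "__main__":
    build_cheatsheet("chat_export.json")
```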
lol, I also had all sorts of commands memorized for k8s and pandas that I don't remember at all. But let's all be honest, was it valuable to constantly look up how to make a command do what you want?
I wasted so much time on dumbass pandas documentation searches when I should have been building. AI is literally the internet; all you are doing is querying the internet 2.0.
I often kept vast ugly text documents filled with random commands because I always forgot them.
I can only speak from personal experience. But as a 21-year-old, I'd definitely say that AI has made me so much more unproductive and reduced my attention span immensely. My brain was already fried from social media, and now there is always an "easy" way to do annoying but very educational tasks. And amongst my peers, especially those without a background in IT, misunderstanding and anthropomorphising have made this even worse. I think for people who already have great skills, AI will probably be helpful, not harmful. But for my generation, which has been through covid and social media and now has to figure out healthy AI usage, this is a fight already lost.
Eh, there's literally always something like this being said and doom-and-gloomed over. When I was 21 I heard almost the identical statement you just made, Covid being the exception.
The only thing that matters is if YOU care. Do you like software? Do you want to learn and make something that was unattainable to you a year ago?
There's also a major difference between college and work so you shouldn't sweat it so much.
It's like saying in the 1970s that people would become dumber because of calculators. It's a tool. You can use it lazily and not learn much, or you use it actively as something that propels you further along in your learning.
No, it's not. With calculators, it was up to the person using the tool to figure out which formula to use and which values to plug into the calculator.
The human has to have enough understanding of the problem to know which math to apply and calculate in the first place. That requires understanding and discernment. This works the brain. This mental work strengthens our problem solving ability.
Whereas with AI, you just tell it the problem and it gives an easy answer. Thus involving no further work from the human brain which causes it to atrophy just like any other underused muscle.
You do know this reads the same as every pessimistic commentary on technology ever, right? So many people were convinced that television was going to fry our brains.
AI stays the top story but in a boring way as novelty wears off and models get cheaper and faster (maybe even more embedded). No AGI moment. LLMs start feeling like databases or cloud compute.
No SpaceX or OpenAI IPO moment. Capital markets quietly reward the boring winners instead. S&P 500 grinds out another double digit year, mostly because earnings keep up and alternatives still look worse. Tech discourse stays apocalyptic, but balance sheets don't.
If you mute politics and social media noise, 2026 probably looks like one of those years that we later remember as "stable" in retrospect.
> If you mute politics and social media noise, 2026 probably looks like one of those years that we later remember as "stable" in retrospect.
I love this, we focus way too much on the apparent chaos of daily life. Any news seems like a big wave that announces something bigger and we spend our time (especially here!) imagining the tsunami to come. Then later, we realize that most events are just unimportant to the point we forgot about them.
To me, this is wishful thinking. The more I see these "our jobs are safe" claims, the more I fear our jobs are not safe, and people are just trying to convince themselves which is an indicator of turmoil ahead.
What does "safe" mean? Unemployment in the US right now is under 5% which is historically very good (even though it's been slightly trending upwards over the past few months).
Employed. My contention is that AI is getting so good at doing tech-related things that you'll need far fewer employees. I think Claude Code 4.5 is already there. Honestly, it just needs to permeate the market.
I agree that Claude Code is a lot more effective than I was expecting, but I don't think it can fully replace human software engineers, nor do I think it's on any trajectory to do so. It does make senior engineers a lot more productive so I could see it reducing some demand for new grad software engineers.
>>> There are even more restrictions aimed at stopping kids using social media. Restrictions on phone use in schools, who can sign up for social media, etc..
Additionally, that it'll eventually prove to be a wild success, with significant benefit to kids.
On the darker side, the same technologies and restrictions will be applied in various ways to adults (similar to the porn verification laws), which will have significantly more negative effects.
There are some correct predictions here, it seems; a lot of this is happening around the world, so it will be interesting to see how it pans out in 2026. Thoughts?
YouTube will be so inundated by AI cat and dog videos that people stop watching them altogether. People will automatically assume anything labeled "cute" is fake.
I am in a hospital ward on a ward floor: everyone, every single person, every age, is scrolling 30-second AI 'funny and cute' videos 24/7. It is quite eye-opening, as I never got close enough before to actually see what they are doing. I asked why: they say it is funny and entertaining, and if no humans were involved, they don't care, as it is entertaining and fun. They'd rather have more variety than more original/human content.
I saw my neighbour here watching AI-generated shorts about fat Americans abusing all-you-can-eat restaurants, for 7 days in a row now, whenever he is awake. He is 45 years old and wants to show me the funniest ones.
I told some of them, and they basically said "so what". But I guess many really do not know. It is incredibly obvious, but if it's funny, I guess people will just laugh and not be critical.
Facebook is already like this. People didn't stop watching. In fact, they mostly stopped caring if it's fake. And no point in debunking the obvious BS: they scroll faster than you can say AI.
My mom sends me AI videos of cute cats doing the impossible (flying around the room, washing raccoons, etc). When I told her they were AI, she said “so what, they’re cute”
I wish, but from what I've seen, most "normies" don't care at all as long as they are entertained. It makes me depressed seeing my friends consuming and sharing all that AI slop while either believing it's true, or not even asking themselves if it is. At least they're having fun while they still can... right?
Significant economic turmoil. Either from private debt or AI finally deflating. Probably due to enough data center projects that won't materialize due to funding getting cut.
AI will cause the value of knowledge to plummet. This will be felt particularly by universities - graduates will be valued at near zero by business. Elite universities will continue to do well, but the value there becomes elitism and social status (this has been apparent for a long while already).
Value of experience will grow - compounding this will be less opportunity for junior people to get that experience so they will be increasingly (and rightly) frustrated.
The opportunities to have a successful career as a salaried employee decline significantly - people are pushed to start small businesses out of necessity - and this barrier is lower due to AI.
A flurry of new dev tooling companies start up re-packaging software development processes and concepts into a more user friendly package that allows software development to go even more mainstream to accommodate this.
The Independent Variation Principle, the unifying software design meta-principle, becomes widely known, recognized and applied across the industry and academic world.
This piqued my curiosity. Are you the author of the paper?
For this principle to become widely known, it needs to be communicated in a more succinct way. 400 pages is too much to ask people to invest.
Even so, as far as I understand the gist of what the paper says, the principle helps make explicit some intuition I’ve had about design for a long time.
It sounds like a retake/elaboration on the "Commonality and Variability Analysis" used in Domain Engineering and employed in "Family-Oriented Abstraction, Specification and Translation" (FAST) software engineering methodology.
This is less of a prediction, more of how I see this industry progressing.
I really feel "web dev" is going to get highly commoditized by GenAI - by web dev I mean 99% of building CRUD-adjacent apps. We are already seeing it now with tools like Claude Code etc.; this pipeline is just going to get more refined, with tighter testing feedback loops, PR workflows and a CI/CD deployment pipeline which the GenAI will control. This might be amplified by the sheer amount of tested, high-quality code there is in the JavaScript ecosystem for the AI to train on and learn from.
Software engineering in general will tend more toward systems and embedded software, fields where GenAI can't perform well or can't be trusted to produce good code (I am thinking writing device drivers, or maintenance of legacy C applications), as well as deep research fields. The average software engineering job might be that of a "technical product manager", a "researcher" or a "low-level systems expert".
That's just what I feel. Honestly, I am probably much younger than others on this forum, so I haven't really seen this industry "evolve"; this is just how it looks to me now. I believe there was a time in the early 2010s when there was a boom of the "generalist developer", where if you knew your JavaScript ecosystem (or app dev ecosystem, for that matter) pretty well, you could land a pretty decent job right out of college, or without a college degree at all.
To me, at this stage the world in general needs software engineers who understand the "world" if that makes sense (in terms of physics, mathematics), or who have a really good mental model of computation. Better put, software engineering will become a tool in the larger context of research & development of tech that advance humanity.
Yeah, I think that's a fair point. I mean, we have SRE, which is a whole skill set in itself, and it does revolve around maintaining "CRUD apps"; heck, there is the cybersec industry as well, ensuring CRUD apps aren't exploited. My paragraph was certainly a bit more 2D; I was speaking more from what I might benefit from learning now to get a job, say, by the end of 2026. I have observed that many roles like SRE require certain experience which is hard to get by working on your own projects without much traction; similar case for cybersec, but to a lesser extent.
It's this paradox where in order to become a senior engineer, you must get hired as a junior engineer, to learn and observe how production software works, but that is pretty hard these days.
> It's this paradox where in order to become a senior engineer, you must get hired as a junior engineer, to learn and observe how production software works, but that is pretty hard these days.
Yes, that's true, but I would argue this isn't because of AI itself. Sure, it might have accelerated it, but I have heard that in the early 2020s a lot of people got jobs in the industry and the market then felt saturated. In a sense, a lot of jobs in the economy feel this way: if someone has one, they keep having it, but it's becoming harder for new people to enter. There are financial reasons for this too, if I remember correctly; interest rates are one of the core ones.
That being said, even though I have used LLMs a lot, I do not agree with your opinion, and I am most likely even younger than you (17).
A lot of people my age / people just going into college use AI a lot to cheat, but I will genuinely try to take it slow and learn things. I will try to use it as a learning tool and not as a crutch. Currently I am still in high school, and I get ideas I wish to implement but don't have the time for because of exams, so I test them out with LLMs. But even I find the whole process frustrating at times, and maybe it's just me, but there is a clear ceiling to what they can handle. Some basic CRUD processes can definitely fall within it, but if your project is novel, even basic things can be hard.
I will give you an example. I recently tried creating an API for Proton Docs that worked by driving browser instances. I had two basic scripts that worked on one platform, converted them to Puppeteer using an LLM, and after that worked, I asked it to simply create a very basic CRUD API on top of them.
Nope. I tried it 5-6 times on the best models on the market, but they couldn't take the two files that handled read/write and make them work together; either there was an issue in read or an issue in write.
I am fine with the project as it is right now, and I am fine with the templates it has given me, and I am going to build on top of it now.
I had many ideas that could be considered basic CRUD apps, like a kanban app / GitHub-Issues-like UI on top of Bitwarden, after I saw a post here by simon saying how he uses GitHub Issues, which went semi-viral on Hacker News.
Another issue: just as writers and similar can now produce AI-generated work, I feel there will be more trust placed in code that is not AI-generated.
Personally, I will try my best to create prototypes with AI, and if I like them, I will see what they do and then rewrite them myself as a learning experience. People talk about spec-driven development and the like, but I just want to code things by hand at this point, or convert the LLM-generated prototypes of an idea into something I later code and understand by hand, tbh (most likely when I get into college).
Why can't GenAI produce good code for embedded systems? There is nothing fundamentally different between the two. Seems like some form of NIMBYism - "AI can't do what I can because I am special" sort of thing.
Any international trade happening in non-US-dollar terms is an "acceleration" of dedollarization, given that currently all trade is dollarized.
Original:
I don't know if I understand the spirit of the question. There are no metrics or tracking of how "dollarized" the international economy is, because virtually all trade is done in dollars and there has been no risk of this changing for the past ?65? years.
My original prediction for 2026 (not 2025, mind you), made in 2024, was based on a few factors, if I recall correctly: 1) the Russia-Ukraine war and related sanctions; 2) the open discussion in BRICS about trade in direct currencies; and 3) the then-promised isolationist policies of Trump's second mandate.
What we've seen in 2025 makes all of those stronger I guess.
1) I don't know the status of the Russia-Ukraine war; all I know is it is a glaring sign of the end of Pax Americana (as is the Israel-Palestine genocide/war, by the way, which the US would -not- have allowed given the strong internal opposition and negative popularity among the American public). So, sanctions on Russia notwithstanding, that status quo, which is strongly tied to the dollar as reserve currency, is either dying or dead.
2) Trump and the US attempted a very strong response to the (first ever?) trades in local currency between Brazil and China, mostly in the form of targeted tariffs (all the discourse about Bolsonaro and whatever being the motivation for tariffing Brazilian imports being, in my opinion, political smoke and mirrors). I don't think it worked. I think BRICS might press on this. If the Mercosur/EU trade deal goes through, it will also be a strong force toward trading directly in Euros/Reais/Pesos, I believe.
3) I don't think I need to clarify; tariffs and other issues brought on by the current US government were severely damaging to the standing of the US as a preferred trade partner. This goes beyond economic choice; active anti-American sentiment is at an all-time high in countries like Canada, Denmark, and Mexico, all historically aligned with the US. This may not seem that relevant, but come election cycles in those and other countries, we might see platforms/candidates that openly propose to "secede" from the US-hegemony international order. If that happens, removing US military bases is a first step, alternatives to the US "petrodollar" a close second.
So far... my track record has not been great; maybe I was being a little too optimistic. As I shuffle various futures in my head, let's start from more likely and move toward more fun:
- Unabated push towards 'Snow Crash' level of extremely localized power structures at the expense of federal government ( think K shaped economy, but for governmental structures )
- Actual further descent into K shaped economy -- that.. I fear.. is a very safe prediction to make now
- Midterms will see some localized politically motivated violence (likely across the spectrum, barring some pressure-valve release)
- Shadow wars will continue
- Bitcoin will crash; monero will replace it as dollar falls
- Companies and government will desperately work together to contain a severely distributed ASI-level entity that exists as hidden braille/invisible characters across all known fora
- I manage to move to full WFH
- Valve releases HL3 on Frame
- Fusion power will get closer by two kiloseconds
Honestly, I think it ran out of speculation runway and the current 'crashing' is just people cashing out the profits they made on the recent swings. It doesn't take away my 2026 prediction tho :P
I suspect that 2026 will be the year we see a big breakthrough in the use of LLM agent systems. I don’t know what that will look like but I suspect the agents will be doing meaningful research (probably on AI).
- EVs will be in the Early Majority group, and 2/3 of them will be Chinese. At least 1 in 4 of new cars purchased will be EVs.
- AI will innovate towards visuals, personality, and tool use. AI tool use will start to innovate past just reading docs, maybe into more things like gaming and robotics.
- Some AI products (not necessarily LLMs) will start competing on latency. Notably on voice/calls, but also things like drones, robotics, etc.
I think EVs will head in the complete opposite direction in that sales will slow down and they will continue to be a minority.
Ford just killed off the F150 lightning and EVs (but also new cars) are still expensive purchases in a time with a lot of economic uncertainty.
While Chinese companies are making affordable options all the markets seem to love putting tariffs on them in order to keep their homegrown automakers alive.
Looking into this, this reinforces my predictions? I looked up the Ford F-150 Lightning and a quote catches my eye: "When the electric truck debuted in 2022, Russia had just invaded Ukraine, disrupting supplies of nickel, a key material in EV batteries."
Raw materials and cost is a big part of the Chinese dominance on EVs and it'll continue to be on that side of the political sphere.
Having to tariff China also emphasizes that they're gaining ground too quickly. They process about (over?) 80% of the major parts, so you can't fully tariff them either, only assembled cars or some parts.
I just read on a Polish automotive portal that the government has concerns about cybersecurity in Chinese cars. I wouldn't be surprised if Chinese cars were entirely banned for some businesses in the future.
1. Bazel is still not widely used outside of massive monorepos (because it's such a pain to use).
2. Solar power will surpass wind power in the US to become the 4th largest source of electricity. https://eia.languagelatte.com/
3. Starship begins launching real payloads, achieves reusability of the upper stage, and successfully does a ship to ship fuel transfer.
4. Tesla stock has a major correction (>20%) as it becomes increasingly clear that Waymo, Zoox, AVRide, and various Chinese companies are significantly ahead in AV technology. And as it becomes clear that Optimus is a sham.
Chinese cars will always be irrelevant to the US car market, as both political parties will block Chinese vehicle sales on (valid) national security grounds.
Uber and Lyft stocks crash as markets realise the game is up - nobody can compete with Tesla, which can afford to burn excess spare factory capacity by driving cars directly off the line to start picking up passengers. Waymo might have good AI but can't possibly compete with Tesla's unit economics.
>20% is not a major correction. It just recently doubled. Even the price before it doubled wasn't considered undervalued, so anything less than 66% down is not a major correction.
I'll correct my take then: due to this, the epidemic of loneliness will start to surge like never before. This might pave the way to some reaction in the public opinion, but real concrete actions will not happen in 2026, I would rather expect them around 2028 or even 2030.
In the software engineering world, in 2025 we saw a wave of code-assistant products. In 2026, we will see a wave of software-architecture design products, not just for greenfield projects but also brownfield projects.
There will be a cliff in the number of people in this industry. There will be a bunch of senior people floating around the job market and no more junior talent. There will be fewer new grads and the pipelines will dry up.
If they go down the route of automating as much as possible, it'll destroy the social pipelines that allow companies to reproduce themselves.
I don't think that'll actually happen, but it'll be interesting to watch
What are your opinions on accepting junior talent, say, 4 years from now?
Would there be any changes? I feel like a lot of junior talent isn't picked because of AI, but given a long enough timescale the AI bubble will burst, so taking that into account, what would you say about the job market?
Unless there is exponential growth in model context, AI is not replacing anything. So far every new model launches with some increase in advertised intelligence, but only with increasingly dubious ways to measure it.
All the current crop of AI does is make the job of some people easier and allow some others to level up faster in terms of output. If management is going to be stupid enough to destroy the human side of the business, then they are only setting themselves up to spend much, much more hiring the right people in the future.
- Current LLMs being seen more and more like commodities
- We'll start to see a LOT more companies making their own LLMs
- Capital stops pretending everything is fine & we finally see the stock market reflect what's been slowly bubbling up socially (people 'feeling poorer', trust declining, consumption becoming more cynical)
I predict certain political factions in the US will spend much of 2026 looking for ways to introduce a delay in the 2028 elections, such as pursuing a war so that Congress can postpone elections until they can be held safely. Which has never happened before, but who knows what absurdities will be given the OK these days?
Big mergers in the tech world with excuses including "AI". Hopefully - some reversal of sticking AI to products that don't need it. Even more massive layoffs. And the worst part - still no Half Life 3.
I think that food instability will be a serious problem. I don't think that the store shelves will be empty per se, but rather I believe it will be priced so high that people will be starving because they simply can't afford it.
It's already happening, but I believe it will accelerate in 2026, especially with the Fed turning the money printers back on. Inflation is sure to increase :-/
Physical AI will make subversive discoveries that exceed everyone's expectations - space-time integrated computing, rather than the current three-dimensional spatial computing plus discrete time steps
The problem lies here: the current "embodied artificial intelligence" still uses 19th-century numerical methods (the Adams-Bashforth integration method from 1883 and the Runge method from 1895) to represent time frames + three-dimensional space calculations to approximate four-dimensional spacetime (relativistic covariance has proven that spacetime is an integrated whole, i.e., four-dimensional spacetime). I will release more specific code later - you might wonder, don't the "scientists" at those big companies know about this? The answer is that they do know, and I will also release the reasons later, which will definitely surprise you!
Let me explain again why I said "disruptive" rather than "substantial":
While 'substantive' would mean major progress within the current framework, I’m predicting a shift that subverts the current foundational assumptions of robotics.
Right now, we treat time as a secondary sequence—an 'add-on' to 3D space. Moving to a unified spacetime architecture isn't just a big improvement; it fundamentally undermines the discrete-frame logic that almost all current CV and RL models are built upon. It’s 'subversive' because it requires us to unlearn the way we’ve been processing motion for the last decade.
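For context on the 19th-century methods named above (and not the code being promised here), a minimal sketch of a two-step Adams-Bashforth integrator - the kind of discrete time stepping being criticized - applied to a toy decay equation:

```python
# Minimal illustration of two-step Adams-Bashforth (1883-era discrete time
# stepping, as mentioned above), solving dy/dt = -y as a toy example.
def adams_bashforth_2(f, y0, t0, h, steps):
    # Bootstrap the second point with one forward Euler step.
    ys = [y0, y0 + h * f(t0, y0)]
    t = t0 + h
    for _ in range(steps - 1):
        # y_{n+1} = y_n + h * (3/2 f(t_n, y_n) - 1/2 f(t_{n-1}, y_{n-1}))
        y_next = ys[-1] + h * (1.5 * f(t, ys[-1]) - 0.5 * f(t - h, ys[-2]))
        ys.append(y_next)
        t += h
    return ys

if __name__ == "__main__":
    decay = lambda t, y: -y  # exact solution: y(t) = exp(-t)
    approx = adams_bashforth_2(decay, 1.0, 0.0, 0.1, 10)
    print(approx[-1])  # close to exp(-1.0) ~= 0.3679
```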
Allegedly AI recently discovered vulnerabilities in React Server that were/are being exploited on unpatched systems, so that's subversive, and we might expect a lot more of it before it gets better.
> Major breakthroughs in robotics research thanks to ultra scalable simulation software and RL
I was hoping that plumbers, electricians, construction, factory workers still had another 10-20 years before being priced out of the work pool.
I was also hoping we had that much time to use threat of mass protest to keep our politicians aligned.
I genuinely think it's game over for 99% of humanity's interests once the top wealthiest individuals and/or their governments can simply deploy robots en masse.
Hey if developers and office jobs get fucked by AI “promises” then fuck blue collar as well. You need a united population to destroy the interests of the 1%.
* Americans will get their first taste of extended range EVs (full EV powertrain with a tiny ICE that charges the battery) and they explode in popularity. It's the perfect vehicle for the US and most investment in EV charging stations will decrease.
* Oral GLP-1s hit the market and the market share doubles
* Both OpenAI and SpaceX IPO
* Charlie Kirk's shooter will be executed after being on death row for less than a year. 50/50 chance that it's televised.
* Luigi is also executed
* Seattle causes an international incident with Egypt and Iran when they don't reschedule the Pride Parade to not be on the same day as the world cup game. Trump sends in the troops.
> Americans will get their first taste of extended range EVs (full EV powertrain with a tiny ICE that charges the battery) and they explode in popularity.
This happened, it was called the Chevy Volt. Nobody bought it.
I totally agree that you can make the argument that people didn't buy them because they weren't sexy. Post-2008, new cars became luxury items almost exclusively. So given that, there's no reason they would catch on now unless somebody makes a sexy one.
Oh that's interesting, I didn't know that. In this case I am talking about the Ram 1500. I think this one will make a splash. A truck with almost 700 miles of range. Americans think they hate EVs but they really hate the lack of EV infrastructure.
Oh, I would like that, but I saw that the amount of compiler optimization work (and optimization in general) for architectures like ARM or Intel is so high that RISC-V was unable to keep up in performance.
I think I really like RISC-V, but I saw the performance numbers and I am doubtful about this, so I am interested if you have any evidence/resources I don't know of to back up this claim.
The collective West will continue its negative momentum. People will continue to seek easy solutions to hard challenges and vote accordingly, making things worse. We'll see further fragmentation of the post-WW2 international order. China does not interrupt while others are making mistakes.
The US stock market will see a correction (and possibly a crash) due to it being generally overvalued, LLM advances unproven for most enterprise contexts, and premature investment in infrastructure. Other stock markets will be dragged down because everything is correlated these days.
Building software continues being commoditized, putting downward pressure on salaries. Software quality diminishes due to prioritizing speed and using LLM output.
Billionaires continue taking control of media companies and continue using them to influence public thought and discourse. Big-budget media production companies show no signs of newfound creativity or risk taking and continue producing remakes, sequels and prequels within existing franchises.
Many people have further diminishing mental health due to a severe value crisis and lack of human connection.
The neurological, cardiovascular and long-term consequences of repeated SARS-CoV-2 infections become better known and understood. But nothing is done to prevent further damage.
1. There will be an economic meltdown by summer. Similar to 2008, likely worse.
2. Some kind of extreme US constitutional crisis.
3. Putin will remain in power, but Russia will be struggling dramatically to keep funding the war as Ukrainian attacks keep chipping away at oil and gas exports.
Engagement bait will get amplified even more, with the goal of gathering human generated data and user retention.
Lots of new subreddits have been popping up this year, in different languages, that are flooded with AI generated rage and engagement bait posts. Facebook is feeding boomers with the same kind of slop.
Debt and Liquidity Crisis in the Bond Market that spreads to other areas. Probably first signs will be in the Repo market and go from there.
Zelensky will be replaced as Ukraine's leader and Ukraine as a whole will switch back to Russian control, moving the new front line to other former Soviet states and Poland. Also, all of the weapons provided to Ukraine will begin pointing at Europe instead of Russia
China will begin blockading the islands around Taiwan, leading to a blockade of Taiwan itself. Odds are there will be a peaceful reunification with China instead of a war.
Maduro will be overthrown in Venezuela, and it is all being used as leverage by the U.S. in its negotiations with China and Russia
The GOP will retain control over the House in the upcoming Midterms
A large labor reduction in many high paying jobs, such as tech, due to higher productivity provided by AI. In normal times, this would cause a kind of deflation to balance everything out. Due to the pending economic crisis, the Govt will spend even more money to try to maintain the current system, leading to more inflation.
The cost to service the current debt will be used to justify a large reduction in Federal Govt agencies and services.
The Atlas comet will not be an alien spacecraft
The public will learn more about the extensive surveillance state and be less approving of it.
Maybe the US government and elites will finally admit the economy is in a recession.
Not this dress-up and cosplaying being done - like the economy is growing - while it's actually shedding jobs and homes ain't selling because there ain't buyers.
Crypto, as an economic and innovation ecosystem, is largely dead, with the partial exception of stablecoins (which are not inherently “crypto” in the ideological/decentralized sense).
The core issue is structural: most cryptocurrencies depend on foundations that fund development, marketing, and operations through large token holdings. When prices fall, these entities eventually become forced sellers. At some point, their “whale power” is not optional but necessary to survive, creating persistent sell pressure and undermining long-term trust.
Bitcoin likely has a longer lifespan than most alternatives due to its lack of a foundation, fixed monetary policy, and social inertia. Over time, it may absorb whatever residual trust exists in the broader crypto space. However, that does not imply indefinite relevance: Bitcoin could survive while gradually becoming economically marginal, not dead, but increasingly negligible outside niche use cases.
Yup, I wrote a similar comment where I said stablecoins do seem to be a good idea, but I still believe the whole industry, even with stablecoins, will tend toward becoming even more regulated and more integrated with centralized finance.
Though in my opinion, it will be centralized finance that runs stablecoins, not DeFi/crypto companies that connect with or partner with centralized finance.
Or, as I said in that comment, with the rise of finance superapps like Coinbase, their stablecoins might have more influence. Either way, I feel it's going to be extremely hard for new players to enter the stablecoin market.
The analysis requires understanding the total addressable market of greater fools. Bitcoin, like any speculative asset, has value as long as people believe its value will go up. When there are no more greater fools to buy in, its value stops going up, so it loses value as a speculative asset. People will sell off to buy other speculative assets, causing a price drop and accelerating sell-off from others who do not want to be bag holders until the asset reaches its intrinsic utility value. Since Bitcoin is strictly worse than alternatives as a form of payment, that value is 0. While this is the ultimate price of Bitcoin, correctly estimating when it will happen requires an understanding of the addressable market, which can change sharply as different countries implement regulation.
I predict more layoffs as I unfortunately already did in 2024 [0].
But let's just say you have to prepare for 2030. The future of jobs report 2025 by the WEF is also reporting that 40% of employers are planning to reduce their workforce because of AI by 2030. [1]
Climate problems are shown to be less serious than previously thought.
The rich get richer, but nobody cares because quality of life is improved for everybody.
AI brings additional leisure time, which results in a worldwide resurgence in bluegrass music as millions take up the guitar, mandolin and violin. The biggest surge is in banjo, though. Billy Strings leaps ahead of Taylor Swift in concert sales.
2026 is really looking up! Happy New year, Hacker News!
It's like Penn and Teller say: "Everyone thinks the world is getting worse, but it's always getting better." I hope you're right. Maybe it's just my age creeping up on me and turning me into a cynic. Anyway, good to know there are still people who think positively.
I think I am young, but I do feel like we base our expectations of 2026 on 2025, which, let's admit, was not good. I do not think there will be a sudden change from 2025 to 2026 that could fundamentally alter reality if the fundamentals stay the same, which in my opinion they kind of will.
I feel like 2026 will be like 2025, just a little better maybe, or a little worse.
Yes, I agree with this sentiment. A little better or a little worse, hopefully not too bad. But I think the same feeling of dread will persist, especially going into 2027, particularly around artificial intelligence for tech workers. My concern is that the switch won't be gradual. One day someone will come out with a model that can do everything valuable this software engineer can do. Claude Code 4.5 almost hits that mark for me. In a year, I can't imagine…
It might not be AGI, it might not be able to do everything everyone could do, but it'll be enough that you can fire most of the people on your team and just use a few to guide the AI. And I think that's gonna tighten things up in the job market even more. Hopefully it leads to more job creation and it empowers people to compete with one another based on taste and individuals to compete with organizations. That's the best-case scenario I see.
There will be a very strong emphasis on financial superapps. I saw Coffeezilla's recent video about how Coinbase is expanding into stocks and other stuff too, but they are also expanding into prediction markets, which are eating away at gambling / basically are gambling apps themselves.
Coffeezilla (and me too, right now) basically shouts that it's like a health app trying to give you junk food out of a back alley.
But I feel like these trends will continue to grow. I was partially interested in stablecoin markets, and it seems that apps like PayPal, Venmo, Cash App, etc. are essentially positioning themselves around stablecoins, so there is a chance these financial superapps will do the same in the near future.
There will be more emphasis on banning VPNs in some countries. I feel like the UK (and by extension other countries too) will keep gatekeeping their populations from websites that don't comply with their arbitrary rules, and with VPNs being the last hope for many, I feel a real crackdown on them is gonna come in 2026.
I feel like centralization and extremism might continue creeping up in 2026 too, perhaps. To me there is this illusion that there are many players in any marketplace, but it does feel like they are centralizing (a few major tech apps, a few major finance apps, etc., which will influence a lot of the average person's opinions), and we just saw something similar happen with DRAM prices, which I also think won't really come down even in 2026.
I am seeing people fed up with centralization and big tech and more interested in homemade-style solutions created by normal people. Projects like clippy and others have made more people aware of such things, and I think it will continue in 2026 as more people see this real "AI tax" of increasing hardware prices.
Centralized finance and decentralized finance (although I don't appreciate crypto) feel to me like they will kind of converge on stablecoins and then merge / are already merging, and in my opinion the company with more ties to centralized finance could win. Tech and finance seem to me like they will keep coupling at a rate similar to 2025.
The AI bubble has a chance of popping in 2026. I am not sure about that, but either way, it's a matter of when, not if, in my opinion. If the AI bubble does pop, I feel like finance and tech will mostly decouple, because it seems to me that if an offer as lucrative as "AGI" fails in the eyes of financial investors, they will stray away from real tech businesses that make real-life problems much easier but won't have the same lure/lucrativeness once the bubble pops.
Otherwise, we are gonna see the same thing where people slap AI onto anything to get funding. I do feel like if the AI bubble pops, the companies that picked up the AI hype when there was no need might be shamed for it. But that never really happened in the crypto space, because those companies went straight to AI after crypto, so I think we would need some time between an AI burst and the next hype to give people time to think about what really happened and the scale of it, if it does burst.
In 2026 we’re still in 1998, not 2000. The AI bubble has room to inflate, the chips and “platform” stocks will keep printing new ATHs, and everyone will tell themselves this time it’s different because “it’s infrastructure”.
What actually lands inside big companies will be much less cinematic: narrow, localized automations in finance, HR, ops, compliance. A cluster of weird little agents that reconcile reports, nudge workflows, and glue together legacy systems. Not “AI runs the business end to end”, just a slow creep of point solutions that quietly become indispensable while the slideware keeps talking about transformation.
China's market power will continue to grow around the globe. There might be some issues with their economy, but they will increase exports in all relevant industries.
The solar revolution in Africa will continue, driven by even cheaper Chinese goods. How much Africa is changing will be a story around the globe.
CATL's sodium batteries will reduce prices of energy storage to a new, unexpected low - cheaper and faster than expected, driving further success stories.
Germany (where I'm from) will continue struggling, and it will get worse for the automotive sector (especially for the suppliers).
Due to economic issues in Germany, the far-right Nazi party AfD will continue winning a few more percentage points.
Some car maker will bring out a relevant, cheap, globally available EV. I suspect BYD, which will increase public awareness of the state of Germany's automotive sector and of automotive around the world. Automotive will start becoming a commodity. Revenue per car will go down.
Climate will see some new records: either more and stronger storms, more heat, and/or more flood rain. We will talk about it temporarily and then go back to ignoring it.
= USA vs. the rest of the world:
Stuff will continue to be as shitty as it is. Companies will not compensate for the inflation Americans have seen, so they will continue to struggle.
There will be 2-3 attacks on public figures. If the economy continues downwards, potentially also on some CEO or very well-known rich person (Musk, Suckerberg, Bill Gates (after the Epstein files), Jeff Bezos).
The Epstein files will come out, but with every public media outlet playing it down and no one caring about it anymore anyway (we all know: he is dead, Maxwell might get out of jail), it will not lead to a coup. I will learn the full extent of Bill Gates's affairs and will strike him from my list of people I thought had a positive story arc.
Republicans will lose Midterms harder than usual/landslide style.
Conflict between Trump and the House and Senate will lead to either Trump doing more golfing or the USA sliding further in an authoritarian direction. The only issue here: Trump is too old and fragile. He couldn't care less about his power, because he just does what he wants anyway, but he can't come up with a more elaborate "USA Trump Corp" strategy to really take over. Meanwhile, authoritarian takeovers depend on a person, and there is no one really there. Vance? No way.
Hey, let's bet on Vance and the wife of the Kirk guy coming together and the Republicans liking it that much - but no, I don't think this will play out.
Obamacare's budget cuts might be a trigger point for something.
Russia:
Russia will continue influencing US politics, but its economy will continue going down. We are coming to the magic 5-year mark. Ukraine's strategy of attacking Russian oil is working, but Russia is Russia; people can suffer there.
Derek Huffman will die.
AI:
AI will continue growing. Progress will still be seen in a lot of different areas. Nvidia will focus on increasing production and will see another hit with Rubin. We might see some very special new thing - perhaps a materials science discovery, mathematics, or just generally much better LLMs.
It will continue affecting jobs around the world, especially in fields where GenAI is already really good: the arts. There will be more AI music on the radio and in the top charts. GenAI for images will continue reducing jobs in 2D and 3D art, advertising, copywriting, and translation.
Robots will continue progress and impress us.
Luigi:
He will get a life sentence :(
Musk:
SpaceX will have successful launches. While Tesla struggles in car sales, he will continue hyping, and, overshadowed by SpaceX, Tesla will not fall yet. The self-driving thing will kill someone; his robot will not be ready.
To each their own. I can definitely understand that sentiment, although I'm 46 years old, and I've turned more conservative over the last few years, I still really like Obama. I also like Trump, though. It's kind of weird. Don't want to start a flame war though, I respect your opinions.
- Major disruptions on the financial markets due to the USD losing value. Commodities go up. Not certain about Bitcoin/Crypto as it is backed mostly by the USD itself.
- US goes on the offensive on tokenization, moving bonds, stocks, transfers, etc. to the blockchain. China opens up its eCNY to the general public.
- AI bubble pops. Companies decide that LLM coding is not worth it after accounting for the downsides. LLMs for generating photos and videos are still not good enough. OpenAI goes the way of Pied Piper.
- The war in Ukraine remains unresolved. Russia advances but only marginally. Europe still shaking its head about what to do. US lower its involvement.
- The US world cup goes very badly. Like embarrassingly bad. Goes okay in Canada and kinda okay in Mexico though.
Flock and other government tools for watching and controlling you will expand.
Big companies will expand their regulatory capture, especially in medical care. Fingers will continue to be pointed at health insurance as the problem while the real problem of an artificially limited supply of doctors goes unaddressed.
Government agencies will continue their slow bloat as no mechanism exists for government like bankruptcy in the private sector.
Patent trolls will expand their lawsuits and extort more legitimate businesses.
The far left will assassinate more Republican leaders.
Given that most assassinations of US politicians (including attempted ones) in 2025 targeted Democratic politicians, I would have different expectations for your last point.