That’s for future unreleased capabilities and models, not the model released today.
They did the same thing with gpt-5.1-codex-max (code name “arcticfox”): they delayed its availability in the API and initially limited it to monthly plan subscribers, which I found very annoying as an API user.
Commenters here seem to be missing the larger David vs. Goliath story...
Netflix was a Silicon Valley start-up with a tech founder (Reed Hastings) who teamed up with an LA movie buff (Ted Sarandos). They tried to solve a problem: it was too hard to watch movies at home, and Hollywood seemed to hate new tech. The movie industry titans alternated between fighting Netflix and making deals. They fought Netflix's ability to bulk purchase and rent out DVDs. Later, they lobbed insults even while taking Netflix's money for content licensing. Here's Jeff Bewkes, CEO of Time Warner, in 2010:
"It’s a little bit like, is the Albanian army going to take over the world? I don’t think so." [1]
Remember: this was the same movie industry that gave us the MPAA and the DMCA. They were trying to ensure the internet, and new tech in general, had zero impact on them. Streaming movies and TV probably wouldn't exist if Netflix had not forced the issue.
Netflix buying HBO is significant, but also just another chapter in this story of Netflix's internet distribution model out-competing the Hollywood incumbents. Even now in 2025, at least 12 years after it was perfectly clear that streaming direct to the consumer would be the future, the industry is still struggling to turn the corner. Instead, they're selling themselves to Netflix.
I was at Netflix 2009-2019. It was shocking how easily our little "Albanian army" overthrew the empire. Our opponents barely fought back, and when they did, they were often incompetent with tech. To me, this is a story about how competent tech carried the day.
Netflix has been rapidly buying and building studio capacity for a decade now. Adding the WB studio production capacity is a huge win for Netflix. It makes those studios more productive: each day of content production is now worth more when distributed via Netflix's global platform.
Same with the WB and HBO catalog and IP: it's worth more when it's available to Netflix's roughly 300 million members. Netflix can make new TV and films based on that IP, and it will be worth more than if it were only on HBO's platforms.
It’s nice to see a business that rewarded customers with convenience win in the end.
Well, except for Netflix refusing to let their catalogue be indexed in the TV app on macOS and iOS. I won’t pay for Netflix until they drop that anti-customer practice.
If you want me to buy the video content you’re selling, it had better be searchable in the TV app. And if not, there should be a better reason than wanting to keep people trapped in the Netflix app.
Have you considered the possibility that much like App Store rules, Apple's requirements for "catalog indexing" go far, far beyond the Netflix catalog merely showing up in the TV app?
Perhaps the judgement about Netflix being anti-consumer might be hard to sustain if you could more fully inspect the details of what Apple requires.
Everyone else allows their content to be indexed, and does not pay Apple anything for it. Disney, Paramount, HBO, Peacock, they all could have refused like Netflix.
Had to read this a couple of times to figure out why, as of this moment, you’ve been downvoted, because this seems like one of the more insightful comments on here. Maybe it’s too inside baseball about the post-deal opportunity? Anyway, not supposed to talk about downvotes, so…
You were there for a while. Was/is studio capacity still a constraint on production? You read so many stories about how LA studios are struggling to fill space because all of the productions have left town for tax credits elsewhere. Curious whether you’re still plugged in enough to know if that’s still true of their studio space. I assumed their interest was strictly a content play and the extra studio space might actually be an anchor they were willing to drag along to get the content/IP.
Yeah, it's ok, can't win 'em all. Lots of negativity in this thread. Maybe people have a gut feeling that "Netflix buying WB" fits into the preexisting narrative about media consolidation, and they're reacting negatively because they see media consolidation as a problem. I think that's more of a problem in the news media than in entertainment media. In entertainment, the bigger story is the tech-centered transitions, esp. to internet distribution. I don't think the consolidation narrative is a perfect fit in this case; this is a pretty different type of consolidation than the others in recent memory.
I think this is about Netflix's model reflecting a fundamental technology shift; any company not participating fully in that shift will be operating less and less efficiently compared to those that are. Look at the inside history of HBO's attempts to build a streaming platform; in the early 2010s their leadership knew they probably should, but were their hearts in it? Did they have executives with competence in this area? No, they outsourced it and mismanaged it. Repeatedly. But like you said, my view includes being a former Netflix employee so maybe I'm biased.
I don't have current information on whether or to what degree studio production capacity is a constraint. Content spending was publicly projected to grow, so studio capacity had to grow, which is why Netflix decided to build giant new studio facilities in New Mexico and New Jersey. Those were referenced in the Q&A Netflix held Friday morning [1]. Wild guess: Netflix's own studios run at full capacity, which is why they're continuing to expand them. I'd love to know if WB studios run at capacity.
> I assumed their interest was strictly a content play and the extra studio space might actually be an anchor they were willing to drag along to get the content/IP.
Doubt it. Like I said, I'm not an insider on that question and I'm 6 years out of date. But if I had to guess, it would be that WB studio capacity will be a highly productive asset for Netflix -- most likely, it will be more valuable connected to Netflix's global distribution model than it was when operated under WB's model.
Having a bad manager in past roles can be some of the best "manager training."
If one of your past managers did something recommended in this article but it caused problems, that's ok! It just means you have seen another failure mode that the author didn't experience.
I remember being in a meeting with a bunch of the best managers at a former company. "Why did you originally want to be a manager?" was one of the first questions passed around the circle. The most common answer was, "I had this one really bad manager and I figured that surely I could do better."
>"I had this one really bad manager and I figured that surely I could do better."
That doesn't mean those people learned from prior bad management. It could instead mean that management tends to attract people who are convinced of their own above-average competence.
Did anyone ask the reports of those managers whether they were any good?
They recently resolved two bugs affecting model quality, one of which was in production Aug 5-Sep 4. They also wrote:
Importantly, we never intentionally degrade model quality as a result of demand or other factors, and the issues mentioned above stem from unrelated bugs.
Sibling comments are claiming the opposite, attributing malice where the company itself says it was a screw-up. Perhaps we should take Anthropic at its word, and also recognize that model performance will follow a probability distribution even for similar tasks, even without bugs making things worse.
> Importantly, we never intentionally degrade model quality as a result of demand or other factors, and the issues mentioned above stem from unrelated bugs.
Things they could do that would not technically contradict that:
- Quantize KV cache
- Data-aware model quantization, where their own evals show "equivalent perf" but overall model quality suffers.
The simple fact is that deploying physical compute takes time, yet somehow they are able to serve more and more inference from a slowly growing pool of hardware. Something has to give...
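To make the KV-cache point concrete, here's a minimal numpy sketch of per-tensor symmetric int8 quantization. The tensor and all names are invented for illustration; it just shows the kind of small per-element error that coarse evals can miss:

```python
import numpy as np

# Illustrative only: per-tensor symmetric int8 quantization, the kind of
# change that shaves memory/bandwidth while looking "equivalent" on coarse
# evals. The tensor here is a random stand-in, not anyone's real KV cache.
rng = np.random.default_rng(0)
kv = rng.standard_normal((4, 64)).astype(np.float32)

scale = np.abs(kv).max() / 127.0
kv_int8 = np.clip(np.round(kv / scale), -127, 127).astype(np.int8)
kv_restored = kv_int8.astype(np.float32) * scale

print("max abs error: ", np.abs(kv - kv_restored).max())
print("mean abs error:", np.abs(kv - kv_restored).mean())
# Tiny per-element error, but over a long autoregressive rollout the errors
# compound, which is how quality can drop without a benchmark catching it.
```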
- They're reporting that it only impacted Haiku 3.5 and Sonnet 4. I used neither model during the time period I'm concerned with.
- It took them a month to publicly acknowledge that issue, so now we lack confidence there isn't another underlying issue going undetected (or undisclosed, less charitably) that affects Opus.
> We are continuing to monitor for any ongoing quality issues, including reports of degradation for Claude Opus 4.1.
I take that as acknowledgment that there might be an issue with Opus 4.1 (granted, still undetected), but not undisclosed, and they're actively looking for it? I'd not jump to "they must be hiding things" yet. They're building, deploying, and scaling their service at an incredible pace; like all of us, they're bound to get some things wrong.
To be clear, I'm not one of the people suggesting they're doing something nefarious. As I said elsewhere, I don't know what my expectations are of them at this point. I'd like early disclosure of known performance drops, I guess. But from a business POV, I understand why they're not going to be updating a status page to say "things are worsening but we're not exactly sure why".
I'm also a realist, though, and have built a career on building and operating large systems. There's obviously capability to dynamically shed load built into the system somewhere; there's just no other responsible way to engineer it. I'd prefer they slowed response times rather than harmed response quality, personally.
Unfortunately, California is a terrible benchmark. It is as close to ideal for solar as it gets. Most places are not going to see this kind of performance.
It's the same kind of thing we see with self-driving cars. They can navigate sunny California streets, so "self driving" must be so close! But put them anywhere with snow, rain, fog, or even just grey skies and they struggle heavily.
California represents the easy 80% side of the Pareto curve for a lot of this stuff.
Respectfully disagree -- solar isn't the big story here.
One could argue that batteries will have a bigger impact than solar. Batteries obviously let you decouple power generation and consumption, shifting anytime production to peak-time demand.
Less obvious is that local demand can fluctuate 2x. It usually dips mid-day and peaks 5-9pm (see the charts at www.caiso.com) when people come home and turn on their lights, oven, appliances, etc. This pattern happens throughout the year.
So forget solar for a moment; the ability to shift energy that was produced mid-day (even by a natural gas plant) to the evening would allow you to build fewer power plants. Nuclear + batteries might also be a good combination. Batteries get you closer to being able to solve for "average demand" rather than "peak demand."
This has nothing to do w/ California. California is just on the leading edge of battery installation. Solar just exacerbates the peak-to-trough ratio (evening vs. mid-day demand), because mid-day solar "overproduction" makes it uneconomical to run gas plants mid-day. But solving for "peak demand" is still a problem in the absence of solar.
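A toy sketch of the peak-vs-average point; the demand numbers are invented, with a shape (mid-day dip, 5-9pm peak) loosely mimicking the daily charts on caiso.com:

```python
# Toy illustration of "solve for average demand rather than peak demand".
# Demand numbers are invented; the shape (mid-day dip, 5-9pm peak) loosely
# mimics the daily charts on caiso.com.
demand_gw = [22, 20, 19, 21, 26, 31, 34, 33, 28, 24]  # sampled across a day

peak = max(demand_gw)
avg = sum(demand_gw) / len(demand_gw)

print(f"plants sized for peak:    {peak} GW")
print(f"plants sized for average: {avg:.1f} GW (with ideal storage shifting the surplus)")
# The gap between those two numbers is generation capacity you never build.
```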
Still: most of the complaints about solar are answered when paired with large battery systems.
The mid-day peak of solar can be almost entirely mitigated by adding bifacial vertical east-west mounted solar panels.
It works really well: The vertical mounting means the panels stay cooler and are thus more efficient. The east-west mounting also does that, but additionally shifts the peaks of production to close to sundown and sunrise. The only real downside is that you are using (somewhat) more panel surface. And of course you're still not generating more electricity after sundown or on cloudy days, so it is not a panacea.
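Here's a deliberately crude clear-sky toy model (sun azimuth only, no sun elevation, no diffuse light, all numbers invented) that shows the shape of the shift:

```python
import math

# Crude clear-sky toy: the sun sweeps 180 degrees east-to-west over 12 hours,
# output ~ cos(angle between sun and panel normal) in the horizontal plane.
# Purely to show the shape of the curves, not real yields.
def output(sun_az_deg, panel_az_deg):
    ang = math.radians(sun_az_deg - panel_az_deg)
    return max(0.0, math.cos(ang))

for hour in range(6, 19):
    sun_az = (hour - 6) * 15 - 90          # -90 = due east at 06:00, +90 = west at 18:00
    south = output(sun_az, 0)              # conventional south-facing panel
    ew = output(sun_az, -90) + output(sun_az, 90)  # vertical bifacial, east+west faces
    print(f"{hour:02d}:00  south={south:.2f}  east-west bifacial={ew:.2f}")
```

The south-facing panel peaks at noon; the east-west bifacial pair peaks near sunrise and sundown, which is exactly the claimed effect.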
> I wonder how the equivalent to the "station wagon full of tapes hurtling down the highway" looks for batteries.
Looks terrible.
I think the best are only about 500 Wh/kg, so a shipping container full of them (roughly 28,870 kg of payload) would be 28,870 kg × 500 Wh/kg ≈ 14.44 MWh. If it takes an hour to reach the destination, that's about the capacity of one not particularly thick cable.
Cables are great for electricity. If the (aluminium) cable had a cross section of one square meter, you could get all the way around the planet and back to where you started with just 1Ω of DC electrical resistance.
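Quick back-of-envelope check on both of those numbers, assuming ~28,870 kg of usable container payload and aluminium resistivity of ~2.65e-8 Ω·m:

```python
# Back-of-envelope check on both claims. Assumptions: ~28,870 kg of usable
# container payload, 500 Wh/kg cells, a one-hour trip; aluminium resistivity
# ~2.65e-8 ohm*m, Earth's circumference ~40,075 km.
payload_kg = 28_870
wh_per_kg = 500
trip_hours = 1.0

energy_mwh = payload_kg * wh_per_kg / 1e6
print(f"container of cells: {energy_mwh:.2f} MWh -> {energy_mwh / trip_hours:.1f} MW effective")

rho_al = 2.65e-8                  # ohm * m
loop_m = 40_075_000               # once around the planet
area_m2 = 1.0                     # 1 m^2 cross-section
print(f"loop resistance: {rho_al * loop_m / area_m2:.2f} ohms")  # ~1 ohm, as claimed
```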
I seem to recall that one of the main reasons to doubt batteries was that they relied on minerals that are just too rare on Earth to consider scaling them up to the entire grid.
Was my understanding incorrect? Or perhaps have new technologies emerged that work around this limitation?
The linked article addresses that. Modern batteries are lithium iron phosphate (LFP), without the rarer cobalt and nickel. Sodium batteries are also in development, but lithium is turning out to be so cheap and abundant that investment in sodium batteries isn’t economical.
Grid storage batteries don't have the same weight constraints as vehicle batteries, which opens the door to many other chemistries that have lower energy density but are cheaper per GWh stored, despite weighing more.
Sodium-ion batteries have extremely good performance in low-temperature environments. CATL is working on sodium-LFP dual-power batteries to get the best of both worlds.
That was never the case. Some people looked at "current reserve" amounts for lithium or other minerals and assumed that we had already discovered all usable deposits. That was a very incorrect assumption.
Germany is famous for being cloudy and is much further north, much of it is north of the primary US-Canada border. It is one of the leading solar adopters.
Not being ideal for solar just means you need to install more area, and there's plenty of available space. Solar is already the cheapest power source (or at least competitive with the cheapest wind power). Also, having to, say, double the panel area to compensate for lower solar irradiance requires less than double the non-panel costs (you don't need double the inverters or power transmission).
California is leading because its politics, economy, and irradiance are the best combination; you would expect a place like that to lead first. It does not follow that other places are unsuitable for solar, it will just cost marginally more.
It's a strange but persistent pattern where success in ideal conditions will draw out a litany of reasons why that success is actually a sign of failure when instead the early success is just a sign of ideal conditions. Why wouldn't something promising succeed first in the place with the best conditions for success?
> Germany is famous for being cloudy and is much further north, much of it is north of the primary US-Canada border. It is one of the leading solar adopters
Looks like it's about 12 cents (in USD) more expensive on average per kWh in Germany than Cali
So maybe Germany isn't exactly a great example of energy production if it's that much more expensive
>Looks like it's about 12 cents (in USD) more expensive on average per kWh in Germany than Cali
What is? Consumer energy prices?
Apples-to-apples consumer energy price comparisons between the US and EU are difficult. Germany has historically had particularly high energy prices for various reasons, and taxation is significantly different. They were also a heavy importer of Russian gas and foolishly dismantled their entire nuclear generation capacity, so there are other reasons why their energy prices are high. The price spikes caused by the Ukraine/Russia war have since come down.
That’s both moving the goalposts and also leaving out the true cost comparison. Solving climate change should be worth a lot more than $0.12/kWh and viability in countries like Germany suggests that fossil fuel advocates saying that it would be economically ruinous are not making a good-faith argument.
Solar is still cheaper than anything else in Germany; Deutsche Bank thinks so [1]
There's no "moving goalposts" possible. Without any morality, preventing future costs, or any such way of adjusting the "real" costs... solar is just cheaper, and by a large margin which is only expected to grow. Yes you need storage and methods to move power east and west, solar is still cheaper. The greediest, most short sighted, heartless monster would still invest in solar before anything else because it makes them the most money.
Good point. I’m not familiar with the German power market but the amount in question is small enough that I suspect our descendants will consider it trivial compared to the cost of dealing with a broken climate.
I think it has a place in the portfolio but one of the reasons why fossil fuel-backed think tanks like to push nuclear or hydrogen as the green option is that it means years of continued growth in oil consumption before anything competing comes online, whereas solar cuts into their profits within weeks. Deploying solar as much as possible buys the decades we need to get much new nuclear power online and in many cases it doesn’t require much more than buying commodity products.
Germany also generates very little electricity from solar in the winter, uses quite a bit of coal, and imports multiple TWh from its neighbors in the worst months.
So yeah, sure, it can pretend to be ahead on green tech, as long as France invests in nuclear and sells it cheap electricity in the winter, when it is most needed and most expensive.
I am so tired of the bullshit about Germany's "success". I think they should be disconnected from the EU grid so they can walk the walk before talking the talk.
> Unfortunately, California is a terrible benchmark. It is as close to ideal for solar as it gets. Most places are not going to see this kind of performance.
We can also build power lines! Between different places! Such as the places with lots of sun, and the places without lots of sun!!!
Solar has gotten so cheap that it's cheaper to just overbuild solar in cloudy places than to build long distance HVDC lines between sunny and cloudy locations.
The great thing about solar is that you can put it nearly everywhere. Avoid the transmission losses and put the panels close to the load, even if it’s not as ideal as one state over.
It’s easy to put extra solar panels into a system to make up for reduced average sunlight. It’s standard practice to have a ratio of PV capacity to inverter capacity of something like 1.2:1. In a low sun location you could bump that up to 1.5:1 or higher.
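A small sketch of why the overbuild wastes so little. The production profile is invented and normalised so 1.0 = panel nameplate; real arrays rarely touch nameplate at all, which is the whole point:

```python
# Invented clear-day production profile, normalised so 1.0 = panel nameplate.
profile = [0.0, 0.1, 0.3, 0.55, 0.75, 0.85, 0.8, 0.6, 0.35, 0.1, 0.0]

def delivered(dc_ac_ratio):
    # Fixed 1.0-unit inverter; the array is scaled up by the ratio and
    # anything above the inverter's capacity is clipped.
    return sum(min(p * dc_ac_ratio, 1.0) for p in profile)

for ratio in (1.0, 1.2, 1.5):
    raw = sum(p * ratio for p in profile)
    got = delivered(ratio)
    print(f"DC/AC {ratio}: energy {got:.2f}, clipped {100 * (raw - got) / raw:.1f}%")
```

With these made-up numbers, a 1.2:1 ratio clips well under 1% of the energy, and even 1.5:1 clips under 10%, while delivering far more total energy per inverter.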
Transmitting power over long distances consumes a lot of it. From what I understand, it's basically always preferable to generate power as close to where you're using it as possible.
Absolutely, but besides rooftop solar we don't have a single power source currently that does that.
Traditional power generation was always centralized in big plants. Most people wouldn't want to live next to one, and from a health perspective you probably shouldn't.
Also small-scale fossil fuel, hydropower, wind, geothermal, etc. What solar does differently from those is that it's directly usable and doesn't have the significant negatives that made generation undesirable near non-industrial users: it doesn't make noise, air, or water pollution.
They’re a fire risk but have you tried fossil fuels and power lines? The newer battery technologies have significantly minimized the heavy metal needs and especially the sodium-ion batteries really reduce the fire and explosion risk.
People use gas generators, though, and you don't have to go back further than my grandfather's lifetime to find factories and other industrial buildings with oil/coal plants on premises or directly powering machinery. Pollution and logistics pushed that away, but we didn't start with a pervasive power grid.
Well, if you go back far enough there were a ton of factories and other buildings which had their own small plants but the main thing I was thinking of was stuff like that xAI data center in Memphis with the methane-powered turbines where they’re avoiding grid limits and transmission losses at the expense of pollution.
My point is that a nuclear power station near a city is probably better than a wind farm offshore 1000 km away, even if the wind farm and the nuclear plant generate the same amount
Nuclear has significant downsides besides the waste and proliferation risks.
You wouldn't want to build a grid on just nuclear.
Let's assume the city draws 1.2 MW at its daily peak. That means you need to be able to supply that, so you build a nuclear plant that can produce 1.2 MW.
Now you have the argument against renewables (the sun doesn't always shine) in reverse.
The city doesn't always need that peak power. And nuclear is the slowest of all power sources to ramp up and down.
Nuclear for base load makes a lot of sense, as it'll always be fully utilized. But a grid powered 100% by nuclear doesn't exist anywhere, for a reason.
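Toy numbers for that argument, with an invented daily demand curve peaking in the evening:

```python
# Invented daily demand curve (arbitrary units), peaking in the evening.
demand = [0.7, 0.65, 0.6, 0.7, 0.85, 1.1, 1.2, 1.15, 0.9, 0.75]

plant = max(demand)                              # nuclear sized for the peak
utilisation = sum(demand) / (plant * len(demand))
print(f"peak-sized plant average utilisation: {utilisation:.0%}")

base = min(demand)                               # nuclear covering base load only
print(f"base load share of total energy: {base * len(demand) / sum(demand):.0%}")
```

A peak-sized plant sits idle a quarter of the time in this sketch; size it for base load instead and something else (storage, peakers) has to cover the rest.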
That wind farm needs to be like 4% bigger to cancel out transmission losses so the question has to involve the relative costs.
The EIA has it at $81/MWh for advanced nuclear, which is competitive with offshore wind ($88) but not hydro ($58), PV+battery ($53), PV ($31), or onshore wind ($29). Now, both of those could see big improvements with scale, but I think the uncertainty of the markets and our political climate will complicate that a lot. A big nuclear push needs a lot of upfront funding, and while Trump has boosted it a bit and hates wind, I'm not sure how much that counts on a loan that big, since right now that plant is guaranteed to lose money most of the time unless it's near a big industrial user with high baseline demand.
$20 billion is a rough guide; apparently that estimate could be off by as much as 100%.
It also depends on lots of variables; for a large power line, the towers can cost $1M each. I also assume the initial cost does not include maintenance costs.
Maintenance costs exist, but they can also extend the lifespan well past 50 years. So it’s all a big cost vs benefit tradeoff where maintenance occurs because it’s a net gain.
The point is the upfront costs aren’t actually that significant.
There are several 50-year-old solar installations that are still working fine. They've only degraded 10-20% or so. The reason they are typically replaced after ~20 years is that the technology has gotten so much better that you can replace the old panels with new ones and get 4x the power in the same space. The wires and the mounts are worth more than the panels, so reuse those with modern panels.
Considering they work by basically taking electrons from the material, they are guaranteed to get much less efficient, and I doubt the effect is linear.
I think once the first layers have degraded, it becomes much more complicated to get meaningful power, depending on various factors.
Solar panel talk always focuses on ideal conditions like California, but you have to account for the fact that much of the energy transition has to happen in places where solar panels are not that efficient to begin with.
Perth and that part of WA have their own generators and an independent electrical network. Darwin in the NT is probably in the same situation. Large expanses of desert and very sparse settlement separate them from the east coast.
You need to persuade the median voter about this, and looking at the state of Western politics right now, it seems to me that peak willingness to spend huge money on environmental projects is behind us.
One state supplying the rest is not what you want, especially if there's a chance that something will disturb that state's grid.
I remember the uni-days discussions about Africa supplying the rest of the world with solar energy, and how the material requirements for such an infrastructure were supposed to become feasible around 2025-2035... then someone explained climate change and hinted at the exponential function...
Back to the topic: you'd have to maintain an "inert" backup, which isn't "viable" in portfolio-communist economic terms. Or you share the load "as much as necessary", which would still become an issue if any of the suppliers drop out...
One of the projects listed is probably somewhere in my bookmarks.
And AFAIK it still is a real plan, but officially we are not where we wanted to be materials-science-wise... heat, distances, and batteries are still somewhat of an issue, meaning not efficient and cheap enough to be a no-brainer (despite already being a no-brainer, but, you know, the Wolf of Wall Street types are super cool and smart, and who wouldn't be partisan with the Buffetts et al???)
The white paper they are discussing [0] includes multiple cities around the world:
> Las Vegas can reach 97% of the way to 1 GW constant supply and Muscat in Oman – 99%, using 6 GW solar panels and 17 GWh battery. Even cloudier cities like Birmingham [UK] can get 62% of the way to a constant supply every hour of every day across the year.
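Some quick arithmetic on the quoted Las Vegas numbers, taking them at face value:

```python
# Sanity arithmetic on the quoted figures: 6 GW of panels plus a 17 GWh
# battery targeting a constant 1 GW supply.
target_gw = 1
battery_gwh = 17
panels_gw = 6

print(f"battery autonomy at full draw: {battery_gwh / target_gw} hours")
print(f"panel overbuild vs. target:    {panels_gw / target_gw}x nameplate")
# The battery bridges most of a sunless stretch, and the 6x overbuild is what
# refills it even on weak days; the missing 3% is whatever neither can cover.
```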
I'm all for not letting perfect be the enemy of good, and I used to work in solar. I'm very much in favor of it. I just really think people need to be realistic about this stuff.
I think the real way to read it is:
Renewables are at a point where, combined with battery storage, they could supply 100% if we chose to, while being competitive on cost.
The Birmingham case is a perfect example where solar might not be the main source; wind is.
What is the harm, specifically? Is it greater or less than the harm for steel? How?
I have never gotten a clear explanation of this from someone who expresses concern for batteries causing environmental damage. They aren't harmful like car batteries, for example. And the people expressing concern about batteries never seem to express concern for the far greater use of other minerals in their daily life.
> Unfortunately, California is a terrible benchmark. It is as close to ideal for solar as it gets. Most places are not going to see this kind of performance.
There are many places that get a lot of sun. As solar panels come down in price, it becomes even easier to compensate for deficits with additional panels.
It’s common practice to install more solar panel capacity than inverter capacity because panels are rarely operating at peak output anyway. If you’re installing 100kW of inverters, you might install 120kW of panels. The panel array wouldn’t exceed 100kW most of the time anyway.
In a location with suboptimal sun, you might install an even higher ratio of panels to inverter and battery capacity.
Some people get bothered by this because they feel like some of the solar power is wasted at peak capacity, but you have to consider that the inverter and battery capacity is also wasted when the panel array isn't supplying enough. It's a balancing act.
You also have to consider that the same sunlight that makes California good for solar also creates additional demand for air conditioning. A location with less sun would have less solar heat gain, which is easier to serve for many reasons.
It's also not too dissimilar from Texas, Arizona, Oregon, Nevada, and Utah - which together are a much larger part of the US - and not too far from being able to ship power not too expensively to even more areas.
Florida and Colorado are not much farther below California in total solar radiation per year per sq meter, either.
Texas consumes almost 2x as much electricity as California [1], and is almost 1/8th the entire US consumption, so population is not the most important factor.
What I am saying is "it works in California, it must be ready to roll out globally immediately" is silly
It works in California means "it is ready to roll out in areas with very optimal conditions but long-term it still likely has a lot of speedbumps so temper your timelines accordingly"
That's all. I'm just tired of people thinking timelines are short when stuff performs under super ideal conditions
How about “it works in California, it’s ready to roll out in most places between, say, 45° north or south where most of the human population lives”? It’s not universally solving every problem but pulling the gigawatts of power demand it _can_ solve out of the mix buys time for us to decarbonize harder problems.
The whitepaper says that the optimal power generation for Birmingham is 62% solar. That's about as far from ideal as you can get bar locations north of the Arctic circle, and it still says that solar should supply a small majority of the power to Birmingham UK.
"It works in California" means that it is potentially viable elsewhere. As others have pointed out: California is not the sunniest, not the simplest roads, most consistent weather, or the most favorable regulatory environment.
If it works in California then it will probably work elsewhere. AND more importantly, if it doesn't work in California then it's probably not going to work anywhere else.
==I'm just tired of people thinking timelines are short when stuff performs under super ideal conditions==
This feels like a strawman. We just started doing this in one place with ideal conditions. The next step could be other places with even more ideal conditions: New Mexico, Arizona, Wyoming, Colorado, Nevada (all have more sunshine than "ideal" California). [0]
We need to start investing immediately to begin overcoming the speed-bumps you mention. California has gotten to this point in about 5 years. Spend the next 5 years on the states I mentioned above. Then move on to Florida, Georgia, Virginia, Utah, South Carolina, and Kansas.
Solar, self-driving, tax policy, software engineering, cryptocurrency, the list goes on and on. You can have 100,000 successes and 1 failure and someone will say "It just doesn't work!".
For some definition of "long distances" sure. At a certain point it basically becomes more economical to charge batteries and ship them by truck than to build wires :/
Western Washington is a great contrast. We get a decent amount of sun (despite the reputation); however, our electricity prices are insanely low due to close-proximity hydroelectric power.
As a result, solar is rarely cost effective even with subsidies, and basically never without them.
Doesn't mean people don't install it for various other reasons, but it serves as a good contrast to California despite similar political landscapes.
Which is perfectly fine? You're just using what's abundant to you?
And even better, hydro has the ability to control how much it generates. You have a surplus? Let less water flow through the turbines. So it can regulate, something solar can't do, it needs batteries to do that.
The one big upside that I haven't seen mentioned is that rooftop solar is local. So what I overproduce doesn't go on the big grid, it's probably consumed by my neighbor or someone in my street.
All those big power plants, and big consumers of electricity (switching from their current source), will lead to grid congestion, where you need to decide whether to increase grid capacity... which is slow and $$$.
If I go solar, which seems likely, I will definitely not be sending it back to the grid since I don't think that's something that scales well (8-bit guy talks about it on YouTube and it's persuasive).
I'll be using a local / non-grid-tied system with Ecoflow batteries and a smart panel or transfer switch of some sort. IMO this is the best way to go when solar is non-competitive in terms of selling electricity back - also significantly more robust against disasters, since grid-tied systems do not work when the grid is down (something consumers tend to gloss over or be ignorant of) due to safety regulations.
> But put them anywhere with snow, rain, fog, or even just grey skies and they struggle heavily
I have winter basically 5 months of the year where I am and have no issues being fully off grid with only solar and batteries as energy sources. You do have to compensate for winter by having more panels and more batteries but easily doable.
I think it has been a good benchmark during early development, but you’re right that it becomes less useful now that solar is further along. Maybe some Midwestern place would be a good benchmark now? Or England?
Houses don't tend to have consistent electricity usage. Think of the electricity usage of a suburb. Chances are it's almost zero for most of the day. People get home and turn on the stove, the washer, the TV, etc. Big demand spikes at certain times of day, low demand the rest.
Commercial usage goes the opposite direction. Higher during the day when stores and offices are open, lower overnight
Industrial usage tends to be more consistent and constant: factories don't shut down overnight, and they have high constant requirements.
But as a power company you are supplying all of this. You have to supply whatever combined load all of these use cases add up to. It's all one grid, after all, outside of some very unusual cases.
And if you oversupply too much you break the grid. If you undersupply then stuff that relies on you shuts down and breaks
It's very much a goldilocks problem, which is non-trivial to optimize or automate
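A rough sketch of that balancing problem, with invented hourly profiles for the three sectors:

```python
# Invented hourly profiles (one value per hour from midnight), just to show
# the shape of the combined curve a utility has to match.
residential = [2,1,1,1,1,2,4,5,3,2,2,2,2,2,2,3,5,8,9,8,6,5,4,3]
commercial  = [1,1,1,1,1,2,3,5,7,8,8,8,8,8,8,7,6,4,3,2,2,1,1,1]
industrial  = [6] * 24                      # flat, round-the-clock demand

total = [r + c + i for r, c, i in zip(residential, commercial, industrial)]
print("peak:", max(total), "trough:", min(total),
      "peak/trough:", round(max(total) / min(total), 2))
# Oversupply and undersupply are both failure modes, and this combined curve
# moves hour by hour; that's the goldilocks problem.
```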
Not to mention that lumping batteries, i.e., storage, in with generation is not exactly accurate, or even honest. It is merely offsetting, a compensation for the deficiencies of "renewable" energy, which are always papered over by that kind of intentional muddling of classifications.
That is much higher than I would have expected. Good news. What are they using to charge the batteries though? I wonder if it's offpeak renewables or mostly natural gas?
Daytime solar, in the summer especially. Net power demand was negative from 11:30 to 3:30 yesterday, for instance, meaning batteries can charge essentially for free by absorbing the excess solar generation during those hours.
RE "...All of the peak demand supplied from batteries...."
This is a very good start.
My question is how the batteries hold up if there are 1-N days of cloudy weather. Can the batteries supply the peak for more than one day?
Are there (or where would they go) transmission lines to bring in the needed peak supply from other storage (battery / pumped hydro) areas?
Another step would be to include days of peak electricity demand, which in my country occur in cold weather (heaters) and very hot weather (air conditioners). Or is there demand limiting, like in Spain, where the demand limit for some houses is around 3 kW (yuck)?
Ah, that old "pick a specific timeframe on a specific day, preferably summer, to get a convenient picture" trick. Incompleteness by design.
Supply by itself is an inadequate metric, yet a convenient one for obscuring the view of CA, that beacon of the future, suddenly littered with third-world brownouts and blackouts.
“major studios do use some cloud services” is not the issue. The problem being addressed is that the “script to screen” process is typically an antiquated mishmash of offline-first vendors. Netflix reinvented that with a cloud-first process.
“Digital-first production” can mean lots of things, so when you say Disney, Paramount, etc did this a decade ago, you’ll have to be more specific. Do you mean an end to end digital process? That’s not what this is about. Have you worked on a Netflix production? It’s night and day different from the studios you mentioned.
I was in ATC training in the 90s, and this was discussed among teachers and ATC personnel. The common saying was that pilots would disappear from cockpits before ATC personnel were removed, at least from tower control.

There are typically three kinds of ATC: tower control, approach/departure control, and area control for planes at cruise. I haven't followed this in years, but my impression is that better monitoring equipment allows fewer area controllers to control bigger areas. I believe area control is the most likely to get automated, but this is quite a guess. Approach control is about using radar (or no radar; procedural approach control is a thing) to line up planes to land on a runway. Planes are handed over from approach to tower control when on final approach. There is also ground control for taxiing at larger airports.

Last but not least: do not underestimate the value of trained personnel using radio to great effect. Any belief that modern touch gadgets are better than radio is silly. Humans are also very capable of speaking while performing advanced tasks.
But forget the focus on automating air traffic control: datalink, complex ground IT, remote control. That is way too costly and difficult to do in the context of a collection of decentralized legacy systems.
Instead, most people are trying to get rid of paper strips (notes used by ATC) and sell complex systems that try to automate conflict management.
The hard thing is to improve the UX. The controller has to communicate with humans (hard even with the highly codified language used) and does NOT want to be solving technical issues; the system has to indicate potential conflicts well in advance but not nag at a bad time. There are a lot of human factors to take into consideration, and a system well designed with the air traffic controller at the center of it could help a lot.
It's been reported that the helicopter's altitude was displayed as hundreds of feet off. It's unlikely that was just an issue at the specific tower where the crash occurred. If they can't even get accurate altitude data, there's no way they'll be able to automate.
I don't know much about ATC, but it looks like a field ripe for disruption and innovation. AI should be able to handle the coordination of flights without the downsides of the delays and limitations of the human training pipeline, worker fatigue, and stress, all for less expense. The more I think about it, the more I feel like I could have something tangible at the end of a weekend or two, at least a prototype.
I sincerely hope this is satire (it sure is very HN in nature). "AI" in its current generative incarnation is prone to hallucinations/confabulations that cannot be avoided. In what world is that compatible with a job where a mistake can kill hundreds of people a few minutes or seconds later?
one that you would trust the lives of thousands of humans to every day? It seems unlikely we are anywhere close to a point where we can ensure that any AI won't hallucinate and cause an issue.
Sure, AI can spit out nonsense, and that's a real concern. But in engineering, we deal with imperfect tradeoffs every day; it's baked into the job. If we insist on a flawless solution before shipping anything, we'll never ship. There's always an optimum where we uphold safety standards without sacrificing forward progress.
Yes, but we need to hold different bars depending on the consequences of failure. An AI in a game that randomly spins around and breaks immersion for a player is fine; an AI that crashes a plane is not fine.
I have zero issue with AI research in those areas, but I think it does a major disservice to claim that a weekend is all it takes to get something close to ready for life-or-death decisions.
Could Congress support AI research and innovation by asking AI company CEOs found guilty of overpromising to prove the reliability of their latest technology by flying in AI-controlled airplanes and relying on AI-managed air traffic, instead of using private jets with human pilots and air traffic controllers? /s