> It ain't pretty, but it works and you've already paid for it.
I've been following the public weather data scene for a long time and I'm pretty sure that the UX for all the NOAA/NWS websites is horrible by design, so to speak.
After all, there are plenty of well designed .gov sites (e.g. https://recreation.gov/), but just try browsing around weather.gov for a few minutes. It's horrible. I mean, the data is top-notch and it technically works, but the experience is awful. Weather.gov has had the exact same site for at least a decade.
This is completely speculative, but I would not be surprised to find a link between campaign contributions from the likes of The Weather Channel and the decision to completely underfund the web development teams at NWS. Nothing else explains why some of the United States' most valuable public data is presented on a website barely more functional than Craigslist.
They lobbied to prevent open access to taxpayer-funded data. Intentionally underfunding NOAA so it can't build a modern site is the compromise.
There may be a level of lobbying involved, but I would also say websites like recreation.gov are consumer-facing, while the weather data is used by actual researchers and practitioners. Layouts become a lot stickier and more efficient when people like that are your target audience.
An interesting story is when Trump nominated the CEO of Accuweather to be the head of NOAA. This is someone who would have benefited from keeping NOAA and the National Weather Service from doing their jobs. Typically the head of NOAA is a scientist, not a businessman with obvious conflicts of interest. Fortunately, it didn't stick. https://www.politico.com/story/2017/10/12/trump-noaa-chief-a...
The entire point of Dark Sky (formerly forecast.io) was that, unlike most weather apps, it isn't just a prettified version of publicly available weather data. Its selling point was that it also used crowdsourced data, including user reports of rain and, IIRC, data from sensors in app users' phones, to predict "hyperlocal" weather.
I use Dark Sky and it is typically much, much more accurate than other weather apps when it comes to a heads-up of inclement weather coming in the next ~hour, and I'm very sad to see Apple killing it.
On the other hand, if they integrate it into iOS's default weather app, that would open up a whole trove of walking sensors. And potentially make it the most accurate forecasting service by far.
But closed off and only accessible if you buy Apple devices. So this is yet another example of a private corporation taking a useful, accessible resource that people rely on, and gating it behind their own services so they can extract rent. God I love capitalism!
They at least provided an API and a web interface, one that you could use to get forecasts on any device, and the data could be presented in any format. They're closing the website and the API, so you'll only get the hyperlocal forecast from an Apple device, and only through Apple's UI choices. Closing off services behind a walled garden and charging an entrance fee is Apple's business model.
My primary use of the Dark Sky API is for minute-by-minute forecasts. I've yet to find another API that can answer the question "Do I have enough time to walk my dog before it starts raining?"
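The nice thing about per-minute data is that this question reduces to a few lines of code. A sketch against the shape of Dark Sky's `minutely` block (field names per their API docs; the probability threshold here is my own arbitrary choice):

```python
def minutes_until_rain(minutely_data, prob_threshold=0.5):
    """Return the number of minutes until precipProbability first crosses
    the threshold, or None if the next hour looks dry.

    `minutely_data` is the `minutely["data"]` list from a Dark Sky
    forecast response: one dict per minute for the coming hour.
    """
    for i, minute in enumerate(minutely_data):
        if minute.get("precipProbability", 0.0) >= prob_threshold:
            return i
    return None

# A 20-minute dog walk is safe if the result is None or greater than 20.
```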
This is exactly what I found it didn't work for. I live in Seattle, and basically gave up on it (with today's news, I'll uninstall it from my Android phone). I could look out the window at steady rain, and Dark Sky would still be telling me it was going to be clear all day. It was like that more often than not.
That's bizarre but actually can't be Dark Sky's fault.
Switch to radar view in the app and it will show you the exact geographical distribution of rain in your city right now. This comes from national weather data.
If that's inaccurate then something is wrong with the government data. Or maybe Seattle rain is somehow much patchier at a finer level than government data is able to resolve?
FWIW in New York City its assessment of the next five minutes has always been literally perfect. It might say it's going to go from hard rain to nothing in two minutes... and it does, right on time.
It gets increasingly noisy as you go to 15, 30, and 60 min out, though.
Radar data is hugely noisy and it can be difficult to suss out what is cloud cover with no precipitation, or what is precipitation, plus filtering out biomass and dust. All of these show up as radar returns. There can be shadowing. Dark Sky does its own pre-processing/denoising of radar and so that can lead to different results from NOAA.
Ah that's really interesting, thanks for the info.
I'm curious then what users see in Seattle when it's raining out the window but the Dark Sky radar maps says it isn't.
Are there swiss-cheese holes that are obvious processing artifacts? Is the whole city somehow below some threshold so it doesn't show rain anywhere? Are there patterns of extreme local variation in rain that get mistaken for noise and deleted?
The solution is not necessarily simple, but identifying the culprit should be fairly obvious, no?
For one thing, the resolution of radar is finite where one "pixel" (gate) is quite large, and everything under it is assumed to be constant. Precipitation at a spot on the ground is the result of everything happening from sea level up 40,000', and we don't have exact properties (temperature, dew point, humidity) for the atmosphere at all of those locations. We can fill in some gaps with models on supercomputers, but they're just models.
This was my experience too. Perfect in Boston, recently great in Singapore, pretty much useless in Melbourne.
I’d like to know more about their modeling. How much do they need to specialize per region for accurate prediction or data collection? Do the varying prediction qualities stem from different climates, or from bias in data collection?
I live in Seattle now but used to live in Ohio. I agree that the minute by minute is pretty much useless in Seattle but it was incredibly accurate in Ohio. I have numerous stories where I was able to get to shelter or remain sheltered minutes before a sprinkle turned into a downpour.
I know nothing about weather, but I have to assume the weather patterns in Seattle are just very difficult to predict or analyze at such a fine granularity.
My go-to for this is an app called Radar Scope [0], which is the best $10 I've ever spent on an app. It's very easy to pull it up and look at the radar products with a useful color scale (as opposed to others, which make light showers look like the apocalypse). Based on the quality, I assume its target market is stormchasers and weather professionals.
It's fast and has most of the radar products, including velocities. I originally bought it to answer the question "Should I shelter from this tornado?"
The downside is that there is no forecasting, but that's freely available through the NWS [1].
Same here, for the website. I actually find their forecasts beyond a few hours pretty bad¹, so I use weather.gov, but I haven't found anything as good at telling me whether it will rain in the next 20 minutes.
1: They both seem wrong more often and, more importantly, it's unclear whether they mean a greater or lesser chance or amount of rain or snow, which makes a big difference when you are, say, biking to work.
I was about to ask if anyone knows a decent API to build my own product off of, and I found out weather.gov has an API available without requiring a key, for now; they just ask that you include contact information in your user agent so they don't accidentally block you.
My team pulls a lot of weather data for the utility engineering work we do, and the NOAA APIs have proven very hard to navigate. Some data is only available via FTP, and there are lots of different weather station networks with different metrics (e.g. lots of variety in the cloud/irradiance data).
Darksky provided a nice clean interface, but I guess we'll go back to cobbling things together directly from NOAA.
While I agree that the NOAA data is hard to work with, these more 'polished' api providers come with their own set of risks. For example, Dark Sky is forecasting a snow storm in my Bay Area coastal town[1]. That seems unlikely. Also, there is another paid service that I am familiar with that had a nice clean API with consistently formatted data, but it turned out their data was wildly inaccurate.
Any clue how hard it would be to create an open-source alternative to the Dark Sky API? My company needs one, too. Maybe we could join forces and organize an open-source alternative. I know it won't be as good, but I'm curious what other people in our position think...
Hi, I make the toolkit that Dark Sky (and others) use for their visuals. Been in the weather/aviation mobile space a long time.
I'd floated a Weather-Service-In-A-Box to my clients a few months ago, but got no takers. The idea was something you could set up on a small number of instances fairly quickly and leave running to harvest NOAA/ECMWF/etc data. Fairly simple front end to answer queries. Data tiles for the visuals. That sort of thing.
If people (with money) are interested now, hit me up.
I process weather simulation data from around the world (22 data sources). My servers process around 400-500 GB of data a day and I store about 120 GB which gives me about 5 days of historic simulation data and up to 16 days forecast data.
Storage and processing are relatively easy, since it's just a matter of throwing resources at it. However, it'll be damn expensive to store months or years of historic simulation data.
The difficult part is writing and maintaining the processing scripts. The different weather services store data in different formats, have different ways to download the data and sometimes change data structures or fall over.
If you wouldn't mind: how do you go about creating a simulation around the empirical data? This seems to be very core to how Dark Sky is able to provide such accurate realtime data for geographies in between weather stations.
I think it will be pretty hard to do well in open source though. Server costs and keeping things working when NOAA (etc.) sources change is probably going to be expensive.
There's no reason a running service couldn't also opensource its back end code. It does provide an avenue for people to tinker, potentially improve, and self-host if they have the resources available or the service itself goes under.
There are also a lot of open projects that distribute the work of handling tweaks to parsers and source lists when the upstreams drift.
Even globally I can't imagine the minutely data from reporting stations everywhere being "big data", maybe in the 10s of gigabytes a day (very rough guess, I haven't looked into this specifically)? I'd still be willing to bet API / web requests dwarf the processing and bandwidth requirements of the raw data for a public service like this.
That probably does put it pretty reasonably in the realm of self-hosting if you put a threshold on how much historical data you want to keep and geofence the region you care about.
If it’s open source there’s no specific requirement for the project to provide a hosted instance.
Develop the (ideally extensible) logic to parse a given government-funded weather source into a common format, and have “API consumers” run their own instances of the services (or set up instances for them for $ and use the profits to help with project costs).
Not disagreeing, just extending. Dark Sky merges multiple weather data sources, even in a single country, and I use it for a private app that deals with weather all around the world. An open source project could just be code, not a service, but that code should have a config where you specified which sources to fetch and how often instead of choosing "a given govt-funded weather source".
Of course, if it's designed to be extensible then definitely non-public sources could be added, but whether they'd be supplied as included modules/plugins or need to be developed (and then optionally contributed upstream) by users of a given non-public service is a question of practicality and sometimes cost (if it's a paid service an OSS project may not be able to justify the cost of a licence/account for the service to develop/test against).
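To make the shape of that concrete, here's a toy sketch of a config-driven registry of per-source parsers normalizing into one common record. Everything here is hypothetical: the source names, payload layouts, and the `Observation` schema are illustrative, not any real service's format:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Observation:
    """Hypothetical common format every source is normalized into."""
    station: str
    temp_c: float

# Each source module registers a parser from its native payload shape.
PARSERS: Dict[str, Callable[[dict], List[Observation]]] = {}

def register(source: str):
    def wrap(fn):
        PARSERS[source] = fn
        return fn
    return wrap

@register("source_a")
def parse_source_a(payload: dict) -> List[Observation]:
    # Pretend this source reports temperatures in Fahrenheit.
    return [Observation(s["id"], (s["temp_f"] - 32) / 1.8)
            for s in payload["stations"]]

@register("source_b")
def parse_source_b(payload: dict) -> List[Observation]:
    # Pretend this source is already in Celsius with a different layout.
    return [Observation(s["name"], s["temp_c"]) for s in payload["records"]]

def ingest(config: List[str], payloads: Dict[str, dict]) -> List[Observation]:
    """Run only the sources the user's config enables."""
    out: List[Observation] = []
    for source in config:
        out.extend(PARSERS[source](payloads[source]))
    return out
```

The fetch schedule and any paid/private sources would hang off the same registry, which is what keeps licensing costs out of the core project.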
I develop Flowx, an Android weather app, and I agree, it is difficult to pull data from NOAA and other weather services around the world.
Depending on your needs, I may have data you can use. I only process forecast weather simulation data - not weather stations. I'm a solo dev so I can't offer high levels of service, etc... but feel free to contact me.
More importantly, this is something citizens should protect and demand be expanded. No one can acquire and shut down the National Weather Service, nor any apps it develops with public money and releases to the public domain. Protect public goods.
Dark Sky is not a public good; it's a for-profit business that puts a shiny interface on the same information available to everyone. It also added user reporting and analytics, so it added material value. The app cost money. Why don't you lobby your congressman to have NOAA develop a better app instead? You can't make a privately developed program a "public good"; citizens can't just "demand" that Dark Sky remain independent and expand, because they didn't pay for it. Either lobby the government for a better app, wait for the private sector to develop a better app, or lobby the government to attach a license to the data prohibiting commercial use (though this would kill the other apps, too).
You might re-read my comment. Nowhere do I say Dark Sky is a public good. I do lobby my congressperson for NOAA to have more open data practices and better funding for app development. Citizens can demand products from government be at parity or better than private corporation products, and government has the funding to do it. So why not do it?
I'll note that you edited your comment after I posted my reply, and are now making it sound like you didn't. It originally included something to the effect of, "This shouldn't be allowed to happen."
I made no such edit, and at no time made such a statement. I take no issue with Dark Sky being acquired (EDIT:) and my comment is only to vocally advocate for government-run services that cannot be acquired (to provide continuity of quality service and delivery of data products to citizens and the systems and apps they use to consume said services and data products).
These forecasts are based on the GFS, which has finite resolution (around 9 km, I think), so the "point forecasts" you get from weather.gov are limited by that.
... And here's the graph for up-to-the-minute detail: https://forecast.weather.gov/MapClick.php?lat=37.3367&lon=-1...
It ain't pretty, but it works and you've already paid for it.