misnome's comments

Even so, the borrow checker repeatedly profiles as an insignificant part of compile times, so it wouldn’t make a difference.

When does this cease to justify any possible retribution? How many murdered Palestinian children, or emergency workers, or aid workers balance this out? How much torture of prisoners?

It’s honest. Time was when every release came with an impressive number of improvements, some of them radical.

For at least a couple of years it’s been nothing but AI; I am happy to ignore updates and should probably just turn them off now.


Oh Christ, absolutely this. We spent some time evaluating FPGAs for our purposes and ended up going with GPUs instead (the algorithms we run can be adapted to the strengths of either).

The concepts are easy enough, but learning the toolsets is an exercise in frustration… the documentation/onboarding is either nonexistent or extremely unhelpful, and it is hard to get past the stage of “the entire thing doesn’t work because you misclicked a button in the GUI several hours ago”. In theory everything can be scripted, usually in TCL, but the scripting is also unstable and seems liable to break with every different version of the toolsets.

Alongside Xilinx, we also looked at Altera/Intel oneAPI/dpcpp, and it seemed promising until we realised we were encountering so many toolchain and compiler bugs that nobody else could actually have been using it, except via the oneAPI cloud platform, which seemed to have been hotpatched to fix some of the issues. In the end, after selling us some compatible cards, they dropped the OS and the card from support. I guess this taught us not to trust Intel!

We decided teaching it to juniors would be an exercise in frustration unless we were hiring explicitly for it, and went the GPU route instead.


All these use at least GPS for timing

No, they don’t. GPS is orders of magnitude less reliable than the most up-to-date metric time synchronization over fixed-topology fiber links.

I doubt that very much. GPS time integrity is a big deal in many very important applications -- not the least of which is GPS itself -- and is treated as such.

Yes, an individual fiber distribution system can be much more accurate than GNSS time, but availability is what actually matters. Five nines at USNO would get somebody fired.


I wonder why we bothered building GPS signal waveguides into the bottom of a mine, then. Clearly we should have consulted the experts of Hacker News first.

Losing NTP for a day is going to affect fuck-all.


I'm not even sure why you're trying to argue this. It's well established that Time over Fiber is 1-2 orders of magnitude more accurate and precise than GNSS time. Fiber time is also immune to many of the numerous sources of interference GNSS systems encounter, which anyone who's done serious timekeeping will be well acquainted with.

Trying to argue that neutrino experiments use GPS time? Because they do.

I’m sure synchronising all the world’s detectors over direct fiber links would… work, but they aren’t.

Unless you are trying to argue about internal synchronisation, in which case, obviously, but that has absolutely zero to do with losing NTP for a day, which was the topic of conversation.


The deployments are still obviously limited, but this is something you can straight up buy if you're near a NIST facility [0]. I believe the longest existing link is NJ<->Chicago, which is used for HFT between the exchanges.

[0] https://shop.nist.gov/ccrz__ProductDetails?sku=78200C


They seem to have gone all-in on AI, for commits and ticket management. Not interested in interacting with that.

Otherwise, the built-in admin in the single executable was nice, as was the support for tiered storage, but single-node parallel write performance was pretty unimpressive and started throwing strange errors (investigating those led to the AI ticket discovery).


Yet.

GitLab has proven in the past perfectly happy to hike prices above GitHub's, after attracting enough switchers.


Because you appear completely oblivious and deliberately naive about the entire purpose of CI.


Based on my experience I really do think most people are using it for things that they could perfectly well do locally with far less complication.

Perhaps that isn't most of its use; the big projects are really big.


Care to provide examples?

Fundamentally, yes, what you run in a CI pipeline can run locally.

That doesn't mean it should.

Because if we follow this line of thought, then datacenters are useless. Most people could perfectly well host their services locally.


> Because if we follow this line of thought, then datacenters are useless. Most people could perfectly well host their services locally.

There are rather a lot of people who do argue that? Like, I actually agree that non-local CI is useful, but this is a poor argument for it.


I'm aware of people arguing for self-hosting some services for personal use.

I'm not aware of people arguing for self-hosting team or enterprise services.


Well, they are. Selling the team or enterprise a license to do just that is a rather large part of many businesses.


They… do?


This comment would be more useful if you gave the name of the product or linked to it. I’m also not aware of this offering and wasn’t able to find information on it.


iCloud Private Relay

It's a very limited VPN as it only works for Safari/Mail and only anonymizes you to your region/country.


iCloud Private Relay (at least for Safari).


Private Relay is an Apple VPN-like service that only covers Safari on iOS. That means the SoundCloud app or desktop usage will not receive any privacy benefits.


Private Relay also works in macOS Safari.


Just tested SoundCloud as a PWA in iOS Safari with Private Relay enabled. It works fine, albeit with a few annoying popups asking to download the app.


They’re not big enough, and some sites, like the government of Delaware's, will hard-block it along with other VPNs. Bigger sites still soft-block it, like Instagram, which will randomly ban accounts using it, or Google, which throws captchas every couple of searches.


Is Sci-Hub still relevant? Haven’t they been frozen for like 5+ years at this point?


More like: "is Google still relevant?" Especially for the kind of people who browse Sci-Hub. It's been months since I've done a search on ad-ridden Google.


I know lots of people who still use Google.

Sci-Hub has ceased to be mentioned or considered when the scientists/grads I know look for papers. Everything has gone back to “Does your institution have a subscription for X?”.


There is a successor to Sci-Hub which relies on IPFS.


You mean Nexus?


Yes. It's not perfect, but it has decent coverage.


I still use Sci-Hub because the newer the article, the less I trust it.

I am not a student anymore, though.


Well, for a student or researcher, that’s completely impractical.


What country are you in, if you don't mind saying?


The research papers from 10, 20 or 50 years ago are at least as valuable as, and frequently more valuable than, the papers from this year.

A lot of "new" discoveries are rediscoveries of old things that may not have been important at the time of their initial discovery, because being useful depended on advances in other domains; once those advances happen, they suddenly become important and can form the basis of state-of-the-art techniques.

Therefore Sci-Hub remains very relevant as a repository containing a very large number of historically important research papers, including many from the 19th century or early 20th century, which should have been in the public domain but can still be found behind paywalls elsewhere.

