You don't understand where the AS/400's real power comes from. The DB is integrated directly into the filesystem; you optimize your work for how the whole system was designed and get massive benefits and robustness.

The performance difference between junior-level basic code and heavily optimized complex queries can easily be 1000x even on plain Oracle; just ask any data warehouse guy (I've seen it a few times myself, even though I don't do warehouses). In extreme cases you can maybe add another zero for the AS/400.
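As an illustrative sketch (not AS/400-specific, and the table, sizes, and timings here are invented, not from the comment): much of that gap comes from whether the engine can satisfy a query from an index instead of scanning every row. A minimal demonstration with SQLite:

```python
# Sketch: full table scan vs. index lookup for the same query.
# Table name, row counts, and data are made up for illustration.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    ((i, i % 10_000, float(i)) for i in range(200_000)),
)

def timed(query, args=()):
    start = time.perf_counter()
    result = conn.execute(query, args).fetchall()
    return result, time.perf_counter() - start

# Without an index, every row is examined.
rows, scan_time = timed("SELECT * FROM orders WHERE customer = ?", (42,))

# With an index, the planner does a B-tree lookup instead.
conn.execute("CREATE INDEX idx_customer ON orders(customer)")
rows2, index_time = timed("SELECT * FROM orders WHERE customer = ?", (42,))

assert rows == rows2  # same answer, very different amount of work
print(f"scan {scan_time:.4f}s vs index {index_time:.6f}s")
```

The same mechanism scales with data volume, which is why the gap widens so dramatically on warehouse-sized tables.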

Yes, if you end up doing things wrong and expect that CPU to do some heavy math, it will be dusted quickly. Otherwise, not so much.

But this goes against most modern 'SV principles', so I don't expect much love from younger generations here. Businesses love it though, and secretly wish all IT could be as reliable and predictable, although that ship sailed a long time ago. One of those 'they don't make them like they used to' cases.



I was part of the project that moved Nintendo’s inventory management system from as/400 to SAP HANA in 2015.

We had massive queries that ran in seconds on the AS/400 that HANA, with millions blown on the consultants, just couldn't complete in time (nightly billing runs).

The solution in the end was that a lot of complexity and decades of cleverness in adjusting to specific patterns in customer behavior were thrown out and replaced by “the consultant will just adjust that every week by hand”.

Also - I was soooo much faster with a num pad and the terminal than clicking around the SAP GUI. Ultimately part of why I left there.


IIRC the AS/400 hardware and OS were optimized for I/O throughput and batch processing. What hardware was SAP HANA being deployed on?

/no experience with SAP, but I've never heard a success story about implementing it, ever


Microsoft completed a very successful SAP implementation, on SQL Server and Windows, after two failed attempts. I am probably one of the few people who worked on all three tries. And prior to that they had a few AS/400s running the business.


Because you only hear of the failures. Visit the SAP homepage to get an idea of who is using it; all of those companies implemented it successfully at one point.

OP isn't saying the SAP implementation wasn't successful, just that it was expensive (no surprise) and some things didn't work as before (no surprise either).


Buuuut HANA is an in-memory DB and the best thing ever since integrated circuits were invented! (if I were to believe Hasso Plattner)


I love their design architecture: a full OS that has been bytecode-based (everything above the kernel infrastructure) since 1988, while those same SV principles keep telling us how WASM is going to change the world.


... and AoT-compiled at installation time! We didn't get this in mainstream software until Android Runtime started AoT-compiling Dalvik bytecode. As I've mentioned elsewhere, Android Runtime shows that AoT compilation doesn't prevent you from dynamically re-optimizing your binaries based on profiling feedback.


We kind of did get this in Windows/.NET with NGEN, and in Windows Phone 8.x and 10, before Android started doing it.

Naturally NGEN has the caveat of only being good enough for faster startup, and of requiring strong-named binaries (i.e. signed DLLs), which meant not everyone adopted it.

Windows Phone 8.x adopted Bartok from Singularity, where applications were precompiled on the store and linked on the device at installation time.

Windows 10 moved full AOT compilation into the store, performed when downloading onto specific devices.

Note that full AOT on Android existed only in versions 5 and 6; from version 7 onwards it is a mix of interpretation, JIT, and AOT, which is nonetheless quite cool.


Why did Android go from full AOT in 5.0 back to a mix of the two? I remember when AOT was "hyped" (not really, but talked about) for Lollipop, but I never heard why they went for a mix instead.


Imagine LLVM-scale compile times for every single Play Store update; additionally, there were some restrictions on reflection usage.

Also, there are several optimizations that are only possible after having PGO data, and that was lost with AOT at installation time.
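A toy sketch of the tiered model described above: interpret first, collect a profile as a side effect, and promote only what turns out to be hot. The threshold and names here are invented; ART's real pipeline (interpreter, JIT, then profile-guided dex2oat while idle and charging) is far more involved.

```python
# Toy tiered runtime: methods start "interpreted"; call counts act as
# the profile, and hot methods get promoted to a "compiled" tier.
# HOT_THRESHOLD and all names are illustrative, not ART's values.

HOT_THRESHOLD = 3

class TieredRuntime:
    def __init__(self):
        self.call_counts = {}  # the "profile" gathered while interpreting
        self.compiled = {}     # methods promoted to the compiled tier

    def call(self, name, fn, *args):
        if name in self.compiled:
            return self.compiled[name](*args)  # fast path, no counting
        self.call_counts[name] = self.call_counts.get(name, 0) + 1
        if self.call_counts[name] >= HOT_THRESHOLD:
            # This is where real PGO data would drive optimization;
            # here we just promote the function unchanged.
            self.compiled[name] = fn
        return fn(*args)  # "interpreted" path

rt = TieredRuntime()
for _ in range(5):
    rt.call("square", lambda x: x * x, 7)

print(rt.call_counts["square"], "interpreted calls before promotion")
```

The point is that compiling everything at install time throws this profile away, whereas the mixed mode keeps collecting it.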


I am very aware that an instance of db2 is deeply embedded into the OS, which is designed to be easily maintained. I deal with one regularly at another corporate site.

You might also find it interesting that the CPU was likely made by GlobalFoundries (assuming their arrangement with IBM is still in place).

I am still trying to get my coworkers excited about VMS on x86; our VAX users are completely uninterested.


You have VAX users?


VMS 7.3.


Who is using VMS in 2024? Use case?


An enormous set of COBOL applications using TDMS.


Are they still actually running on VAX, or something more "modern" like Alpha?


We are using the Charon VAX emulator on an HPE Xeon.

I have my own skunkworks VMS 1.0 on SimH, from these instructions:

https://gunkies.org/wiki/Installing_VMS_V1.0_on_SIMH


I came into contact with z/OS, VSAM, and the like, but I couldn't see how they did relational queries (joins). All I remember is that every file is row-based with structured columns that don't require parsing (and language-integrated in the case of COBOL). What am I missing?
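For context on the question: with keyed files like VSAM there is no SQL engine doing the join; application code walks two files ordered (or keyed) on the join field. A minimal sketch of that pattern in Python, with invented record layouts (in COBOL this would be READ loops over fixed-layout records):

```python
# Sort-merge join done in application code, the way programs join
# keyed flat files when no relational engine is available.
# Record fields ("cust", "name", "total") are made up for illustration.

def merge_join(left, right, key):
    """Join two lists of record dicts on a shared key field."""
    left = sorted(left, key=lambda r: r[key])
    right = sorted(right, key=lambda r: r[key])
    i = j = 0
    out = []
    while i < len(left) and j < len(right):
        lk, rk = left[i][key], right[j][key]
        if lk < rk:
            i += 1
        elif lk > rk:
            j += 1
        else:
            # Emit every right-side record sharing this key value.
            group_start = j
            while j < len(right) and right[j][key] == lk:
                out.append({**left[i], **right[j]})
                j += 1
            i += 1
            # Re-scan the group if the next left record has the same key.
            if i < len(left) and left[i][key] == lk:
                j = group_start
    return out

customers = [{"cust": 1, "name": "A"}, {"cust": 2, "name": "B"}]
orders = [{"cust": 1, "total": 10.0}, {"cust": 1, "total": 5.0},
          {"cust": 3, "total": 7.0}]
print(merge_join(customers, orders, "cust"))
```

With keyed access (KSDS) the outer loop would instead do direct lookups by key, but the join logic still lives in the program, not the file system.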


z/OS is their mainframe OS. The AS/400 is something different. Different hardware and different software.


The OS/400 operating system ran on AS/400 computers. The successor to OS/400 is IBM i:

https://en.wikipedia.org/wiki/IBM_i



