Oracle has a horrible reputation among devs. But I think they bypass devs for purchase decisions and straight up wine and dine and bamboozle the low information execs.
When I worked at a megacorp as a dev, I had near zero say in such purchase decisions. I had to work with what I was given. Thankfully I work for a much smaller shop now. Better pay and much better decision autonomy.
jasode 6 hours ago [-]
>Oracle has a horrible reputation among devs. But I think they bypass devs for purchase decisions and straight up wine and dine and bamboozle the low information execs.
That is only partially true. Oracle has a wide portfolio of a bunch of products and the "wine & dine the execs" is the sales cycle for software like ERP Oracle E-Business Suite and Oracle Health (Cerner). E.g. it's the hospital CFO & CIO that are the true "customers" of Oracle Health and not the frontline doctors and nurses that use it.
However, for the Oracle RDBMS database ... it was often the developers that chose it over competitors such as IBM DB2, Sybase, Informix, MS SQL Server.
In the late 1980s and early 1990s, a lot of us devs did a "bake off" with various databases and Oracle often won on technical merit. E.g. Oracle had true row-level locking but MS SQL Server before v6.5 was page-level locking. And the open source alternatives of MySQL and PostgreSQL at that early timeframe were not mature enough to compete with advanced Oracle features such as Parallel Query Execution and Recovery from a Standby database.
So young devs today who aren't aware of history will wonder why Amazon ever got locked into Oracle in the first place?!? It was because in 1994, Oracle db was a very reasonable technical choice for devs to make.
abirch 6 hours ago [-]
I’d take oracle over MS SQL any day. That said I’d take PostgreSQL over oracle any day.
deskamess 5 hours ago [-]
Curious if that is from recent experience? I used Oracle decades ago but these days it's all SQL Server.
abirch 5 hours ago [-]
Yes, it's the isolation levels. I want row-level locking. I want to have 1 insert and 20 selects running at the same time without blocking on a lock.
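A toy illustration of the property being asked for, using SQLite in WAL mode as a stand-in for MVCC (neither Oracle nor Postgres is assumed available here): a reader is not blocked by an open, uncommitted write transaction, and sees only committed data.

```python
import os
import sqlite3
import tempfile

# SQLite in WAL mode as a stand-in for MVCC: selects keep running
# alongside an in-flight insert instead of waiting on a lock.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path, isolation_level=None)  # manage txns manually
writer.execute("PRAGMA journal_mode=WAL")
writer.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
writer.execute("INSERT INTO t VALUES (1, 'committed')")

# Open a write transaction and leave it hanging, uncommitted.
writer.execute("BEGIN IMMEDIATE")
writer.execute("INSERT INTO t VALUES (2, 'pending')")

# A second connection still reads without blocking, and sees only
# committed data -- the pending insert is invisible to it.
reader = sqlite3.connect(path, timeout=1)
rows = reader.execute("SELECT v FROM t ORDER BY id").fetchall()
print(rows)  # [('committed',)]

writer.execute("ROLLBACK")
```

In the classic page- or table-locking model the reader would instead block (or time out) until the writer committed.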
forinti 7 hours ago [-]
Devs get Oracle's best side. They don't know how lucky they are. PL/SQL is a fine language and Oracle throws everything and the kitchen sink into the APIs available inside the database.
As a sysadmin/dba I get to handle the nasty side of Oracle: the bugs, the patches that fail, the redundant tools, the wordy documentation that always feels like a never-ending advert.
hahn-kev 4 hours ago [-]
The problem is that putting code into your database has all the problems of data migration with no testability. It's really hard to write tests and debug code compared to a proper language.
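A toy example of this point (names and the rule itself are invented): the same logic written as a plain function is unit-testable with zero infrastructure, whereas the equivalent stored procedure (PL/SQL, PL/pgSQL, T-SQL) needs a live database instance just to be exercised.

```python
# Hypothetical business rule kept in the application layer rather than
# in a stored procedure, purely to show the testability difference.
def round_dose(mg: float, step: float = 0.5) -> float:
    """Round a requested dose down to the nearest dispensable step."""
    if step <= 0:
        raise ValueError("step must be positive")
    return (mg // step) * step

# The whole "test suite" is three lines, runnable anywhere:
assert round_dose(2.7) == 2.5
assert round_dose(2.5) == 2.5
assert round_dose(0.4) == 0.0
```

The PL/SQL version of the same rule can only be tested against a running instance, with schema, grants, and test data set up first.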
aaronbaugher 4 hours ago [-]
I took over support for a small ISP after the previous sysadmin died unexpectedly. He was a big fan of putting code in the PostgreSQL database, which I had never done before. That was fun to figure out while debugging issues.
vkazanov 6 hours ago [-]
When I was young and beautiful and naive I enjoyed playing with setting things up, even though my main focus was on backend dev. Like, set up Postgres, tweak settings for performance, that kind of thing. So I was the first one to get the "figure out the on-site installation steps" task, to share with the client's admin.
Our small company was building enterprise things for large clients. And one of them wanted Oracle on Windows Server. They also wanted a failover setup. How hard could that be?
Now I hate Oracle. I hate Oracle consultants. I despise the ignorance.
My favourite bit was the Oracle uninstall procedure. I had like 4 pages printed just for that case.
Oracle bad. Postgres good.
cduzz 5 hours ago [-]
Oracle is in the small family of companies that have a business model of "you give us money and we solve _all_ your problems."
So it's likely that, for a specific task, the oracle solution is crap, but oracle has crap for everything so the oracle sales droid can sell a "one throat to choke, one check to write" policy to a company that likely has technology problems but produces a "not technology" product.
Amazon's moving into this space and their crap is even stickier than oracle's...
reactordev 8 hours ago [-]
This is exactly right - they say the right things, to the right people, over the right sized bribe rib.
anonymousiam 3 hours ago [-]
The same could also be said about Microsoft.
2OEH8eoCRo0 6 hours ago [-]
Steaks and strippers!
InDubioProRubio 6 hours ago [-]
Miss'd Steaks were made!
Pinus 7 hours ago [-]
Is this the same Oracle/Cerner system ("Millennium", I believe) that, despite protests from the medical staff, was deployed "big-bang style" in the Swedish region of Västra Götaland, with much the same results? (And where during a press conference, where the management was explaining how nearly everything was going to plan, a doctor, who had somehow sneaked in, got up, shouted something to the effect of "you’re lying, it’s a bloody disaster!" and stormed out.)
(That was not an outage, though — as far as I understand the system was working, it just didn’t actually work...)
shigawire 6 hours ago [-]
Swedish privacy laws for EHRs are quite different compared to other countries.
So I'm not surprised any software not purpose-built for that market would fail. And we know Oracle is not investing in R&D for Millennium since all efforts are put into their forthcoming AI-based EHR - whatever that actually means....
harvey9 6 hours ago [-]
Could be, but oracle only bought cerner a few years ago and millennium has been around much longer. Other EMR systems are also dreadful - my spouse has some stories about EPIC for example.
wiseowise 11 hours ago [-]
“Engineers”
Must be engineers who write requirements and unrealistic deadlines that lead to such issues.
devoutsalsa 11 hours ago [-]
I just watched a YC video about vibe coding. He says he used Claude Sonnet 3.7 to configure his DNS servers & Heroku hosting. What could possibly go wrong?!
I managed to break production before vibe coding was cool.
20-year-old me had rm -fr root access to way too many systems.
I don't think it's much different today.
If anything, I think the youngsters will learn faster from their mistakes because they already have a good mentor for the easy stuff and will get ground on the hard stuff sooner.
devoutsalsa 8 hours ago [-]
Maybe. One thing I've found is the LLMs give me a lot of short term leverage. I can (get an LLM to) program in a language I know almost nothing about. Then I hit a wall where I don't know enough to get the LLM to fix something I don't know how to ask for, and then I'm stuck. When vibe coding, I'm not learning how to solve problems myself so much, and that means anything the LLM can't do, I also can't do. When I'm coding on my own, I'm picking up lessons along the way that help me build a foundation for incrementally harder projects.
apercu 6 hours ago [-]
> If anything, I think the youngsters will learn faster from their mistakes because they already have good mentor for the easy stuff and will get grinded on the hard stuff sooner.
Maybe? Back when I had to troubleshoot coaxial network terminations uphill both ways in the snow, we had to learn how things actually work (e.g., what's in a tcp header, how to read an RFC) and that made debugging things a little more straightforward.
I'm pretty insulated from young developers these days, but my limited experience is that they might know the application, presentation and maybe session layers (if using the old OSI model) but everything below that is black magic.
lmz 6 hours ago [-]
On the other hand, understanding coax termination is much easier than figuring out what 5G or Wifi6 is doing with radio waves.
bborud 7 hours ago [-]
I'm not so sure. I did an experiment last year. I was writing a prototype in a language I'm not well versed in, using unfamiliar tools and only somewhat familiar libraries. I made heavy use of ChatGPT and other tools to get the job done. (Well, actually, I wrote several pieces of software this way, but only one major for-work prototype).
Some observations:
Initial velocity was high. It felt like I was getting things done. This was exciting and satisfying.
The code I wrote was structurally unsound as it ended up having no overall plan. This became more and more evident as the prototype was completed. Yes, it was a prototype, but usually I use prototypes to think about structure. (Choosing the right way to structure a problem is a key part of solving the problem)
My retention of new knowledge was terrible. Ditto for developing any deep understanding of the tools and libraries I was using. I had skipped the entire process by which I usually learn and just had the LLMs tell me solutions. This provided a poor basis for turning the prototype into production code because I had to catch up on all the thinking that should have been part of producing the prototype. And I had to sit down and read the documentation.
LLMs are only as good as your prompts. You actually need to know quite a bit in order to formulate good prompts. Being a fairly experienced programmer (and also having some knowledge of LLMs and a former career in web search) I had significant advantages novices do not have.
Now imagine a novice who lacks the experience to see the shortcomings of their code (hey, it works doesn't it!?) and the ability to introspect and think about why they failed.
Half of the time the LLM would lead me astray and pursue paths that resulted in poor solutions when better solutions would have been more obvious if I'd just read documentation. It is easy to become focused on exhausting paths that the LLM sent you down in the first place and poke it to give you something that works.
I have nearly 40 years of programming experience. It scared me how stupid relying too much on a LLM made me. Yes, it got me started quickly and it feels faster. However the inflection point that comes some time into the learning process, where things start to click, didn't materialize. Mostly I was just poking the LLM in various ways to make it spit out solutions to tiny parts of the problem at a time.
I still use LLMs. Every day. But I use them more as a fancy search engine. Sometimes I use them to generate ideas - to try to detect blind spots (solutions I don't know about, alternative approaches etc). Having been burnt by LLMs hallucinating, I consistently ask LLMs to list their sources and then go and look at those.
LLMs are *NOT* mentors. A mentor isn't someone who does the thinking and the work for you. A mentor is also not an alternative to reference material or the means by which you find information. You're expected to do that yourself. A mentor is not someone who eliminates the need to grind through things. You have to grind through things to learn. There is no alternative if you are going to learn.
A mentor is someone who uses their experience and insight to help your personal growth.
nxm 10 hours ago [-]
How do you know requirements or deadlines were a contributing factor here?
wiseowise 10 hours ago [-]
Because in 99% cases it is “when we succeed it’s a team effort, when we fail it’s the engineers”.
rat9988 9 hours ago [-]
Feels like the exact opposite story when I read engineers commenting online.
aeonik 4 hours ago [-]
Engineers aren't shy about eviscerating each other's work when mistakes are made—sometimes too eager, frankly.
Whole courses are built around forensically dissecting every error in major systems. Entire advanced fields are written in blood.
You probably don't hear about it often because the analysis is too dense and technical to go viral.
At the same time, there's a serious cultural problem: technical expertise is often missing from positions of authority, and even the experts we do have are often too narrow to bridge the levels of complexity modern systems demand.
parthdesai 9 hours ago [-]
Oh come on, let's be honest. All of us have colleagues who we don't trust with mission critical stuff.
iinnPP 8 hours ago [-]
Your manager should also know that though.
parthdesai 6 hours ago [-]
Yes, but that's not the point OP was making here
freehorse 8 hours ago [-]
How do we know that it was just a "human error" as the article/headline implies?
Answer: we do not know either, but this is the standard response so that companies (or governments, or whoever this concerns) are absolved of any responsibility. In general, it is not possible to know for a specific case until a proper investigation is carried out (which is rarely the case). But most of the time, experience says that it is company policies/procedures that either create circumstances that make such errors more probable, or simply allow these errors to happen too easily due to a lack of guardrails. And usually it is due to a push for "shipping products/features fast" or similar, with little concern for anything else.
It could be a different case here, but seeing that it is about Oracle, and keeping in mind that Oracle is really bad at taking accountability for anything going wrong, I doubt it (very recently they were denying a hack of their systems and the leak of data of companies that use them until the very last moment).
sofixa 10 hours ago [-]
It's a very common cop out by engineers/technical folks when something goes wrong somewhere else, to blame it on management/deadlines/customers/etc.
Sometimes people make mistakes, sometimes people are incompetent, sometimes managers suck, sometimes it's a multi-layered issue.
freehorse 8 hours ago [-]
Because some human errors are bound to happen, what is often missing are procedures to minimize them, catch them, or prevent them from having catastrophic effects.
It is not even about being technical. Have a person put data in a spreadsheet and you can get so many errors if the procedure for doing it is bad.
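A hypothetical sketch of the kind of guardrail being described: a cheap validation pass that rejects obvious mistakes before manually entered data goes anywhere important. The column names and rules here are invented for illustration.

```python
import csv
import io

# Invented schema: two required columns for a manual data-entry sheet.
REQUIRED = ["patient_id", "dose_mg"]

def validate(csv_text: str) -> list[str]:
    """Return a list of human-readable errors; empty means the sheet is clean."""
    errors = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
    if missing:
        return [f"missing columns: {missing}"]
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        if not row["patient_id"].strip():
            errors.append(f"line {lineno}: empty patient_id")
        try:
            float(row["dose_mg"])
        except ValueError:
            errors.append(f"line {lineno}: dose_mg is not a number")
    return errors

print(validate("patient_id,dose_mg\nA17,2.5\n,oops\n"))
```

Nothing sophisticated, but it converts "someone typed the wrong thing" from a silent corruption into a visible, fixable error at entry time.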
apercu 6 hours ago [-]
> It's a very common cop out by engineers/technical folks when something goes wrong somewhere else, to blame it on management/deadlines/customers/etc.
Look, I've been an executive and a management consultant for a long time now (started as a sysadmin and developer), and it's quite often the case that everything was late (decisions, budgets, approvals, every other dependency) but for some reason it's ok that the planned 4 months of development is now compressed into 2.5 months.
I have been involved to some degree or another in probably close to 300 moderately complex to highly complex tech projects over the last 30 years. That's an extremely conservative number.
And the example I describe above is probably true for 85% of them.
DontchaKnowit 5 hours ago [-]
Idk, when I worked at Cerner, literally anyone could ssh into a hospital's Cerner instance and drop tables if they wanted to. Nothin' stoppin' you.
I would regularly write massive update/insert statements on production DBs to fix issues.
So, yeah, id imagine this was the engineers fault.
netdevphoenix 7 hours ago [-]
Title should be "Oracle execs caused five-day software outage at US hospitals". If the systems helped save lots of lives, you bet the engineers wouldn't be thanked for that, and it probably wouldn't even be a news article.
nojito 7 hours ago [-]
The engineers are the ones who deleted the db though.
johnwatson11218 2 hours ago [-]
I think the Oracle Transaction Manager is one of the best pieces of software that I have had to work with in a professional setting. Lots of other stuff in an enterprise setting is very flaky and follows trends, but the Oracle internals seem very nice.
amelius 9 hours ago [-]
Doesn't matter. They have all the good lawyers.
DontchaKnowit 5 hours ago [-]
Used to work on CHS systems for Cerner. If I am not mistaken they were a "CommunityWorks" client, which meant their databases were shared with a number of other clients - a "multitenant" environment, we called it. Completely bumblefucked design. Not surprised something like this happened.
Also - Cerner software in general allowed hospitals to freely and completely fuck up their own architecture. Sometimes irreparably.
If anyone has details about how this happened I'd love to hear.
phaedrus441 7 hours ago [-]
Can't wait for the overpriced, late, outdated Oracle to get deployed at the VA! We get to go from one bad EHR to another...
xupybd 8 hours ago [-]
I hate working on production systems and I hate any action that can't be easily reversed like deleting stuff.
cheema33 8 hours ago [-]
Yeah, I like working under ideal conditions too. And I also want a toilet made of solid gold.
aeonik 5 hours ago [-]
Why solid gold? It would be way too ductile, heavy, and conduct far too much heat.
Diamond might actually be better: low surface energy means a low coefficient of friction, so it would be much easier to clean. It would still suck the heat right out of your cheeks, though.
Realistically, porcelain or other ceramics are probably the ideal material.
You sound sarcastic. It's not too much to ask to have all actions reversible in 2021.
djoldman 6 hours ago [-]
Is it possible to fix EHR without drastic changes in the health/pharma/hospital/medicine regulation environment?
exabrial 5 hours ago [-]
As a previous Cerner employee, this is actually pretty good.
buyucu 11 hours ago [-]
I don't understand why anyone buys Oracle. Every Oracle product I used over the last 15 years has been awful.
jeroenhd 9 hours ago [-]
Oracle offers everything. Databases, web frameworks, programming languages, CRMs, cloud management tools, auto scaling clusters, identity management, BI, document search, email servers, colocation, mainframes, calendar sync. You name it, they've got it. Oracle has products ready to go to run an entire country if it needs to.
Sure, Postgres beats OracleDB, but Postgres doesn't integrate as well with Oracle Fusion and you need to migrate the code yourself. They're like SAP: they're big enough that you can make a career just out of configuring their software packages and make good money while doing so.
It's expensive and certainly not the best, but it's reasonably stable and has a huge company backing it. Oracle won't disappear any time soon, and they're not as likely as Google or Microsoft to shut down their services with only a few years' notice.
In some countries, Oracle is also very good at doing what Google and Microsoft are doing to students. The Brazilian programmers I've spoken to specifically learned OracleDB when they were taught relational databases. They learned to program in Java, and I'm sure Oracle also sponsored other parts of the professional tooling they got to use for free. Microsoft, on the other hand, didn't seem so generous towards their educational facility (no free MS tooling for their schools like they offer over here). If all you know is Photoshop/Windows/Maya/OracleDB/iOS, you're going to look for jobs where you can use Photoshop/Windows/Maya/OracleDB/iOS, and employers looking for cheap juniors will need to offer Photoshop/Windows/Maya/OracleDB/iOS to make the best use of them.
parthdesai 9 hours ago [-]
> Sure, Postgres beats OracleDB
Are we sure? I'm by no means a DBA, but DBA at our company (who is freaking smart btw) said if money wasn't an issue, he would actually go with OracleDB.
jeroenhd 9 hours ago [-]
OracleDB forbids benchmarking it in the license, so I just presume it's inefficient and too slow to compete.
parthdesai 6 hours ago [-]
An example: Oracle supports global unique indexes when a table is partitioned; Postgres doesn't.
Again, I haven't worked with OracleDB at all, and my Postgres knowledge is limited, but judging without having experience of both systems isn't fair to either DB IMO.
ie21 9 hours ago [-]
Same.
I work with engineers and technical managers with 25+ years of experience, building and maintaining serious business software 'you could run a country with'. People here build React web apps, do scientific research, or work for a SaaS provider - a completely different view from building highly complex, regulated, mission-critical software that is supposed to run for decades and be supported at this level.
collingreen 7 hours ago [-]
I think it's an arrogant and naive mistake to think nobody reading hacker news has ever built anything complex, regulated, mission critical, or intended to last.
A more useful line of conversation might be discussing the vastly different requirements and environments (both physical and bureaucratic) that span our industry. Right now I'm a one man dev team slinging multiple products as fast as I can, trying to find revenue as the runway burns up. It would be silly to think everyone is in my same position with my same tradeoffs and I don't expect that to be particularly controversial.
If you have some good insight about when Oracle products are particularly well suited to the task I think many folks would love to read and discuss it. If you just want to act like you're the only one taking your job seriously then I suggest you just save your keystrokes and everybody's time.
Spooky23 6 hours ago [-]
The Oracle database is an amazing piece of software. The problem with it is as open source and SQL Server started eroding its share, Oracle pivoted to owning vertical line of business software.
My employer probably spends more money on databases for our learning management system than we do for one of our main customer facing apps with thousands of concurrent users. It’s literally a tally of training courses.
metadata 8 hours ago [-]
Postgres is a better choice for 99% of the companies out there. But there are cases where you need ability to massively scale AND control your database cluster perfectly.
Postgres won't even let you force an execution plan and it ignores hints (yes, there is an extension for that), so your optimized query can at some point just 10x in execution time and increase load in production when the plan changes.
In Oracle, I am told you can prioritize certain queries at certain time of day - it's crazy what it can do. Yes, it's slow and expensive. If you have money to throw at the problem, it's fast and it solves your problem, whatever the scale. Their Exadata cluster, for example, is wicked fast storage layer pre-filtering the data it sends to the database.
Of course, I despise their business practices - especially the abuse of customers via audits. As a database, it absolutely has its place regardless of lobbying, corruption, and whatever else they are doing.
akoboldfrying 6 hours ago [-]
> Postgres won't even let your force an execution plan and ignores hints
Finally, an actual technical argument. I agree that PostgreSQL's absolute insistence on trusting the query optimiser to Do The Right Thing is weird and annoying (despite being sound general strategy). It even seems to contradict its own general spirit of being on the whole extremely customisable -- you can make your own data types, operators, indexing data structures, complete scripting language plugins... but not, ya know, a way to 100% guarantee that this here query will use this here execution plan.
anonzzzies 8 hours ago [-]
Beats it at what? Not at mission-critical, multi-decade setups - only at things that can easily be replicated by hiring someone from a Postgres consultancy. Oracle handles every use case you throw at it: maybe not optimally, but that is not what you care about at that level anyway. They suck, but what are the actual alternatives? Not Postgres or Supabase for most orgs of significant size.
bn-l 9 hours ago [-]
Damn I’m wondering now if that’s why we studied Java as our first programming language in Australia at uni for CS.
Oracle have a relatively big presence here, and there's a comfortable "mates" system that runs Australia (soft bribery).
jeroenhd 8 hours ago [-]
As much as I hate Oracle, there are huge advantages to Java. Java is strongly typed, free, and runs damn near anywhere. It's pretty much the go-to language when people think about OOP and for good reason. It has some excellent open source IDE support and it's widely used in the industry. With the current OpenJDK setup, it's also free of Oracle's licensing issues. It runs fast and can support most algorithms you'll probably ever study, without the manual memory management troubles of something like C++.
I was taught C# in uni for very similar reasons except the entire uni ran on Windows and the Microsoft platform, which made doing assignments on Linux rather inconvenient. With the status of dotnet core, I'd say Java finally has a good competitor when it comes to teaching OOP languages.
chickenzzzzu 9 hours ago [-]
Java has been a staple of university courses since before Oracle bought Sun. I guess I am getting old now, if the kids all want to program in Python and Javascript for 4 years...
knifie_spoonie 8 hours ago [-]
I was at Uni in Australia well before Oracle bought Sun. Java was used quite extensively then too.
anonzzzies 8 hours ago [-]
Java was taught far before that: we got taught the first public beta version in the 90s in uni. Thought it was nonsense compared to c, prolog, lisp and Haskell.
newsclues 8 hours ago [-]
I think it’s from Sun Microsystems who had the friendly to edu strategy
buyucu 8 hours ago [-]
Yes Oracle offers everything, but everything it offers is low-quality.
selivanovp 11 hours ago [-]
Because Oracle comes to any country/industry with trucks of money to corrupt officials and lobby their adoption.
And after a few years you find yourself in a situation when you already paid for Oracle so much, integrated it so deeply, so switching to any alternative is a massive pain and in most cases it’s safer and easier to keep paying Oracle.
netdevphoenix 9 hours ago [-]
I believe this is how they grew and how they remain big. While smaller companies aim for managers, Oracle targets CTOs and CEOs, takes them out for expensive dinners and promises them the world. Then it keeps them legacy-handcuffed forever. And virtually no one chooses to invest crazy amounts of money in migrating or even starting a new codebase when the company's lifeblood has been poured into the Faustian Oracle machinery.
1718627440 7 hours ago [-]
Are managers really not doing background checks about company behaviour before signing a contract?
netdevphoenix 3 hours ago [-]
The ones targeted by Oracle salespeople don't. And when they do, it's too late. And most just hope things "will work out". I was third party witness to one of these transformations and I correctly predicted the chaos, uncertainty and sense of powerlessness that it would cause
goodpoint 10 hours ago [-]
They also corrupt management in the companies buying their products. Same with two other very big software vendors.
jackvalentine 11 hours ago [-]
Nobody bought Oracle here - they bought Cerner which was gobbled up in 2022.
chickenzzzzu 11 hours ago [-]
Disclaimer: I am not inciting violence. This is polemic humor for entertainment purposes only. Void where prohibited.
Theory: A society collectively buys Oracle when it no longer views armed revolution as acceptable.
Scoundreller 10 hours ago [-]
Sorta, Cerner uses an Oracle backend
vlovich123 10 hours ago [-]
I know they get a lot of hate but I personally used Oracle’s cloud and it seemed like a decent piece of infrastructure with really solid engineering team behind it. They had their share of problems (and their first line of support is really really bad and not well suited to handle real issues) but not really any different from any other similar products.
ManBeardPc 7 hours ago [-]
I had a similar experience. The database just stopped working one day, triggered by a normal restart without any updates or patches. I fixed the configuration, tested it, restarted multiple times, and everything worked. After a week it was broken again, and I wasn't able to fix it except by doing a fresh reinstall on a clean VM.
Luckily more and more customers switched to Postgres and I no longer have to deal with it.
gonzo41 11 hours ago [-]
It's legacy lock-in. You can't just switch stuff. It's incredibly hard to move logic out of stored procedures from one DB to another. It's the same reason why the US government runs on COBOL. People don't think about the strategic implications of the tech they choose and how it may come back to limit them in the future.
8fingerlouie 11 hours ago [-]
Most Oracle "shops" i know have used it for decades. When they started using it, there weren't many options, so Oracle was what was used.
COBOL is in the same category. When invented, it was the absolute easiest programming language to learn and use, so of course it gained popularity.
It then turned out to be rather good at what it did, along with running on hardware that "never fails", so most places didn't even think about replacing it until sometime in the 90s.
Also keep in mind that the reason companies are migrating away from COBOL is not due to the language as much as due to young people not taking an interest in COBOL and Mainframes, making it hard to recruit new talent.
Even then, a migration away from a typical mainframe is a huge task. In most cases you're talking 50-60 years of "legacy" (still running, still working, still updated) code, and replacing a system that has evolved for half a century is not an easy undertaking, at least not if you plan on getting it 100% right the first time.
netdevphoenix 9 hours ago [-]
> young people not taking an interest in COBOL and Mainframes
It is more like young people not wanting to:
- throw their careers out of the window by pigeonholing themselves into zombieware tech
- experience high levels of stress trying to debug code older than their parents, writing code that can't be unit tested and pushing said code to production
consp 8 hours ago [-]
> writing code that can't be unit tested and pushing said code to production
Isn't that just "move fast" these days?
abduhl 6 hours ago [-]
It’s called vibe coding now.
pastage 11 hours ago [-]
> due to young people not taking an interest in COBOL and Mainframes
Having only fleeting professional experience with COBOL during a summer, my view of it is that it is a mix of data-analyst work and programming, where the programming is horrible and the report-making is OK though archaic. As long as you modify processes already available it is not so bad, but the developer experience was horrible.
With all that said I actually liked ideas in COBOL but it is an extremely niche language that does not serve you at all in the real world.
decimalenough 10 hours ago [-]
The "real world" of airline reservations, the world's financial infrastructure and basically anything involving hardcore real data processing runs on COBOL.
But yeah, if you're looking to code up a progressive web app or next blockbuster MMORPG, I wouldn't recommend COBOL.
pastage 8 hours ago [-]
Fair point! From a personal perspective it was about interfacing with the real world like parsing data, bitbanging or writing drivers. I actually have no problem with doing BI web apps with COBOL.
sofixa 10 hours ago [-]
Amadeus, the world's top-2 airline reservation system, has had everything ported to Kubernetes for years now. In fact they were among the top contributors to the project a few years back.
It's all a matter of age and maturity. Nobody starting an airline, or financial company, or "hardcore data processing" today even bothers considering a mainframe or COBOL.
8fingerlouie 9 hours ago [-]
In fact, starting over is often the easiest thing to accomplish, compared to porting 50 years of legacy code.
oblio 8 hours ago [-]
At least as much mission critical software runs on C++, Java, and of course, plain old C.
I wouldn't portray Cobol to be some sort of magic "hardcore" pixie dust for anything.
baridbelmedar 10 hours ago [-]
It's pretty common for business contracts to just end up stuck on mainframes or scattered across various systems from vendors like Oracle (the paper trails were often thrown away a long time ago).
And let's be honest, a lot of folks in IT aren't exactly top performers and don't seem to care all that much. It's really the developers you find on forums like this who are genuinely passionate. You're not likely to bump into the people actually buying those big Oracle or IBM systems around here though :)
surfingdino 8 hours ago [-]
I'd venture to say it was compounding interest on tech debt, but that doesn't sound as sexy, does it?
mrweasel 7 hours ago [-]
> engineers conducting maintenance work mistakenly deleted critical storage connected to a key database
I'd say poor process management. Why is an engineer even deleting critical storage? (I take that to mean they were deleting something at the file-system level.) Perhaps they were dropping a database, but you wouldn't do it like that in a critical environment. You'd disable database access first, and then after some time, weeks, you'd drop the database, after taking a final backup.
It could also be disconnecting a SAN, deleting a storage pool, something like that, but your process should say: check for read activity, take the storage offline, do anything non-destructive first, and only later, once you've verified that everything runs without this resource, do you delete.
At previous jobs I've worked with healthcare systems. You have processes, you follow those processes to the letter, and you never delete anything as your first step. Deleting is almost always done by going into read-only mode and ageing out the data.
The fact that the recovery time is four days tells me that no one followed a single procedure. There should be a written step-by-step plan, including recovery and risk assessment, and when the change manager sees "Recovery time: four days" they will ask questions and stop you.
This is probably the only answer less sexy than technical debt.
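The go-read-only-first discipline described above can be sketched as a tiny state machine. This is a toy model with hypothetical names, not any vendor's tooling; the point is just that deletion is only reachable through a non-destructive dwell period:

```python
from enum import Enum


class State(Enum):
    ACTIVE = 1
    READ_ONLY = 2
    DELETED = 3


class StorageResource:
    """Toy model of a storage resource that must be aged out before deletion."""

    def __init__(self, name: str):
        self.name = name
        self.state = State.ACTIVE
        self.reads_since_readonly = 0

    def make_read_only(self):
        # Step 1: non-destructive -- stop writes, keep serving reads.
        self.state = State.READ_ONLY

    def record_read(self):
        # Any read during the dwell period proves something still uses this.
        if self.state == State.READ_ONLY:
            self.reads_since_readonly += 1

    def delete(self):
        # Step 2: destructive -- only allowed after the read-only dwell
        # period has shown that nothing still depends on the resource.
        if self.state != State.READ_ONLY:
            raise RuntimeError("must go read-only before deleting")
        if self.reads_since_readonly > 0:
            raise RuntimeError("resource is still being read; abort deletion")
        self.state = State.DELETED
```

A change plan then becomes: `make_read_only()`, wait weeks while monitoring `record_read()` counts, and only then attempt `delete()`, which refuses to run if anything touched the resource in the meantime.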
When I worked at a megacorp as a dev, I had near zero say in such purchase decisions. I had to work with what I was given. Thankfully I work for a much smaller shop now. Better pay and much better decision autonomy.
That is only partially true. Oracle has a wide portfolio of a bunch of products and the "wine & dine the execs" is the sales cycle for software like ERP Oracle E-Business Suite and Oracle Health (Cerner). E.g. it's the hospital CFO & CIO that are the true "customers" of Oracle Health and not the frontline doctors and nurses that use it.
However, for the Oracle RDBMS database ... it was often the developers that chose it over competitors such as IBM DB2, Sybase, Informix, MS SQL Server.
In the late 1980s and early 1990s, a lot of us devs did a "bake off" with various databases and Oracle often won on technical merit. E.g. Oracle had true row-level locking but MS SQL Server before v6.5 was page-level locking. And the open source alternatives of MySQL and PostgreSQL at that early timeframe were not mature enough to compete with advanced Oracle features such as Parallel Query Execution and Recovery from a Standby database.
E.g. the C Language programmer Shel Kaphan at Amazon chose Oracle in 1994: https://www.linkedin.com/posts/jpcharles_in-1994-amazons-fir...
(that anecdote cited this deep link: https://www.youtube.com/watch?v=u3qIWN-ZIPk&t=1h11m56s)
It took Amazon 25 years to finally migrate off of all Oracle databases: https://www.google.com/search?q=oracle+shuts+off+last+oracle...
So young devs today who aren't aware of history will wonder why Amazon ever got locked into Oracle in the first place?!? It was because in 1994, Oracle db was a very reasonable technical choice for devs to make.
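The row-level vs. page-level locking difference that decided those bake-offs is easy to illustrate outside of any database: with one coarse lock, writers to different rows serialize; with per-row locks, they overlap. A minimal sketch with plain Python threads (nothing Oracle- or SQL Server-specific, just the concurrency idea):

```python
import threading
import time

rows = {"r1": 0, "r2": 0}

# "Page-level" locking: a single lock guards every row.
table_lock = threading.Lock()

# "Row-level" locking: an independent lock per row.
row_locks = {key: threading.Lock() for key in rows}


def update_coarse(key):
    with table_lock:      # blocks writers of ALL rows
        time.sleep(0.05)  # simulate work done inside the transaction
        rows[key] += 1


def update_fine(key):
    with row_locks[key]:  # blocks only writers of the SAME row
        time.sleep(0.05)
        rows[key] += 1


def run(worker):
    """Update both rows concurrently and time how long it takes."""
    threads = [threading.Thread(target=worker, args=(k,)) for k in rows]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start


coarse_time = run(update_coarse)  # the two updates serialize (~2x the work)
fine_time = run(update_fine)      # the two updates overlap
```

With 20 selects and inserts contending on one page lock, that serialization penalty is exactly the "1 insert blocks 20 selects" complaint from the thread above.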
As a sysadmin/dba I get to handle the nasty side of Oracle: the bugs, the patches that fail, the redundant tools, the wordy documentation that always feels like a never-ending advert.
Our small company was building enterprise things for large clients, and one of them wanted Oracle on Windows Server. They also wanted a failover setup. How hard could that be?
Now I hate Oracle. I hate Oracle consultants. I despise the ignorance.
My favourite bit was the Oracle uninstall procedure. I had like 4 pages printed just for this case.
Oracle bad. Postgres good.
So it's likely that, for a specific task, the Oracle solution is crap, but Oracle has crap for everything, so the Oracle sales droid can sell a "one throat to choke, one check to write" policy to a company that likely has technology problems but produces a "not technology" product.
Amazon's moving into this space, and their crap is even stickier than Oracle's...
(That was not an outage, though — as far as I understand the system was working, it just didn’t actually work...)
So I'm not surprised any software not purpose-built for that market would fail. And we know Oracle is not investing in R&D for Millennium, since all efforts are put into their forthcoming AI-based EHR - whatever that actually means...
It must be the engineers who write the requirements, and the unrealistic deadlines, that lead to such issues.
"How To Get The Most Out Of Vibe Coding | Startup School " => https://www.youtube.com/watch?v=BJjsfNO5JTo&t=494s
I like his advice on downloading relevant documentation and putting it in the code base. That makes sense to me for targeted use cases.
20-year-old me had rm -fr root access to way too many systems.
I don't think it's much different today.
If anything, I think the youngsters will learn faster from their mistakes, because they already have a good mentor for the easy stuff and will get ground on the hard stuff sooner.
Maybe? Back when I had to troubleshoot coaxial network terminations uphill both ways in the snow, we had to learn how things actually work (e.g., what's in a TCP header, how to read an RFC), and that made debugging things a little more straightforward.
I'm pretty insulated from young developers these days, but my limited experience is that they might know the application, presentation and maybe session layers (if using the old OSI model) but everything below that is black magic.
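The "what's in a TCP header" kind of knowledge is quite concrete: the fixed part of the header is 20 bytes you can pull apart with `struct`. A minimal sketch following the RFC 9293 field layout (the sample segment at the end is hand-built for illustration):

```python
import struct


def parse_tcp_header(data: bytes) -> dict:
    """Parse the 20-byte fixed part of a TCP header (RFC 9293)."""
    (src, dst, seq, ack, off_flags,
     window, checksum, urgent) = struct.unpack("!HHIIHHHH", data[:20])
    return {
        "src_port": src,
        "dst_port": dst,
        "seq": seq,
        "ack": ack,
        # Top 4 bits of off_flags: header length in 32-bit words.
        "data_offset": (off_flags >> 12) * 4,
        # Low bits of off_flags: the control flags.
        "flags": {
            "FIN": bool(off_flags & 0x01),
            "SYN": bool(off_flags & 0x02),
            "RST": bool(off_flags & 0x04),
            "ACK": bool(off_flags & 0x10),
        },
        "window": window,
    }


# A hand-built SYN segment: port 443 -> 80, seq 1000, data offset 5 words.
sample = struct.pack("!HHIIHHHH", 443, 80, 1000, 0, (5 << 12) | 0x02,
                     65535, 0, 0)
hdr = parse_tcp_header(sample)
```

Twenty bytes, eight fields, a handful of flag bits: that is the whole layer that reads as black magic when you've never had to look underneath the session layer.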
Some observations:
Initial velocity was high. It felt like I was getting things done. This was exciting and satisfying.
The code I wrote was structurally unsound as it ended up having no overall plan. This became more and more evident as the prototype was completed. Yes, it was a prototype, but usually I use prototypes to think about structure. (Choosing the right way to structure a problem is a key part of solving the problem)
My retention of new knowledge was terrible. Ditto for developing any deep understanding of the tools and libraries I was using. I had skipped the entire process by which I usually learn and just had the LLMs tell me solutions. This provided a poor basis for turning the prototype into production code because I had to catch up on all the thinking that should have been part of producing the prototype. And I had to sit down and read the documentation.
LLMs are only as good as your prompts. You actually need to know quite a bit in order to formulate good prompts. Being a fairly experienced programmer (and also having some knowledge of LLMs and a former career in web search), I had significant advantages novices do not have.
Now imagine a novice who lacks the experience to see the shortcomings of their code (hey, it works doesn't it!?) and the ability to introspect and think about why they failed.
Half of the time the LLM would lead me astray and pursue paths that resulted in poor solutions when better solutions would have been more obvious if I'd just read documentation. It is easy to become focused on exhausting paths that the LLM sent you down in the first place and poke it to give you something that works.
I have nearly 40 years of programming experience. It scared me how stupid relying too much on a LLM made me. Yes, it got me started quickly and it feels faster. However the inflection point that comes some time into the learning process, where things start to click, didn't materialize. Mostly I was just poking the LLM in various ways to make it spit out solutions to tiny parts of the problem at a time.
I still use LLMs. Every day. But I use them more as a fancy search engine. Sometimes I use them to generate ideas - to try to detect blind spots (solutions I don't know about, alternative approaches etc). Having been burnt by LLMs hallucinating, I consistently ask them to list their sources and then go and look at those.
LLMs are *NOT* mentors. A mentor isn't someone who does the thinking and the work for you. A mentor is also not an alternative to reference material or the means by which you find information. You're expected to do that yourself. A mentor is not someone who eliminates the need to grind through things. You have to grind through things to learn. There is no alternative if you are going to learn.
A mentor is someone who uses their experience and insight to help your personal growth.
Whole courses are built around forensically dissecting every error in major systems. Entire advanced fields are written in blood.
You probably don't hear about it often because the analysis is too dense and technical to go viral.
At the same time, there's a serious cultural problem: technical expertise is often missing from positions of authority, and even the experts we do have are often too narrow to bridge the levels of complexity modern systems demand.
Answer: we do not know either, but this is the standard response so that companies (or governments, or whoever this concerns) are absolved of any responsibility. In general, it is not possible to know for a specific case until a proper investigation is carried out (which is rarely the case). But most of the time, experience says that it is company policies/procedures that either create circumstances that make such errors more probable, or simply allow these errors to happen too easily due to a lack of guardrails. And usually it is due to a push for "shipping products/features fast" or similar, with little concern for anything else.
It could be a different case here, but seeing it is about Oracle, and having in mind that Oracle is really bad at taking accountability for anything going wrong, I doubt it (very recently they were denying a hack of their systems and the leak of data of companies that use them until the very last moment).
Sometimes people make mistakes, sometimes people are incompetent, sometimes managers suck, sometimes it's a multi-layered issue.
It is not even about being technical. Have a person put data into a spreadsheet and you can get so many errors if the procedure for doing that is bad.
Look, I've been an executive and a management consultant for a long time now (started as a sysadmin and developer), and it's quite often the case that everything was late (decisions, budgets, approvals, every other dependency) but for some reason it's OK that the planned 4 months of development is now compressed into 2.5 months.
I have been involved to some degree or another in probably close to 300 moderately complex to highly complex tech projects over the last 30 years. That's an extremely conservative number.
And the example I describe above is probably true for 85% of them.
I would regularly write massive update/insert statements on production DBs to fix issues.
So, yeah, I'd imagine this was the engineer's fault.
Also - Cerner software in general allowed hospitals to freely and completely fuck up their own architecture. Sometimes irreparably.
If anyone has details about how this happened, I'd love to hear them.
Diamond might actually be better: low surface energy means a low coefficient of friction, so it would be much easier to clean. It would still suck the heat right out of your cheeks, though.
Realistically, porcelain or other ceramics are probably the ideal material.
https://www.totousa.com/washlet
https://www.guggenheim.org/exhibition/maurizio-cattelan-amer...
Sure, Postgres beats OracleDB, but Postgres doesn't integrate as well with Oracle Fusion and you need to migrate the code yourself. They're like SAP: they're big enough that you can make a career just out of configuring their software packages and make good money while doing so.
It's expensive and certainly not the best, but it's reasonably stable and has a huge company backing it. Oracle won't disappear any time soon, and they're not as likely as Google or Microsoft to shut down their services with a few years' notice.
In some countries, Oracle is also very good at doing what Google and Microsoft are doing to students. The Brazilian programmers I've spoken to specifically learned OracleDB when they were taught relational databases. They learned to program in Java, and I'm sure Oracle also sponsored other parts of the professional tooling they got to use for free. Microsoft, on the other hand, didn't seem as generous towards their educational facilities (no free MS tooling for their schools like they offer over here). If all you know is Photoshop/Windows/Maya/OracleDB/iOS, you're going to look for jobs where you can use Photoshop/Windows/Maya/OracleDB/iOS, and employers looking for cheap juniors will need to offer Photoshop/Windows/Maya/OracleDB/iOS to make the best use of them.
Are we sure? I'm by no means a DBA, but DBA at our company (who is freaking smart btw) said if money wasn't an issue, he would actually go with OracleDB.
Again, I haven't worked with OracleDB at all, and my postgres knowledge is limited, but judging without having experience with both systems isn't fair to either DB, IMO.
I work with engineers and technical managers with 25+ years of experience, building and maintaining serious business software 'you could run a country with'. People here build React web apps, do scientific research, or work for a SaaS provider - a completely different view than building highly complex, regulated, mission-critical software that's supposed to run for decades and be supported at this level.
A more useful line of conversation might be discussing the vastly different requirements and environments (both physical and bureaucratic) that span our industry. Right now I'm a one man dev team slinging multiple products as fast as I can, trying to find revenue as the runway burns up. It would be silly to think everyone is in my same position with my same tradeoffs and I don't expect that to be particularly controversial.
If you have some good insight about when Oracle products are particularly well suited to the task I think many folks would love to read and discuss it. If you just want to act like you're the only one taking your job seriously then I suggest you just save your keystrokes and everybody's time.
My employer probably spends more money on databases for our learning management system than we do for one of our main customer facing apps with thousands of concurrent users. It’s literally a tally of training courses.
Postgres won't even let you force an execution plan and ignores hints (yes, there is an extension for that), so your optimized query can at some point just 10x in execution time and increase load in production when the plan changes.
In Oracle, I am told, you can prioritize certain queries at certain times of day - it's crazy what it can do. Yes, it's slow and expensive. If you have money to throw at the problem, it's fast and it solves your problem, whatever the scale. Their Exadata cluster, for example, is a wickedly fast storage layer pre-filtering the data it sends to the database.
Of course, I despise their business practices - especially the abuse of customers via audits. As a database, it absolutely has its place regardless of lobbying, corruption, and whatever else they are doing.
Finally, an actual technical argument. I agree that PostgreSQL's absolute insistence on trusting the query optimiser to Do The Right Thing is weird and annoying (despite being sound general strategy). It even seems to contradict its own general spirit of being on the whole extremely customisable -- you can make your own data types, operators, indexing data structures, complete scripting language plugins... but not, ya know, a way to 100% guarantee that this here query will use this here execution plan.
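The "plan flips in production" failure mode is easy to demonstrate with any SQL engine that exposes its plans. A minimal sketch using SQLite as a stand-in (it ships with Python; the Postgres equivalent is EXPLAIN, and the extension alluded to above for pinning plans is pg_hint_plan). The table and column names are made up for the example:

```python
import sqlite3

# The same query can get a completely different plan depending on what
# indexes (and statistics) the planner sees at the moment it runs --
# which is exactly why an unpinnable plan can flip under you.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, i % 100) for i in range(1000)])


def plan(sql: str) -> str:
    """Return the planner's chosen strategy as one string of detail text."""
    steps = con.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(step[-1] for step in steps)  # last column is the detail


query = "SELECT * FROM orders WHERE customer = 42"
before = plan(query)  # no index exists yet: a full table scan
con.execute("CREATE INDEX idx_customer ON orders(customer)")
after = plan(query)   # same query text, now an index search
```

Nothing in the query changed between the two `plan()` calls; only the planner's environment did. Postgres makes that flip just as silently, and without an extension you cannot tell it "keep the old plan".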
Oracle has a relatively big presence here, and there's a comfortable "mates" system that runs Australia (soft bribery).
I was taught C# in uni for very similar reasons except the entire uni ran on Windows and the Microsoft platform, which made doing assignments on Linux rather inconvenient. With the status of dotnet core, I'd say Java finally has a good competitor when it comes to teaching OOP languages.
And after a few years you find yourself in a situation when you already paid for Oracle so much, integrated it so deeply, so switching to any alternative is a massive pain and in most cases it’s safer and easier to keep paying Oracle.
Theory: A society collectively buys Oracle when they no longer view armed revolution as acceptable.
Luckily more and more customers switched to Postgres and I no longer have to deal with it.
COBOL is in the same category. When invented, it was the absolute easiest programming language to learn and use, so of course it gained popularity.
It then turned out to be rather good at what it did, along with running on hardware that "never fails", so most places didn't even think about replacing it until sometime in the 90s.
Also keep in mind that the reason companies are migrating away from COBOL is not due to the language as much as due to young people not taking an interest in COBOL and Mainframes, making it hard to recruit new talent.
Even then, a migration away from a typical mainframe is a huge task. In most cases you're talking 50-60 years of "legacy" (still running, still working, still updated) code, and replacing a system that has evolved for half a century is not an easy undertaking, at least not if you plan on getting it 100% right the first time.
It is more like young people not wanting to:
- throw their careers out of the window by pigeonholing themselves into zombieware tech
- experience high levels of stress trying to debug code older than their parents, writing code that can't be unit tested and pushing said code to production
Isn't that just "move fast" these days?