> "I didn't thought this thing about computers would go too far."
I almost didn't major in Computer Science because in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software. People don't remember it now, but the media and the culture were utterly hostile towards us, and loved to say our jobs were going to India, that everything there was to know about Computer Science could be studied in railyard switching, in existing abstract math textbooks, etc.
By a combination of luck, and my dad's insistence, I ended up at Carnegie Mellon, and while I was there, I saw what folks at Google were doing, and I thought to myself, no, this stuff is hard, and this is just going to be the beginning.
> "It was way too tedious to do. You'd spend hours getting the cards just right. We used to put them in a shoebox and mark them with a pen in case we dropped them on the way to the lab. Then you'd wait until the next day to get your results. If you had a mistake you'd repeat the whole process"
Even what came after that, e.g. in C / C++, was considerably more tedious than what we do today. Folks sometimes had to do objdumps of compiled binaries to debug what was going on. We had to get coredumps, load them up, and try to determine what memory error had caused the crash (this is an entire class of problems that doesn't exist today). You used to legit need that CS degree to code in your day-to-day, because you had to understand the function stack, the network stack, basic syscalls like wait and poll, etc.
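To give a flavor of it, here's a minimal, made-up sketch (all names invented for illustration) of the kind of memory bug that used to send you spelunking through coredumps: the program dies far from the actual mistake, so the backtrace points at innocent-looking code.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* A classic use-after-free: the object is freed in one place
     * and still used much later, somewhere else entirely. */
    typedef struct {
        char *name;
    } record;

    static record *make_record(const char *name) {
        record *r = malloc(sizeof *r);
        r->name = strdup(name);
        return r;
    }

    int main(void) {
        record *r = make_record("widget");
        free(r->name);
        free(r);                 /* bug: r is dangling from here on */
        /* ...imagine hundreds of lines, in some other module... */
        printf("%s\n", r->name); /* may crash, may print garbage */
        return 0;
    }

The workflow was: enable coredumps (ulimit -c unlimited), run until it crashed, load the core into a debugger (gdb ./prog core), walk the stack with bt, and stare at raw memory until you worked out what had been freed out from under you. Tools like Valgrind and AddressSanitizer have since turned most of that archaeology into a single readable error message.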
It was a lot of work for relatively little product, and I think software is paid more today in part because of 1. faster processing speeds, 2. better tooling, automation, and higher-level programming languages – all enabled in part by cheaper / faster CPUs (e.g. people don't have to care about how slow Python is – you can optimize it after you find product-market fit), and 3. a better understanding, at all levels of management, of how software should be developed.
> I almost didn't major in Computer Science because in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software. People don't remember it now, but the media and the culture were utterly hostile towards us, and loved to say our jobs were going to India, that everything there was to know about Computer Science could be studied in railyard switching, in existing abstract math textbooks, etc
I'm glad I'm not the only one who remembers this - whenever I try to explain it to someone they look at me like I'm crazy. In the late 90s and even early 2000s, the common wisdom among guidance counselors and even local recruiters was that programming and software design were a dead end in the U.S. I remember one article literally said "the bud is off the blossom". I wound up majoring in electrical engineering instead of computer science as a result.
It all worked out in the end, but not following my instincts at the time is one of my few regrets.
It was hard to figure out at the turn of the century, when the career fair was literally cut in half after the dot-com bust. Although websites had been around for years, web apps were still pretty clunky and it felt like the world of internet-based possibilities still had a long way to go. I decided to try doing application development for pay because it seemed interesting and I figured I could easily switch to something else down the road. Plenty of relatives and acquaintances did inform me that my job was going to be outsourced abroad, though. :) And things looked dire again with the financial crisis, but a few years after that, while recruiting at my alma mater, I was shocked to discover that CS had become the most popular major, whereas it was one of the smallest when I was studying it! So, lots of predicting that turned out differently...
Yeah, that's why I don't take re-kindling of the "it'll get offshored any day now" panic post-Covid that seriously. Time zones haven't gone away. The communication-based hard parts of software development haven't gone away. The way that delivering what someone asks for usually leads to them asking for more things, not fewer, hasn't gone away.
Yes, this is one reason I am personally really sensitive when various people say how privileged I was to get into computers and that we somehow got all this encouragement unlike young women, etc.
In the 80s we were mocked and called nerds for being interested in computers, and both before and after the dot-com era, people thought this was a dead-end career.
Yes. Even as the internet started to become a thing in 1994-1995 when I was in middle school, I'd reckon less than half of my class had a computer at home - and fewer still of them would ever want to mention it.
OT, but when I search for "the bud is off the blossom" the only references I get from Google are 2 links to Hacker News comments... There are 0 in Bing for that phrase. Never heard it before, ever.
In the early days computer programming was considered a clerical job one learned in trade schools. I think people looked down on it partly because many of the early programmers were female, beneath the dignity of a male profession.
It took my alma mater MIT until 2018 to recognize software as worthy of a department in itself (after a huge financial donation). Before then it was a stepchild of Electrical Engineering. This is kind of ironic, because most of my classmates and I ended up writing software for money, though almost none of us majored in that field.
> In the early days computer programming was considered a clerical job one learned in trade schools.
That's because in those days, the term "programming" didn't mean "software development", it referred to data entry. It actually was clerical work, comparable to typing a dictation on a typewriter. Only later, when user interface devices (keyboards, displays) considerably improved and it became more efficient to unify those tasks in one person, did "programming" and "software development" start to become synonymous.
It has nothing to do with "dignity of a male profession", or oppression of women, just a misunderstanding of a shift in the meaning of words.
> In the late 90s and even early 2000s the common wisdom with guidance counselors and even local recruiters was that programming and software design were dead end
The career advice I got as a teenager was that there wasn't any point doing software, as Microsoft had already made it all with Microsoft Office.
My mother talked me out of going to school for programming, and a decade after I graduated high school that’s what I ended up doing anyway, realizing it was going to lead to better prospects.
Universities are always several years behind the curve. At college in the 90s they were still teaching token ring networking despite Ethernet already being commonplace. The same college told me that programmers didn't design any of the code they wrote; they only transcribed code from flow charts.
Just yesterday I was talking to a grad about DevOps. He said the field sounded boring from what he was taught at uni. Then when we discussed it more it turned out his “DevOps” course was actually just teaching them how to be a scrum master and didn’t include a single thing about automation, infrastructure as code, etc.
I also remember just how garbage general publications were with regards to IT. And to be fair they still are now. But there was always a wealth of better information in specialist publications as well as online (particularly by the late 90s).
That may well be true of some universities today. In 1970, they were pretty much the only place you could get hands-on experience with a computer unless you somehow slid into a programming job in the financial industry, or one of the few other areas that actually used them. And they were not behind the curve on the technology, although they tended to have lower-end hardware than industry, because any compute was very expensive. The invoice on a 64K-byte HP3000 in 1972, which on a good day could support half a dozen users actually doing any work, was over $100K. Memory upgrades to 128K ran you about $1/byte installed - maybe $8 in today's money. It was a big deal to be allowed hands-on use of them.
I was talking about the 90s through the modern era, not just the modern era.
And having computers doesn't mean any of the lecturers understand the modern (for that era) trends in computing. More often than not, it's computer clubs rather than course material that hold the really interesting content.
I don’t doubt there will be exceptions to this rule. But for most people I’ve spoken to or read interviews from, this seems to have been the trend.
It definitely is true of local universities. I've met people from the local university who have a master's in machine learning, yet have never heard of Docker.
This is a good thing. Opportunity costs are incredibly important with university educations because students have a limited time to learn.
Why spend the time futzing with a tool like Docker? It's not foundational to machine learning, so learning that tool takes away from time that could be spent learning something more relevant. And the student may or may not use it when they get a job.
"Getting shit to work" is more foundational to machine learning than you would think, and containers help a lot with that. If you want to train models on someone else's machine - and you probably will, for anything big - you need to know a little about how that sort of thing is done today.
And if you want to try two different deep learning frameworks, dependent on different versions of CUDA, and want them to not break each other, God help you if you try that without containers.
It's not that they don't have a "course in docker". I understand that. It's that they haven't even heard of it, so they don't even know where to start to look for solutions to problems like that. I have been through that pain myself.
Containers are just one of so many easy things that make your job so much easier, which I've learned the painful way in 20 years as a developer in (mostly) non-elite companies, where no one else knew them either, because they hadn't been taught at the local universities - because no one there knew them either.
It's highly dependent on school. The Ivies, including "public Ivies" will teach you proper comp sci. A lot of other big schools will do you well also. When it comes to smaller regional universities or junior colleges and community colleges, then it's hit or miss. Your intro CS course may be great if you manage to get an instructor who knows it well themselves and wants their students to know it, or you may get someone who teaches students how to do Microsoft Office without a shred of programming.
I went to RIT in the early 2000s. I remember the CS and CE departments were quite good (although the prevalent Sun workstations were already getting outdated). Somehow I ended up taking 1 elective from the "Management Information Systems" department and the instructor kept mixing up search engines and web browsers. I think I dropped the class shortly thereafter.
I was having to deal with token ring in '96-'97, and have not touched it since. Seems like it went away quite quickly. Cue up someone replying that they're still maintaining a token ring system in 2022... :)
I had to deal with token ring way up until 2001 when even the most die hard nuts had to admit that you could buy a dozen ethernet cards for the cost of a single TR. IIRC the TR people tried to convince us that ATM was the future.
Not quite 2022, but yeah, I was maintaining a token ring based network for a subway at my last gig in 2019. As far as I know, no work is done on it now, but the subway cars using the system are scheduled to run for at least another decade, so another bugfix release of the networking firmware is not entirely out of the question.
Hah, not quite nowadays, but I, too, was dealing with one from around '97-2000'ish. What a pain in the ass. And that was just one network in the building; I also had to deal with 10BASE-T, which was its own nightmare. shudder
I remember taking a graduate level networking course at NYU in the early 1990s. The instructor was an IBM consultant. We studied token ring, FDDI, SNA, HDLC/SDLC and several other commercial products.
One evening, I raised my hand and asked when we were going to study TCP/IP.
He simply quipped, "TCP/IP is not a real networking protocol."
So I wouldn't say that universities are always behind the curve :)
In 2015 or 2016 I was taking the computer architectures class at my local university… the processor they based the whole course upon was the Motorola 68000.
As far as introductory courses go, the older/simpler the processor, the better it is for everyone. My class groused at being taught "old tech" because it was the 68k, but very few of us had done any assembly before; I think most of the class would have failed if we'd started off on amd64.
And why wouldn't they base it on that CPU? If you're trying to learn the basics of shipbuilding, you don't start by going on a deep dive into the construction of an aircraft carrier.
It's a simple chip, with a simple instruction set, that can actually be taught to you in the time allotted over a three-credit class.
The bit on "DevOps" is pretty egregious. There are two key things at stake here.
1. "DevOps" is an absolutely critical part of automation. It's the reason why we can start tech companies with such small engineering staffs compared to 20 years ago. It's as important as all the high-level languages we use. This stuff is the logistics of how software gets deployed. It's the same in business as it is in war. Coding chops are like tactical skill - being able to ambush a tank column. It matters, and you won't have an engineering org without it, but the whole chain of how stuff gets deployed and iterated is what keeps the ammo flowing and the fuel pumping.
2. Universities want to teach stuff that'll still be relevant in 50 years. Given their proclivities, that means stuff like algorithms.
On one hand, I think that universities and academics can be somewhat forgiven for their ignorance on this matter. In fact I think we ourselves don't know what's going to be needed in our field in ten, twenty, thirty years. If the folks in industry didn't predict infrastructure-as-code 20 years ago, then the universities couldn't have taught it.
But what I know now is that:
- After all these years, no one is getting rid of shell scripting.
- Old school (i.e. 2nd generation) config management still has its place in many companies. Ansible is great for provisioning an AMI, if you need one, but if you need static infrastructure, Puppet and Chef are actually better because they track state, which lets you better manage config drift.
- k8s may be hot and all, but a lot of the underlying "ops" stuff still translates. You average resource usage over pods instead of hosts, for example.
- Put together, there is an "instinct" for ops that is not unlike the "instinct" people learn for math, algorithms, and code. They are completely separate and an engineering org needs both. I think that universities don't "get" ops because computer science is more like math, whereas ops is more like history.
- On one hand, being stuck in an older ops paradigm is pretty awful – if you missed the transition to infrastructure-as-code, then it may be really, really hard to get out of that rut. But the field itself can be pretty bad with being stuck – it took us forever to give up our own datacenter racks.
- But otherwise, the old knowledge about old tools didn't necessarily just go away, in fact it's oftentimes still quite relevant. Linux internals (e.g. iptables) are still useful.
- When I was at CMU, a lot of folks learned some of that ops instinct in the dorm room, and in the computer clusters. But the universities pretty much made it optional. Looking back, I think this was a mistake. Ops is pretty much entirely transmitted through osmosis, whereas we at least try to teach people to code in official uni classes.
> ...and loved to say our jobs were going to India.
They weren't wrong, though; they just omitted delimiting that assertion.
Back in those dark ages, mainframe jobs were still considered by career "experts" the "adult in the room" jobs of programming. It is hard to convey to people who never studied that era or grew up in that era just how much microprocessor-based computers were considered "not real computing" in vast swathes of the industry. The proprietary Unixes thrived under that lay perception, as a "serious business" microprocessor-based computers market segment.
And the mainframe jobs did, by and large, up and wholesale decamp to India from large chunks of the mainframe account base. Those career experts were right, in a way.
Just not quite in the way they thought. The scope of their claim was too absolute, because they lacked the technical (and business, and financial...) perspective and context to understand why the same wouldn't happen to quite the same extent in sectors outside mainframes - nor did they foresee the explosion of wheel-reinvention of many mainframe tech stacks that would drive the industry forward even to this day and beyond, along with the rapid recombination of new ideas.
I was using objdump and coredumps to debug a kernel crash just last week. Not tedious at all - more like working a difficult puzzle. And very rewarding if you figure it out and fix the crash.
objdump and coredumps today are way less tedious than getting a compiler error the next day (if not a few days out!).
At least with punched cards, if you kept them sorted (line numbers in front, à la BASIC, really helped with that), you could easily edit in place - just replace the one card that was incorrect, because each card = one line.
TECO (which begat EMACS) started out because paper tape, which was the preferred storage on DEC machines, was harder to edit in place than card stacks. Instead of retyping the whole program, you'd summarize your changes (which you had dutifully copied onto a fanfold greenbar printout - or suffered) into a few complex commands, then used the resulting four tapes (TECO load tape, TECO commands tape, the incorrect program, and a fresh unpunched tape) to get a corrected one.
For maximum efficiency, the OS/360 team had to work around the clock - the programmers would write their changes on first shift, then teams would prepare the cards and submit them for compilation, the night shift reprinted modified documentation, and when you arrived at work you'd have fresh documentation and the results of your compile (unless you had the luck to work on-line that day, with more immediate feedback).
You say it like negative articles about Comp Sci / applied programming / really any tech company from the NYT are a thing of the past. Ironically, articles denouncing Tech are easy, routine clickbait for them now.
> I almost didn't major in Computer Science because in the late 90s
You missed, by a few years at least, the opportunity to study and earn a degree that is no longer available from CMU, the B.S. in Cognitive Linguistics. I got an early acceptance from CMU in late 1988, my first choice of education because I wanted that degree in particular, but I could not afford CMU tuition let alone housing, and I was ineligible for financial aid. I studied CS at Virginia Tech at about a tenth the cost and never regretted it. Though I never met him, Allen Briggs[1] was an underclassman there while I was an upperclassman. He ported NetBSD to 68k Macs while still an undergraduate at Virginia Tech, which always impressed me. A/UX licenses were not cheap, and MacBSD was free.
The backdoor into CMU back then was and maybe still is Pitt. Pitt students had the privilege of signing up for any CMU course and it just meant a slightly longer walk to class.
The Cognitive Linguistics degree at CMU in the 90s was an interdisciplinary combination of cognitive science, neurology, computer science, and linguistics, and disappeared when the faculty member who created and sponsored it passed away in the mid-1990s. While Pitt is a quality university, I don't think they offer degrees from CMU. Pitt was on my radar and one of the few places I was accepted to, but out-of-state tuition at the time IIRC was $7K/semester - more reasonable than CMU's ~$12K/semester - but I had moved to Virginia the year before, and Virginia Tech's in-state tuition was about $2K/semester with housing (though I was required to purchase $4,500 worth of Mac and A/UX license). Today, Virginia Tech's in-state tuition is as much as Pitt's out-of-state tuition was then, which is now about the same as CMU's private tuition was in 1989, and CMU's annual tuition today costs a little more than an Audi Q5 Prestige.
Perhaps I wasn’t clear. CMU allowed Pitt students to register for CMU classes. Your degree would say Pitt on it, but you would have attended the exact same classes as the CMU students. As in sat in the same classroom with the same professors at the same time, doing the same assignments and taking the same tests.
Thank you, that is what I understood you to mean, but if Pitt doesn't offer the same degree, how could one graduate accumulating credits for a degree that doesn't exist? While many universities allow non-students to audit courses, and one could take every required course of a subject this way, without actually being awarded the degree one cannot claim the degree. Also, as I explained, I was out of state, making Pitt tuition expensive. CMU does grant its FT employees and their children free tuition after a token number of years of employment, but of course, even a qualified and experienced HS graduate without an undergraduate degree would not make it past HR for an interview. And unfortunately, the Cognitive Linguistics degree only existed for a very short window, about 5-6 years.

Personally, my only option to attend CMU was to take on about $90K worth of college loans, or $60K worth of loans to attend Pitt, but I would have sooner accepted the appointment offered me to Annapolis, an even more selective university than CMU that costs nothing but a commitment of 5 extra years of military service. What I did instead was study CS at Virginia Tech and graduate only $10K in debt, which was not difficult to get out from under. And though I did not study any linguistics there, I did exhaust my curiosity in cognitive science and neurology via an elective in philosophy of mind. CS was itself 60 credits of CS and Math with a built-in Math minor, and was tricky enough to complete without idk how many other credits in proper neurology and linguistics that I missed out on at CMU or Pitt - though fascinating, each is a considerably complicated subject in its own right.
Very interesting. I am from that era, teaching myself to program starting in 1983 (which I thought was quite possibly too late to catch the microcomputer gold rush ;). I was self-taught and learned from popular computer magazines and well-written, carefully selected books. But now that you mention it I remember looking at course catalogs from good schools and being shocked at how retrograde it all was. Those guys at the universities totally did not get microcomputers for years after they should have.
I majored in CS in the late 90s and this wasn't my experience at all. The Netscape IPO happened in 1995, followed by 5 years of the dot-com gold rush. Computers flew off the shelves, and everyone wanted to get online.
The dot-com crash happened later, in 2001, but if we are talking about the late 90s, then I'd say it was a period of huge energy in the CS field; tech companies were hiring as fast as they could and jobs were plentiful all around.
We're probably about the same age. I decided against comp sci at the turn of the century because of exactly what was being said. The dot-com bust had just happened, and if the media was to be believed, programmers were taking jobs flipping burgers and there were enough programmers without jobs to cover the world's programming needs for the next 50 years.
I wound up going to school for economics and then later found my way into the IT world by circumstance.
When I started uni in 2004 it was still like this. I was kind of ashamed at parties to say what I was studying, so as not to come off as too nerdy. I did a double major, and business was the hipper half. Just imagine! The status of developers has changed so much in two decades. Nowadays people are impressed. And even over my career I see the difference: not so many VPNs anymore, and the move to the cloud made everything much easier.
> Even what came after that, e.g. in C / C++ was considerably tedious compared to what we do today. Folks sometimes had to do objdumps of compiled binaries to debug what was going on.
They used to do objdumps. They still do, but they used to too.
> You used to legit need that CS degree in order to code in your day-to-day
And when people today look back with disdain at ugly VB applications and wonder what simpleton, non-programmer, drag-and-dropper built this piece of excrement (that has somehow been running for 17 years without an update and the replacement project that we hired those consultants for ended up 3x over costs and nobody uses it) as opposed to a Real Software Program, there's the reason.
I was in high school in the early 00s and heard the exact same thing, and that was a major reason why I chose not to major in CS! (The other is that my HS programming curriculum and teacher were inadequate, but at the time I was convinced that I just wasn't wired for programming.) In the end I took the long way around and ended up in the field as a self-taught programmer.
There was also a bad programmer job-market crash in the 80s that changed the market a lot by the 90s. In fact, this was about when the gender ratio became very skewed (men and women dropped out of programming at equal rates, but the recovery was lopsided).
Our computer science department chair (Ed Lazowska) at the time brought this up as a reason to be wary about department expansion in the mid 90s.
You’ll see a huge drop-off in computer science graduates after a local peak in 1985 (they wouldn’t get back to that level until 2000; note this article also quotes data from Ed Lazowska).
I majored in Computer Science in the late 90s and honestly don't remember any of what you're saying regarding negative/hostile media.
To me it felt like a golden age. The .com bust hadn't happened yet. If you could turn a computer on there were jobs everywhere. The world was starting to get online. Linux was really gaining traction and Slashdot was all time.