Wednesday, August 12, 2020

*Sigh* I guess I'm a technological dinosaur . . .


I started working with computers in the 1970's, as an operator, then as a programmer, then as a systems analyst, project leader and manager.  I remember punched cards (I wrote and ran my first programs using them, and heaven help you if you dropped the deck of cards on your way up the stairs to the computer room!);  JCL, with all its demanding, finicky standards to trip up unwary programmers and sabotage their tests;  System/360 and System/370 mainframes (including upgrading the mainframe's magnetic-core memory - using real magnetic cores strung on wire frames - from one to two megabytes, which everyone thought was wildly extravagant and costly at the time);  minicomputers such as DEC's PDP and VAX machines;  the very first PC's (yes, including the Apple 1 and the first IBM PC);  the use of CICS for online systems;  and so on.

In particular I remember the agonizing process of developing programs from scratch.  One would diagram the specification in the form of a flowchart, which one would have to "defend" in a meeting with the senior programmers, who would try to pick apart the logical sequence one had established to accomplish the task(s) concerned.  Once one had passed that step, one coded the program (I used COBOL in business, plus a number of other, more esoteric languages - not excluding profanity - for specialized tasks), compiled it, then ran a series of test data through it to ensure it did what it was supposed to do.  While all this was going on (and it could take months for a big, complex program), changing user requirements and modifications to the specifications kept one busy chasing one's tail (and using even more profanity instead of COBOL!).  When the first artificial-intelligence-based tools came along, later in the 1980's, to help automate the systems design and programming process, they were a godsend.

Things have come a long way since then.  It's bittersweet for me to read about "apps" that don't require much programming knowledge at all.

As a recent MBA graduate, Leytus had plenty of ideas for apps, though he lacked skills in software development, a common barrier to would-be tech entrepreneurs. But then he discovered Bubble, a drag-and-drop builder with a deceptively simple interface. It’s one of several advanced ‘no-code’ tools enabling hundreds of thousands of people without technical backgrounds to create their own apps, effectively eliminating the need to learn a coding language before launching a start-up.

To demonstrate what the tool could do, Leytus relied on the novelist’s adage – ‘show, don’t tell’ – and used Bubble to hack together a fully functional web app he named Not Real Twitter. He gave it a cheeky tagline: “Just like Twitter, but worse… a lot worse.” While it worked like the real thing, his goal wasn’t to give disaffected Twitter users a new home. Leytus was in the early phases of co-founding AirDev, where he today helps start-ups and enterprise clients leverage no-code app builders. He wanted to show his prospective clients what he could quickly build without actually writing code himself.

“It was very difficult to explain to somebody without giving them something to look at,” says Leytus. “[Cloning Twitter was] more convincing than me just saying, hey, this can actually make pretty powerful stuff.”

He added an all-caps note on the clone’s homepage addressed to Twitter: “PLEASE DON’T SUE US.” Luckily, they didn’t. He posted about the app on Hacker News, a social news website, and his story quickly became an example of the no-code movement’s potential.

Five years later, Leytus decided to repeat the challenge again, as the 2015 version is “no longer representative of what you can build with no-code technology”. He and the AirDev team have built an updated clone, dubbed Not Real Twitter v2, with a design that looks like modern Twitter. He says it reflects how much tools like Bubble have matured, with improved functionality and greater support for mobile devices.

It may be surprising just how much you can accomplish without knowing an iota of programming language, or writing any code at all. Projects like Leytus’s show the potential for nearly anyone to jump into development – a field that’s currently opaque to those without certain skills. Could no-code development be the future of web-based innovation – and, if so, what does that mean for how we build the ‘next big thing’?

There's more at the link.

When an entire company - no, almost an entire industry - such as Twitter is based on a program (or series of programs) that can be written without any of the hard-learned expertise that we had to master in the "bad old days" . . . that's depressing.  Talk about feeling redundant!

Of course, I haven't been involved in data processing, except as an end user, for many decades;  but I haven't forgotten how it felt to be a much-sought-after specialist in a top-end field.  Things sure have changed . . .

Peter

28 comments:

  1. It's a Brave New World we live in...

  2. Standing on the shoulders of giants. There is a quote in Jurassic Park (the Book), by character Ian Malcolm, in which he observes that, "You stood on the shoulders of geniuses to accomplish something as fast as you could, and before you even knew what you had, you patented it, and packaged it, and slapped it on a plastic lunchbox, and now you wanna sell it." Seems apt here.

  3. So security is actually completely unthought-of....

  4. This November will mark 35 years since I got my first full-time job as a programmer, and I've worked as one ever since. I've made my career supporting legacy systems: Cobol, PL/I, SAS, JCL, Adabas/Natural. It's safe to say I've forgotten more about programming than most people will ever know.

    Code generators are nothing new; they're just taking advantage of the increase in speed/power/memory that comes via Moore's Law (computing power doubles every 18 months or so). They're great until something goes wrong; then you need us old folks. Just like a few months ago, when the state of NJ was asking for volunteers who know Cobol to fix the unemployment system because it crashed due to volume at the beginning of the Covid-19 lockdowns.

    There's the story about a factory that ran flawlessly for years, and no one currently working there knew anything about the machinery. One day something broke and no one could fix it, so they brought a guy out of retirement to come tell them what the problem was; he agreed to do so for $10,000. He walked around the factory, finally stopping in front of a piece of machinery, drew an X on it with chalk, and said, "Replace that and it'll be fine." The manager told him they needed an itemized bill for his services, so his bill read:

    - Drawing an X: $5
    - Knowing where to draw the X: $9,995

    Replies
    1. You never know what you have until it's gone. Especially if what you have is an old greybeard in the maintenance or engineering department.

  5. If you want to know who your real friends are:

    1. Party 'till your money runs out.
    2. Start calling people at 3:00 AM to come and pick your worthless, drunken fool a** up at the phone booth outside a notorious gay leather bar.
    3. Drop a large stack of punch cards on your way to the keypunch room.

    Yeah, good old Cobol (Common Business Oriented Language) and JCL (Job Control Language), and the winner of crappiest language of the year for 31 years in a row - RPG II (Report Program Generator, Version 2). There's also a version one and version three.

    So - I'm not exactly completely green, but I was working diligently on my three-year pin when I landed a contract with a company that was circling the drain in quaint, historical Toledo, Ohio. The building was downtown, and they had this nice little restaurant two blocks down the street that actually gave away free food! The parking lot was surrounded by an eight-foot fence topped with razor wire.

    My first assignment involved creating a table in COBOL, which isn't as easy as one might think. There were design problems, and my immediate supervisor was a 14-carat, chrome-plated bitch on a stick, and, well... my program wouldn't compile, but produced a cryptic error message that I couldn't correct. I turned to the three other contractors for help, and they all reassured me that the code looked fine. Then the oldest man said with some skepticism, "You know, back in the old days of punch cards, breadboards and paper tape, you had to define this before that except when the other thing - but that's prior to ANSI-74, when they fixed all that."

    Off to the director of I.S. with a question, "Are we on an ANSI-74 compiler?"

    "Well, no. No, we aren't. Our COBOL compiler is pre-ANSI-74."

    Oh hell yes. Two minutes and one fix later I had a clean compile. Then the thing blew up because the table was too large - bigger than 64K.

  6. "... heaven help you if you dropped the deck of cards on your way up the stairs to the computer room!"

    I started my career in IT by going through college. I started with Basic and my next "high level" languages were COBOL and FORTRAN 77 (or FORTRAN V).

    I dislike COBOL, and how. COBOL's equivalent of the "Hello World" program was a stack of punched cards.

    We quickly learned from more advanced students to get magic markers and draw a diagonal line on one of the long sides and another on one of the short sides. That made sorting a dropped deck of cards back into order a much easier task.

    Mark, I am with you. Once something requires looking under the hood to fix, you just have to know how to really code. As to your story, I've heard it before, and I've also heard it told about Mr. Henry Ford and a genius electrical engineer.

  7. And what you're talking about makes me 'glad' I got my draft notice... :-D

  8. I met ADM Grace Hopper at CCSU after her presentation to our classes. I was still in my working uniform (attending night class after work) and she spotted me in the group and came over to introduce herself, sailor to sailor. What a gracious lady, with brains to boot. I never worked in COBOL after graduation but spent most of my time with LSI-11 assembly-language-level development and troubleshooting, with some Unix stuff thrown in. Yes, I HAVE dropped a shoebox full of Hollerith cards. The trick was to draw a diagonal stripe across the top of the deck to help determine the approximate position of each card in the deck.

  9. I've seen a grown man cry - actually, nearly melt into a puddle - when a gust of wind snatched the stack of punch cards out of his hand. I am smart, sez I. I found the punch cards fit neatly into a box the bank checks came in. I triple rubber-banded that box.

    Some years ago I realized that all my professional knowledge added up to diddly squat. I went through Kubler-Ross' five stages. I am better now. You will be, too.

  10. The history of programming is a history of increasing abstraction. In the beginning we worked with relay logic, building our circuits. These were replaced by discrete transistors, which in turn were replaced by logic IC chips. Then came the microprocessor, and our logic could be in code.

    At first we used machine language to represent the 0's and 1's. Assembler let us start to think more like humans than machines, but only by a little. Higher-level languages let us think even more like humans. They were not as efficient, but improvements in hardware speed and capacity made efficiency less important.

    Tools like bubble.io still involve programming, but let us think even more like a human and much less like a machine.

  11. When I was in college, the computer lab was a chilly darkened room with a series of terminals. Beside each terminal was a red switch and a warning placard saying DO NOT TOUCH THIS SWITCH. Why that switch was there in the first place I'll never know.

    Of course, one day some idiot flipped the switch. Immediately, every terminal went dark. A few hours later it was found out that, just as immediately, every terminal at about five colleges in the area had also gone dark. The mainframe at some university miles away crashed as well. I forget how long it took to repair the damage, but I know that class was cancelled until the next year.

  12. Current software engineer here. No-code, drag-and-drop style is great for prototyping and teaching, but it rarely scales well in my experience.

  13. My brother knows more about the Ada language than any mortal human should know. He was on the teams that provided 2 compilers and 3 Compiler Validators for DOD (covering Army, AF, and Navy).
    He was last seen building a little toy called Athena for Raytheon and building other strange things we don't talk about over the Christmas table when we get together.

    IIRC his terminal degree was in Compiler Design...


    THE farthest I ever got was playing in IMS-DB/DC...

    Night Driver

  14. Ha, talking about JCL made me smile. I ran computer operations for an electric utility in the early 80's. My degree was in Applied Data Processing. My, the world has changed since we walked uphill both ways to school...

  15. I remember well the punch cards, programming in Fortran IV. And the computer room, where a hard drive was the size of a dishwasher. Fortunately, that's as far into programming as I went, thanks but no thanks.
    I ended up in medical technology, and my later years of that troubleshooting robotics. With a knack for mechanics, I honed the skill and came close to doing anything the service engineers did. Good times, and I miss it not at all.

  16. Thanks for the trip down memory lane, Peter. I also started off with punch cards, but my field of study was chemistry so we used Fortran with a big CDC (no, not that CDC, this one was Control Data Corp) mainframe. Submit card deck job, wait for it to be run, go pick up printout and card deck, curse at the missing punctuation mark, fix that card, repeat until it worked. Data had to be input on the cards as well. Out of school worked on PDP-11's, booted them with mylar punch tape, at least the BASIC programs were entered via a terminal. Instrument data was acquired via RS-232. Then used assembly language on another PDP-11 with 64K core memory and a drum (not disk) memory storage to control a chemical factory using instrument and sensor data to adjust the processing in real time. Then went to work for HP using their minicomputers to acquire data from lab instruments - big step up using disk drives with 10 MBytes memory - 5MB on a fixed disk, 5MB on a removable platter (like a big floppy disk). Yikes that was a long time ago - sigh...

  17. I work for the Government, and the system I use every day is one we got in the 80's and are still using, so I use COBOL every day. I'm not a programmer; I just need it to do my necessary actions in the system.

  18. I wrote a lot of JCL back in the day. IBM 3090 systems. My day job was a tape vault tech but I made money on the side sparing users the JCL chore.

  19. FORTRAN? Luxury. I started with magnetic cards on a TI59 and PORTRAN.

  20. COBOL, RPG and BAL here. Good times. I worked on a weird system where a very skilled but misguided programming manager decided to stick with Assembler when most of the industry went to higher-level languages. They wrote a huge number of macros to turn BAL into something approximating a high-level language, complete with commands for structured programming and, unfortunately, their own disk access methods and sorting routines. This was on an IBM 4381 theoretically running - thinking back 35 years - DOS-VE. I say theoretically because the operating system couldn't even see files they created on the disks. You had to use their macros. They even had their own terminal handling system, which was horrible and primitive and slow. It went in a circle polling each terminal looking for keystrokes. That worked when there were a dozen terminals. When there were 70, not so much.

    Then one day the manager who set this all up got in a dispute with upper management and left along with most of his programmers. I was part of the team who came in to keep the system running until they could transition to something more manageable. Talk about a steep learning curve.

  21. By the way, I didn’t intend to be unknown. Not sure how to set up a screen name on blogger. I usually go by DaleCoz.

  22. Sorry to comment a second time, but I realized I HAVE to share this story:

    Back in the 1990s I worked for a Very Large Insurance Company. We had a report that came out which had been around forever, and one of the fields started overflowing (because when it was written, no one imagined one person could have over $100K in health insurance payouts), so I got the job of expanding the field. Well, the program was absolutely incomprehensible (and completely uncommented) BAL code (I knew Assembler, but hadn't used it since college about ten years earlier). The program read in a large-ish binary file, and I couldn't make heads or tails of what it was doing.

    I finally found a different copy of the program in another library, and this one actually had comments. The assembler program was an IBM 1108 (IIRC) emulator (state of the art mid-1950s), the binary file was the object code for a program written on that platform, and we were just emulating to produce the report.

    I had no desire to learn (another) language or platform designed long before I was born in 1963, so we decided to just re-write the report in Ramis (am I the only person here who's used Ramis?). The biggest issue was that the users of the report didn't know what some of the fields were. They used them, but had no idea how they were calculated.

  23. Ah yes, memories. In college, we wrote programs in FORTRAN II, punched the cards, submitted the card deck, got the printout, debugged the results, repunched the offending card(s), rinse and repeat. And yes, there was always some poor schlub who dropped his box of cards. That class convinced me that while I COULD write programs, I most certainly didn't want to.

  24. Wow, this is the most comments I've seen on here in a minute.

    Yeah, agree with the comment about NO CODE not necessarily scaling well.

    Sure, the guy made a Twitter clone as a demonstration. You think that's going to support 145 million transactions a day?

    I would say NO CODE is good for small stuff, or for prototyping large stuff. But once you need reliability, security and high throughput, you need to be able to fine-tune pieces of it.

    I've forgotten more computer languages than they teach today. The only reason I struggle with modern languages is my job doesn't use them on a day to day basis.

  25. FORTRAN was taught to us as freshmen in engineering school (1967). Punch cards and batch runs. A "Do Loop" was a death spiral. A bad control card would kick you out and you had to go to the back of the line. My Co-Op job used BASIC and a tape reader. Much easier, but the tape punch/reader was noisy. Bought a personal calculator in 1975.

    I still have my slide rule and know how to use it.

  26. I first learned programming on a Commodore 64 in 1985 while I was still on active duty (BASIC and ASSEMBLY). Lots of people think it was just a simple gaming machine. I had a CAD program for mine, a voice recognition module, a light pen, and lots of practical software, in addition to the games. I started college full time in 1993. We wrote FORTRAN, COBOL, CICS, ASSEMBLY and JCL on the mainframe. No punch cards; we used green-screen dumb terminals in the lab. We did RPG on a midi (midrange machine), then C, C++ and Visual BASIC on a PC. I fooled around a bit with LISP, Ada, Pascal and Delphi. My first job was coding VB3 and SQL. That led to VB6, C# and most recently HTML, CSS, Javascript, LINQ, and ASP. It's been a rapidly changing career. I retired this year.

  27. FYI all you COBOL jocks, you can make large $$ through contracting firms these days if you're willing to either relocate OR spend 5 days a week in a motel.

    IBM mainframe, of course, MVS preferred but DOS-VSE OK, database Cullinet, Adabas, IMS, or VSAM files (most likely) with CICS.

    You're gold. Buy a really high-horsepower wheelchair and/or genuine silver-headed cane!

