I started working with computers in the 1970s, as an operator, then as a programmer, then as a systems analyst, project leader and manager. I remember punched cards (I wrote and ran my first programs using them, and heaven help you if you dropped the deck of cards on your way up the stairs to the computer room!); JCL, with all its demanding, finicky standards to trip up unwary programmers and sabotage their tests; System/360 and System/370 mainframes (including upgrading the mainframe's magnetic-core memory - using real magnetic cores strung on wire frames - from one to two megabytes, which everyone thought was wildly extravagant and costly at the time); minicomputers such as DEC's PDP and VAX machines; the very first PCs (yes, including the Apple I and the first IBM PC); the use of CICS for online systems; and so on.
In particular I remember the agonizing process of developing programs from scratch. One would diagram the specification in the form of a flowchart, which one would have to "defend" in a meeting with the senior programmers, who would try to pick apart the logical sequence one had established to accomplish the task(s) concerned. Once one had passed that step, one coded the program (I used COBOL in business, plus a number of other, more esoteric languages - not excluding profanity - for specialized tasks), compiled it, then ran a series of test data through it to ensure it did what it was supposed to do. While all this was going on (and it could take months for a big, complex program), changing user requirements and modifications to the specifications kept one busy chasing one's tail (and using even more profanity instead of COBOL!). When the first artificial-intelligence-based tools came along in the 1980s to help automate the systems design and programming process, they were a godsend.
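For readers who never saw the language, here's roughly what a trivial COBOL routine of that era looked like - a from-memory sketch with invented data names, not a real program of mine:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. GROSSPAY.
      * A TOY PAYROLL CALCULATION, OF THE SORT A TRAINEE MIGHT WRITE.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-HOURS        PIC 9(3)V99 VALUE 42.50.
       01  WS-RATE         PIC 9(3)V99 VALUE 7.25.
       01  WS-GROSS-PAY    PIC 9(5)V99 VALUE ZERO.
       01  WS-GROSS-EDIT   PIC ZZ,ZZ9.99.
       PROCEDURE DIVISION.
      *    MULTIPLY HOURS BY RATE, THEN FORMAT FOR PRINTING.
           COMPUTE WS-GROSS-PAY = WS-HOURS * WS-RATE.
           MOVE WS-GROSS-PAY TO WS-GROSS-EDIT.
           DISPLAY "GROSS PAY: " WS-GROSS-EDIT.
           STOP RUN.

Every change, however small, meant re-punching cards, recompiling, and re-running the test data - part of why a big program could take months to finish.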
Things have come a long way since then. It's bittersweet for me to read about "apps" that don't require much programming knowledge at all.
As a recent MBA graduate, Leytus had plenty of ideas for apps, though he lacked skills in software development, a common barrier to would-be tech entrepreneurs. But then he discovered Bubble, a drag-and-drop builder with a deceptively simple interface. It’s one of several advanced ‘no-code’ tools enabling hundreds of thousands of people without technical backgrounds to create their own apps, effectively eliminating the need to learn a coding language before launching a start-up.
To demonstrate what the tool could do, Leytus relied on the novelist’s adage – ‘show, don’t tell’ – and used Bubble to hack together a fully functional web app he named Not Real Twitter. He gave it a cheeky tagline: “Just like Twitter, but worse… a lot worse.” While it worked like the real thing, his goal wasn’t to give disaffected Twitter users a new home. Leytus was in the early phases of co-founding AirDev, where today he helps start-ups and enterprise clients leverage no-code app builders. He wanted to show his prospective clients what he could quickly build without actually writing code himself.
“It was very difficult to explain to somebody without giving them something to look at,” says Leytus. “[Cloning Twitter was] more convincing than me just saying, hey, this can actually make pretty powerful stuff.”
He added an all-caps note on the clone’s homepage addressed to Twitter: “PLEASE DON’T SUE US.” Luckily, they didn’t. He posted about the app on Hacker News, a social news website, and his story quickly became an example of the no-code movement’s potential.
Five years later, Leytus decided to repeat the challenge, as the 2015 version is “no longer representative of what you can build with no-code technology”. He and the AirDev team have built an updated clone, dubbed Not Real Twitter v2, with a design that looks like modern Twitter. He says it reflects how much tools like Bubble have matured, with improved functionality and greater support for mobile devices.
It may be surprising just how much you can accomplish without knowing an iota of a programming language, or writing any code at all. Projects like Leytus’s show the potential for nearly anyone to jump into development – a field that’s currently opaque to those without certain skills. Could no-code development be the future of web-based innovation – and, if so, what does that mean for how we build the ‘next big thing’?
There's more at the link.
When an entire company - no, almost an entire industry - such as Twitter can be cloned with a program (or series of programs) written without any of the hard-learned expertise that we had to master in the "bad old days" . . . that's depressing. Talk about feeling redundant!
Of course, I haven't been involved in data processing, except as an end user, for many decades; but I haven't forgotten how it felt to be a much-sought-after specialist in a top-end field. Things sure have changed . . .