Artful Computing

I spent a lot of May 2021 acting as a Section Leader in Stanford University's "CodeInPlace" project, which aimed to teach Python remotely to a very large number of students world-wide (about 12,000), staffed largely by volunteers. It was a great experience and I am posting here some of the general advice I gave to my students.

I saw a number of posts on the CodeInPlace forums from people who are thinking about an alternative career path in computing. I wish you well, because there are, indeed, a great many opportunities: the world is accumulating data faster than we know how to process it, and computation increasingly pervades almost every aspect of our lives. As AI techniques become more prevalent, that will only become more apparent. Nevertheless, while software developers are often much in demand and frequently well paid because they are essential to an enterprise, do not expect that any of your non-technical managers will have much understanding of your skills, or visibility of your creations: your beautiful "castles in the air" will be fully appreciated only by you and close colleagues, and you may also find that your contributions to projects are not fully credited. Full appreciation only comes after you leave, and something stops working! (This is a particular problem in academia: promotions go to those who publish the analysis of experimental data, while the vital software tool-makers who facilitate their work often do not get joint authorship or even a thank-you credit. It is one of the reasons some of the software written by academics may be poorly finished by professional standards: there is no percentage in polishing tools that will then be used by your competitors to burnish their own publication records.)

Here, nevertheless, are some, perhaps slightly cynical, remarks from a person at the end of a rewarding career, which I make so that you can go forward with your eyes open and some realistic expectations.

Like you, I did not start with a formal computing education (very few did when I graduated in 1973). In the early days I was mostly self-taught (we had to be - there was no one to teach us). I was, however, lucky enough to be at Cambridge, which was pioneering the application of computers in science. In fact, in my research field, Radio Astronomy, observations were constrained by the availability of computing power to process data and the field was also driving algorithm development. (Not much changes: it is exactly the same today - but on a very much larger scale.)

Computational physics has, since then, earned me a generous salary, and given me many interesting and challenging opportunities. Even more now than 40 years ago, graduates with numerate backgrounds (e.g. maths, physics and engineering) are still often recruited into computing at a high rate. In the UK, six months after graduating, about 20% of mathematicians and nearly 30% of physicists claim to have employment in IT and computing. Many more, like me, would not claim computing as their primary role: they just regard programming as a tool which supports their work as professional scientists and engineers. In contrast, entry to computing from the humanities tends to be down in the low percentages, but I suspect this is more a matter of inclination, since the capability for analytical thinking is not by any means the exclusive province of the sciences.

It is not just a field for fresh graduates. I once mentored a mid-career teacher who had had enough of the classroom and needed an industrial placement for his MSc dissertation project. I was impressed by his systematic attitude to acquiring new skills (his teaching experience clearly worked both ways) and he subsequently found a job with an academic publisher where his attitude, his teaching background and the recent computing skills he had acquired made him the ideal candidate. The moral here is that previous relevant life experience combined with computing can be very attractive to an employer. Given a reasonable amount of aptitude, employers know they can train to improve skills and knowledge, but attitude, judgement, motivation and general “people” skills take much longer to develop.

I often found it rewarding to work with less experienced people who accepted that they had much yet to learn, and that being an “apprentice” to a “master” was the way to their own mastery. They treated every task as an opportunity to raise their standard of execution. (It is one of the reasons I signed on as a "CodeInPlace" Section Leader.)

In contrast, I disliked working with the “know-it-alls” who did not see the point of acquiring new skills, and often proved to know rather less than they would have you believe (and indeed they were often fooling themselves, as well as recruiting managers). They always wanted to use the same programming language for every job (which was, of course, the first language that they happened to have encountered ten years previously). 

I was also wary of the contractor intent only on acquiring new badges for his (yes, in my experience very nearly always his!) CV, who would treat your job as an opportunity to learn (at your expense) the latest snake-oil technology before moving on. Advertising themselves as, for example, competent in Python, such people would turn out to have done one course of about the depth of CodeInPlace, with little subsequent practical experience.

The "CodeInPlace" Python course (or any other introductory programming course) was therefore just a first step (albeit a very important first step). Knowing the basics of a programming language does not turn you into a software engineer or a data scientist, any more than learning how to hold a spanner turns you into a mechanical engineer, or learning how to operate a microscope turns you into a scientist.

The next step is developing your “computational thinking” skills: that is, abstracting problem-solving concepts away from particular programming languages. These skills are highly transferable between languages. I did, in fact, once initiate a Python-based project using a team who had no previous experience in the language - but I had used them before and knew and trusted them as very competent software engineers. Within a month they were using Python like the programming experts they were.

Probably the best way to acquire these higher level skills is working with more experienced programmers and observing the way experts tackle difficult challenges: there is no substitute for learning by doing, and no substitute for having your work critically assessed and helped by an acknowledged master.

That is not always possible for those who must work in relative isolation. There are, however, a great many books giving excellent advice, and you can also study the productions of experts in the mass of open-source material available via the web. But choose carefully: it is not all built to the highest standards. The best examples make the chosen solution look both obvious and easy to understand. “Hard to understand” may mean it is particularly deep and intricate, using a sophisticated design approach that needs to be taught, but more often it is just a sign of ambition exceeding competence.

The waves of fashion afflict software production more than other areas of engineering: snake-oil and magic bullets are forever being offered, and languages come in and out of popularity, sometimes for no very good reason (Python currently rides high, for what I think are good reasons). But there is no substitute for actually taking the trouble to really understand what you are trying to achieve: a language lets you express ideas but does not help you decide what to say. Any electronic or mechanical engineer would say exactly the same.

There is, indeed, a point to learning certain programming languages that you may never use in anger, because they may change the way you are able to think about problems: you are fitted out with new mental tools. Anyone who has completed a course in LISP or PROLOG will always be aware of the advantages of recursive algorithms and “declarative” programming for addressing certain types of problem. (Languages such as Java or Python are "imperative" - we tell the computer what to do. "Declarative" languages describe the properties the solution must exhibit and then let the computer decide what to do.) I have at times, particularly when developing machine learning applications, implemented in Python or Fortran algorithms I first learned in LISP (using the recursive function calls supported by these languages). I even had colleagues who found it worthwhile building a PROLOG interpreter into their Fortran-based computational physics code for more intelligent and convenient specification of users’ problems.
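To make the point concrete, here is a minimal sketch (my own illustrative example, not anything from the course) of the head-and-tail style of recursion one first meets in LISP, expressed in Python:

```python
def flatten(items):
    """Recursively flatten arbitrarily nested lists into one flat list.

    This is the classic LISP pattern: take the head of the list (car),
    recurse on the tail (cdr), and combine the results.
    """
    if not items:                        # base case: the empty list
        return []
    head, *tail = items                  # LISP's car and cdr
    if isinstance(head, list):
        # The head is itself a list: flatten it and the tail separately.
        return flatten(head) + flatten(tail)
    return [head] + flatten(tail)

print(flatten([1, [2, [3, 4]], 5]))     # → [1, 2, 3, 4, 5]
```

The same idea translates directly into any language with recursive function calls, which is exactly the kind of mental tool carried over from a language you may never use in production.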

A programming expert usually has about four or five languages that they are actively employing and will tend to choose the one best suited for a particular application, rather than the same, often inappropriate, tool for every job. (They also have four or five others with which they have some familiarity, and in which they could quickly reach expert level if necessary.)

You may move on from programming in Python, and certainly will if you aim for a professional career in computing, but I assure you that you will forever think in terms of "for x in <an iterable collection>", "while <condition>", lists (stacks and queues), dictionaries (hashes, associative arrays) even if the language does not directly support these features, and you have to implement them yourself.
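Those idioms can all be shown in a few lines of Python (a small illustrative sketch; the variable names are mine):

```python
from collections import deque

# A list used as a stack: last in, first out.
stack = []
stack.append("a")
stack.append("b")
assert stack.pop() == "b"

# A deque used as a queue: first in, first out.
queue = deque(["a", "b"])
assert queue.popleft() == "a"

# A dictionary as an associative array (hash): counting words.
counts = {}
for word in ["to", "be", "or", "not", "to", "be"]:
    counts[word] = counts.get(word, 0) + 1
assert counts["to"] == 2
```

In a language without built-in dictionaries or deques you would have to build these structures yourself, but the way of thinking about the problem remains the same.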
