The End of the Semicolon Era
Throughout most of my early professional career, I programmed in languages where you terminate or separate statements with a semicolon; it was like an ideograph distinguishing coders from laypeople. I had a motor reflex to type a semicolon (;) without having to think about it. So when I got involved in the early discussions about the new language that JetBrains was working on back in 2010, the language that would later be called Kotlin, I was not thrilled about the proposal to drop the mandatory semicolons. Who minds the semicolons, I thought. I had quite a strong opinion about the need to solve the billion-dollar problem with nulls¹ to make our software safer, but I cared less about the visuals. How narrow-minded I was!
As it turns out, the disappearance of the semicolon has become the distinguishing characteristic of the rising modern languages of the late 2000s and 2010s, to such an extent that it has almost become synonymous with the very feeling of programming in a “modern language”. Scala (2004), Go (2009), Kotlin (2011), and Swift (2014) are all statically typed, all follow the C/C++/Java syntactic tradition of curly braces, all were introduced and grew big in the last 20 years, and all have totally ditched the mandatory semicolon despite that tradition. Is it a coincidence? I don’t think so. I see a trend here.
The ground zero
Let’s take a look at the elephant in the room: the Python programming language (1990). While it is not exactly “modern”, it has enjoyed a meteoric rise in popularity only recently. It became the most popular programming language by some forward-looking metrics like PYPL² and is consistently placed in the top three.
Syntactically, Python is a thing unto itself. It is a rare breed: a fully indentation-sensitive language (aka the off-side rule³) that is not some kind of academic or toy language. The clean, concise look of Python code is quite appealing. Switching from something as lean as Python (unless you try object-oriented Python, that is) to something as verbose as Java (1995) or C# (2000) is tough, so it is no surprise that modern languages are racing to find some middle ground here.
With type inference now a de-facto standard language feature⁴, it is becoming feasible to avoid visual noise and repetition while maintaining type safety and scalability. The motto becomes:
If both the compiler and a human can easily guess it, then you should not have to spell it out explicitly in the code.
The Don’t Repeat Yourself (DRY) urge is getting stronger, and cleanness is treasured. Sans-serif fonts have occupied our code editors for quite some time, with new coding fonts raising the clarity bar even higher⁵; programming languages are becoming sans-type and sans-semicolons.
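To make the motto concrete, here is a minimal Kotlin sketch (the code is illustrative, not from any particular project): every type below is obvious to both the reader and the compiler, so none of them is spelled out, and not a single semicolon is needed.
fun main() {
    val greeting = "Hello"               // inferred as String
    val numbers = listOf(1, 2, 3)        // inferred as List<Int>
    val doubled = numbers.map { it * 2 } // inferred as List<Int>
    println("$greeting, $doubled")       // prints: Hello, [2, 4, 6]
}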
A false start
The road to the future without semicolons was a bumpy one, though. One major false start happened in JavaScript (1995). The rules for Automatic Semicolon Insertion (ASI) in JavaScript lead to numerous gotchas, where properly formatted JS code with two statements, like this one:
const name = "World"
["Hello", name].forEach(word => console.log(word))
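// The parser sees: const name = "World"["Hello", name].forEach(word => console.log(word))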
actually gets interpreted as one statement by the rules of JS: the bracket that opens the second line is read as indexing into the string "World", not as the start of an array literal. This forces the programmer to use a semicolon just to make sure the compiler understands what the author meant, even though the code is totally unambiguous to a human reader. It is so bad in JS that modern JS coding styles recommend always using semicolons, just to save the mental energy of thinking about whether you need one or not. TypeScript (2012), being backward-compatible with JavaScript, suffers from this as well.
The solution
The JS semicolon mishap almost killed the very idea of dropping semicolons in a curly-braced language. Yet it turns out not to be an insurmountable problem once the design goal is set straight. The goal is not to let you omit semicolons sometimes, by automatically inserting them here and there using heuristics, but to make sure that properly formatted code that follows the language’s style guide never has to include a semicolon at all.
The rigor of modern programming style makes it possible to design such rules even for the C family of indentation-insensitive languages. We developers are taught, and are used, to split statements across multiple lines after opening brackets and operators. Take a look at a multi-line function-call statement:
receiver.foo(
    parameter
).doSomethingWithIt() // one multi-line statement
When formatted in a good style, it is clearly distinct from two separate statements, both for a human and for a well-engineered compiler, even if you remove all the indentation:
receiver.foo // the first statement
(parameter).doSomethingWithIt() // the second statement
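Kotlin, for one, follows this recipe: roughly speaking, a line break ends a statement unless the statement is clearly unfinished, for example when the next line starts with a dot or the previous line ends with an operator. Here is a minimal sketch (the function and the names in it are made up for illustration):
fun demo(numbers: List<Int>) {
    val doubled = numbers
        .map { it * 2 }    // a leading '.' tells the parser the statement continues
        .filter { it > 2 } // still the same statement

    val sum = 1 + 2 +
        3                  // a trailing operator also keeps the statement open

    println(doubled)       // a line that forms a complete statement ends right there
    println(sum)           // and never needs a semicolon
}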
Once a language crosses the threshold from “I have to think about whether a semicolon is needed here” to “I never need a semicolon”, the bliss of not having to write, and never having to see, semicolons in the code becomes truly liberating.
As with all quality-of-life improvements, they become apparent only when you have to do without them for a while. Have you ever tried to read code that is typeset in Times New Roman? You may not even realize it at first, but you get this distinct feeling that something is not right: there is too much visual noise from all those serifs. Nowadays I get the same feeling when I see semicolons in modern code, especially if it is written in a seemingly modern language. Every time, I cannot stop myself from opening Wikipedia to double-check what year the language I’m reading was introduced.
We are living at the end of the semicolon era. The recipe is out there. It is only a matter of time before it becomes truly ubiquitous.