I once asked Guido van Rossum (the creator of Python and my former teammate working on Google App Engine) what he thought of functional programming languages. I had told him about my recent frustration in trying to learn Haskell, and about the skepticism I'd developed while using Lisp. His reply was eye-opening: he spoke of the potential for functional languages to provide a significant, intrinsic advantage when it comes to parallel computing. I had never observed this advantage myself, so the idea intrigued me and stuck with me.
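To make that advantage concrete: a pure function's result depends only on its arguments, so independent calls can run in parallel without locks or shared mutable state. Here's a minimal sketch of the idea in Python; the `square` function is just a stand-in for real work.

```python
from concurrent.futures import ProcessPoolExecutor

def square(x):
    # Pure: the result depends only on the argument, and there are
    # no side effects, so every call is independent of the others.
    return x * x

if __name__ == '__main__':
    with ProcessPoolExecutor() as pool:
        # Because square is pure, the runtime is free to evaluate
        # these calls in any order, on any core, without locks.
        print(list(pool.map(square, range(10))))
```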
Since that conversation, I've spent many years building big data processing systems that leveraged functional concepts (such as MapReduce and Flume). I've also built high-scale, low-latency, globally distributed systems, and felt the pain of shared state, replication delays, the need for idempotent operations, and so on. Although our tools have continued to improve (Go, for example), it's still too difficult to solve these kinds of problems, even for experienced programmers. This has made me wonder whether the issue is not the quality of the languages, libraries, and ecosystems, but something more fundamental: the predominant paradigm itself (imperative programming).
If that were true, you'd expect the many existing functional programming languages to have already satisfied this need. In my opinion, they haven't. For example, it's hard to reconcile how users of these languages can be so passionate about them while non-users find them so perplexing. Why is the gap between these two perspectives so large? My conclusion is that there are major, well-justified reasons why people strongly dislike functional languages. The hypothesis I aim to test with a new programming language is this: by making different tradeoffs, these issues can be mitigated and a more broadly appealing functional language can be built. With such a language, I hope the true promise of functional programming for parallel computing can be widely realized.
This opportunity isn't limited to the server side and distributed systems. The future of all computing will leverage more parallelism. The machines we have on our desktops, in our laps, and in our hands include an increasing number of cores and heterogeneous accelerators. They function more like micro-datacenters than the computers of old. Network connectivity has improved so much that the boundaries between datacenters, edges, and devices continue to blur. So whether you're trying to build a mobile app or optimize a scientific workload, parallelism is, and will continue to be, the most critical aspect of programming.