Note: I plan to make this roughly a twice-a-week cadence for the blog, depending on workload: probably one post on the weekend and one during the week.
I watched a talk at Google yesterday by Professor John Ousterhout titled "A Philosophy of Software Design." The overarching theme of the talk was how to write extensible, modular code and take it slow at the start, before your entire code base turns into spaghetti. I don't claim to be an expert here; I'm just sharing my thoughts on what I learned from the talk. Let's jump in.
Several of the points he made had me questioning a lot of what I was taught and learned over the years. First: classes should be deep. By that he meant the interface should have very little overhead, with a lot of functionality built into each class. Java seems to suffer from what he called "classitis," which comes from the common belief that classes and methods should be as small as possible. An example:
FileInputStream fileStream = new FileInputStream(fileName);
BufferedInputStream bufferedStream = new BufferedInputStream(fileStream);
ObjectInputStream objectStream = new ObjectInputStream(bufferedStream);
Why require all of these different instantiations when all you want is a buffered object stream? I don't claim to know the answer to that question, but I know this is definitely what I was taught throughout the classes I took in high school and college: "Keep your methods as small as possible and do as little as you can in each individual method." This part of the talk alone made me question what I had learned over the years.
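As a sketch of what a deeper interface might look like (the helper name and class here are my own invention, not from the talk), you could fold the common case into one call while the layered constructors stay available for unusual needs:

```java
import java.io.*;

public class Streams {
    // Hypothetical "deep" helper: one call covers the common case.
    public static ObjectInputStream openObjectStream(String fileName) throws IOException {
        return new ObjectInputStream(new BufferedInputStream(new FileInputStream(fileName)));
    }

    public static void main(String[] args) throws Exception {
        // Round-trip a value to show the helper works end to end.
        File f = File.createTempFile("demo", ".bin");
        try (ObjectOutputStream out = new ObjectOutputStream(
                new BufferedOutputStream(new FileOutputStream(f)))) {
            out.writeObject("hello");
        }
        try (ObjectInputStream in = openObjectStream(f.getPath())) {
            System.out.println(in.readObject());
        }
        f.delete();
    }
}
```

The caller now expresses intent ("open this file for reading objects") instead of assembling plumbing.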
Another point he brought up was around errors and exception handling. The common line we hold is to detect and throw as many errors as possible (and throw them as far away from us as we can). In the Windows world, you can't delete a file while it's open. That seems to make sense: if another process has a handle to a file, you shouldn't be able to delete it out from under that process. John made me ask why, though. Unix handles this differently. If you delete a file that another process has open, Unix removes it from the file system immediately, but the process that still has the file open keeps what it needs and continues its work. When it eventually finishes and runs cleanup, then and only then is the file really deleted.
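That Unix behavior can be seen directly from Java (a minimal sketch; this assumes a POSIX file system such as Linux or macOS, where the delete succeeds while a handle is open, whereas on Windows the delete call would typically fail):

```java
import java.io.*;
import java.nio.file.*;

public class DeleteWhileOpen {
    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("demo", ".txt");
        Files.write(p, "still readable".getBytes());

        try (InputStream in = Files.newInputStream(p)) {
            // On POSIX systems the name is unlinked from the file system immediately...
            Files.delete(p);
            System.out.println(Files.exists(p));

            // ...but this open handle keeps the underlying data alive.
            System.out.println(new String(in.readAllBytes()));
        }
        // Once the last handle closes, the blocks are actually reclaimed.
    }
}
```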
Another area I don't fully understand and am still wrestling with: he stated that the overall goal should be to "minimize the number of places where exceptions must be handled." Exceptions are a huge source of complexity in any code. If you throw an exception back to the caller, that caller might then fall into another exception, and before you know it you have a nest of exceptions you might never untangle. Although we all hate to see it, sometimes a crash might be the better answer. If an application has a memory leak, it might be better to just let it leak and eventually crash than to try to handle exceptions throughout the complicated process that is memory management. The underlying goal here is the semantics of your interface: if you aren't so narrow in defining what your code does, you might not need as many exceptions.
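One concrete illustration of widening a definition so an error disappears (this maps the idea onto Java's standard library; it isn't an example from the talk): java.nio's Files.delete means "delete this existing file" and must throw when the file is absent, while Files.deleteIfExists means "ensure this file does not exist," so absence is simply success and no exception is needed:

```java
import java.io.IOException;
import java.nio.file.*;

public class DeleteSemantics {
    public static void main(String[] args) throws IOException {
        Path missing = Paths.get("no-such-file.tmp");

        // Narrow definition: the file's absence is an error.
        try {
            Files.delete(missing);
        } catch (NoSuchFileException e) {
            System.out.println("delete threw");
        }

        // Wider definition: the file's absence is just a boolean result.
        boolean removed = Files.deleteIfExists(missing);
        System.out.println("deleteIfExists returned " + removed);
    }
}
```

Same underlying operation; the second interface defines the error out of existence, so callers have one less place where an exception must be handled.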
The last point I'll discuss in this post was his thought that there are two different styles of programming: tactical programming vs. strategic programming. With tactical programming, the initial goal is noble. Your boss says "we need x feature working by y date," and you set off to get that working ASAP. You don't like it, but you accept that you'll take a few shortcuts here and there, because once the feature is working, then you'll have time to go back and fix it. But then the next feature is requested, that last shortcut never gets cleaned up, and you end up in the spaghetti mess I mentioned earlier. This results in badly designed (and probably highly complex) products. The sad part, he brought up, is that people who fall into this category are typically rewarded. The boss sees them as the people who can get work done, so their poor programming practices are rewarded, which further perpetuates the problem (he aptly calls these people tactical tornadoes).
The idea though, is that simply working code isn't enough. That's where, John argues, strategic programming comes into play. The goal here is to produce a great design, minimize complexity, and in turn, simplify your future development. To do this, you have to fret about the small stuff. The shortcut that you are intending to take is a mistake when viewing it through the lens of strategic programming.
source: https://www.youtube.com/watch?v=bmSAYlu0NcY

It's an investment. If you take the extra time today, it will pay off in the long term, in your design and in the progress that you and your team make.
Professor Ousterhout wrote a book about all of this, also titled "A Philosophy of Software Design," and his big hope and goal is to teach software design to the masses. This is an area of learning for me, so I plan to grab the book and dig in, because as I (re)learn to code, I want to go about it the right way, even if it's just for a pet project.
What are your thoughts about John's approach to programming? Is he wrong? Am I wrong? Let me know!
--Jeff