Wake Up, It's the 21st Century — On Modern Programming Standards

A plea to C++ developers to stop writing code as if it were the 1990s, covering TR1, C++0x features, the futility of manual optimizations, reinventing the wheel, and unnecessary memory savings.

I'd like to start this article by stating a simple fact: outside the window it is mid-April of 2011. I'm reminding myself of this first and foremost, since I'm periodically overcome with doubt. The thing is, both for work and as a hobby, I often read C++ code written 10-20 years ago (but still maintained today), or code written very recently by people who learned to program in C++ those same 20 years ago.

Introduction

The specifics of programming 20 years ago were completely different. Memory and CPU resources were counted in bytes and clock cycles, many things hadn't been invented yet, and you had to get creative. But that's no reason to still write code based on those premises today. The world changes. You need to keep up.

This article covers only mainstream compilers (gcc, Intel, Microsoft) and application programming for desktop operating systems.

TR1

There's this thing called TR1. This may come as a revelation, but almost all modern compilers have built-in smart pointers, decent random number generators, many special mathematical functions, regular expression support, and other interesting things. It works quite well. Use it.

C++0x

There's this thing called C++0x. Rejoice, brethren! Already at your disposal are:

  • Lambda expressions
  • Rvalue references
  • Generalized constant expressions
  • Extern templates
  • Initializer lists
  • Range-based for loop
  • Improved object constructors
  • nullptr
  • Local and unnamed types as template arguments
  • Explicit conversion operators
  • Unicode characters and strings
  • Raw string literals
  • Static assertions
  • Template typedefs
  • The auto keyword

For example, instead of:

vector<int>::const_iterator itr = myvec.begin();

You can now write:

auto itr = myvec.begin();

You can iterate over collections with an analog of the for_each loop:

int my_array[5] = {1, 2, 3, 4, 5};
for(int &x : my_array)
  x *= 2;

Passing Everything Everywhere by Pointer (Reference)

The ability to pass entities to functions and methods both by reference and by value is a very powerful mechanism. A common practice is to pass everything by pointer, with these arguments:

  • A pointer is passed faster than a data structure
  • When passing by pointer, there's no need for an additional copy

Both arguments are insignificant in practice: the gain is usually a few bytes and a handful of clock cycles.

But there are downsides:

  • The receiving function is forced to check all arguments at least for NULL
  • The receiving function has the right to do anything with the passed entity — modify it, delete it
  • The calling function must either trust or validate data after every call
  • Many objects are already optimized (for example, string classes)

An analogy: passing by pointer is like taking your computer, unplugging it from the wall, and handing it over with the words: "Here, take it — have a look." You're giving the other person full control, and you have to hope they won't break anything.

Computing Constants

Here's an example of bad code:

#define PI      3.1415926535897932384626433832795
#define PI_DIV_BY_2   1.5707963267948966192313216916398
#define PI_DIV_BY_4   0.78539816339744830961566084581988

Constant expressions are folded by the compiler at compile time, not computed at runtime. So "PI / 2" is more readable, takes up less source, and runs exactly as fast as a hand-transcribed constant.

Reinventing the Wheel

Example of criticized code:

class MySuperVector
{ // my very fast implementation of vector
    ...
};

STL and Boost exist, where the best minds on the planet refine a multitude of excellent algorithms and data structures. You should only write your own in three cases:

  • When learning (lab work, coursework)
  • When writing a research paper specifically on that topic
  • When you know the source code of the main libraries by heart

In reality, what happens is: people have no idea these libraries exist; people don't read smart books; people have inflated self-esteem.

Unnecessary Optimizations

int a = 10;
a <<= 1;

All compilers are smart enough to independently replace multiplication and division with bit shifts. Not all people are smart enough to understand this code. The result: worse readability with no speed advantage.

Unnecessary Memory Savings

History provides cautionary examples:

  • Y2K — saving two digits on the year
  • IPv4 — insufficient addresses
  • Therac-25 — a radiation therapy machine whose one-byte counter overflow helped bypass a safety check, with fatal results

We already have 2 to 4 GB of RAM on average. Think ahead. Save megabytes, not individual bits.

Recommendation: use long or long long for data about quantities, dates, and file sizes. Avoid squeezing such values into byte, short, or int.

Conclusion

Don't program cave paintings. Future generations won't appreciate it.
