Many misconceptions exist regarding the meaning of real time in application programming and OS contexts. Real time means that real-time threads must not only perform their work correctly, they must also complete it within a given worst-case deadline. The key property of a real-time system is determinism: a deterministic system has a guaranteed worst-case response time to real-world (or artificially generated) events, processing them within a bounded time constraint. Determinism yields predictable response under any conditions, even extreme load. One way computer scientists classify algorithms is by their time complexity, expressed in big-O notation. O(1) algorithms are deterministic: they are guaranteed to complete within a fixed worst-case time, regardless of the size of the input. True real-time systems therefore require O(1) algorithms in their performance-sensitive code paths.
Interestingly, real time does not necessarily mean real fast. A VDC survey (refer to the Further reading section on the GitHub repository for more details) shows that the majority of real-time systems have a deadline (real-time response time) requirement of 1 to 9 milliseconds. As long as the system can consistently and without fail service the event within its given deadline (which could be fairly large), it is real time.