I'm running on the current theory that the major distinction between Research and Engineering is the precision demanded of answers. For most of the engineering work I've done, we never really needed answers with very high precision. An approximate answer, held with 80–90% confidence, was good enough. We could act on it, build in some tolerance, and be on our merry way. It allowed us to move fast and generally not get stuck in holes, deep diving to find out exactly why something was happening. We had fairly high confidence that some effect was happening and how to solve it. For example, if the system profiler states there is a large amount of CPU time spent in some function, we optimize it, measure the improvement, and call it a day.
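That engineering loop can be sketched with Python's standard `cProfile` module. The function names here are hypothetical, standing in for whatever the profiler flags as hot:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Hypothetical hotspot: naive accumulation the profiler will flag.
    total = 0
    for i in range(n):
        total += i * i
    return total

def main():
    return slow_sum(200_000)

# Step 1: profile the workload.
profiler = cProfile.Profile()
profiler.enable()
result = main()
profiler.disable()

# Step 2: find where the CPU time went, ranked by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumtime").print_stats(5)
report = stream.getvalue()
print(report)
```

If `slow_sum` dominates the report, the engineering answer is to optimize it, re-run the profile to confirm the win, and move on; no deeper characterization required.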
Science and Research, on the other hand, aren't satisfied with 80% approximations. Research demands high precision, with 99% confidence intervals. It's almost impossible to state anything as fact with 100% confidence, but Research and Science inch closer and closer to that precision. Science exists to determine facts, characterize them, and understand what's happening in a system. If the system profiler states a large amount of CPU time is spent in some function, Science wants to understand why it's spending that time: what's happening to the CPU, to the cache lines, whether an expensive instruction is involved, and only then optimize it and characterize it in much greater detail. What engineers call holes, Researchers call characterization.
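One small way the characterization mindset shows up even in plain Python: instead of trusting a single measurement, collect repeated timings and look at their spread. A minimal sketch, with a hypothetical hot function standing in for the real one:

```python
import statistics
import timeit

def hot_function():
    # Hypothetical function under study.
    return sum(i * i for i in range(50_000))

# Repeat the measurement so we can characterize the distribution
# of runtimes rather than a single point estimate.
samples = timeit.repeat(hot_function, repeat=20, number=5)

mean = statistics.mean(samples)
stdev = statistics.stdev(samples)
print(f"mean={mean:.6f}s stdev={stdev:.6f}s")
```

A wide spread is itself a finding worth chasing down (cache effects, frequency scaling, interference from other work); the engineering answer stops at the mean, the research answer asks why the distribution looks the way it does.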