TL;DR: this is gonna have some old war stories, and those uninterested oughta just skip past them.
Ah, therein lies a tale. And I'll start with the funny part, b/c it's the part I like best. In the story below, there's a point where Jim Colson (if I accidentally ran over him, I'd back up and run over him again, the fuckin' plagiarist) took my code, broke it, open-sourced it, and forbade me to publish the original version (owned by IBM). I was so angry that when Paul Horn, the Director of IBM Research (and Colson's "executive mentor"), called me up (by this time it's 2005) and told me I had to go clean up a ginormous WebSphere nightmare at a big custodial bank (turned out it was AIX, JVM, WebSphere, Portal Server, and customer-code bugs; it took six months to hunt them down one by one), I got so goddamn angry I threw my IBM-issued laptop across my kitchen and destroyed it. But I'd already told him I'd go (b/c the alternative was quitting, and stupid me, I didn't realize that that was the intelligent thing to do), so I went and cleaned up the mess. They never charged me for that laptop, and for all I know, my manager still has the busted hulk in his office.
(1) In C++, lots is possible using templates (which, as I'm sure you know, were inspired by SML modules "back in the day").
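To make that concrete, here's a minimal sketch of the template-as-functor idea: the template parameter plays the role of an ML functor argument, and any type satisfying the implicit "signature" can be swapped in at compile time. All the names here (`RealStore`, `FakeStore`, `roundtrip`) are invented for illustration.

```cpp
#include <string>

// The "signature" is implicit: any type with put/get will do.
// Two "structures" satisfying it:
struct RealStore {
    std::string data;
    void put(const std::string& s) { data = s; }
    std::string get() const { return data; }
};

struct FakeStore {  // a test double, swapped in at compile time
    void put(const std::string&) {}
    std::string get() const { return "canned"; }
};

// The "functor": code parameterized over whichever store it is handed.
template <typename Store>
std::string roundtrip(Store& s, const std::string& msg) {
    s.put(msg);
    return s.get();
}
```

The typechecking is structural and happens at instantiation time rather than against a declared interface, but the substitution itself is fully static.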
(2) Even in C (and hence in C++ also), you can use conditional compilation and linking to write a piece of code that uses some subsystem, and then run it in environments with different versions of that subsystem. It's very commonly done. I have a (former) friend who wrote a complete SAN (Storage Area Network)-on-IP system, and at every level, from bottom to top, he used mocking of various kinds so that he could test both subsystems and entire deployed systems in single processes, with pseudo-random event-clocks for reproducibility.
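The conditional-compilation flavor of this trick looks roughly like the following sketch (function and flag names are made up for illustration): the same call site links against either the real subsystem or a deterministic mock, chosen at build time with a `-D` flag.

```cpp
#include <string>

// Build with -DUSE_MOCK_NET to get the mock instead of the real subsystem.
// Same name, same signature -- the caller can't tell the difference, and the
// choice is made entirely at compile/link time.
#ifdef USE_MOCK_NET
std::string net_fetch(const std::string& /*url*/) {
    return "mock-response";  // deterministic, no network needed
}
#else
std::string net_fetch(const std::string& url) {
    // the real implementation would open a socket here; stubbed in this sketch
    return "real-response-for:" + url;
}
#endif
```

The linking flavor is the same idea one level up: compile the mock and the real implementation into separate object files or libraries, and let the link line pick which one satisfies the symbol.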
(3) But in Java, first, you don't have conditional compilation, and second, you really don't have "linking" in any real sense. And this was by design: you can look at the original writeups for Java from Sun and see them boasting about the lack of #defines, linking, etc. Hell, they didn't support #line directives, so even IF you wanted to write a preprocessor, you had to jump thru INSANE hoops to make it work (I know: I did it twice).
(4) But even worse, because Java was so damn BIG (and remember, we're talking about a time when a gig was a LOT of memory on a server), you were forced to use "app servers" that were meant to stay up for a long time. Even today, I'd bet it's rare to see top-level programs written in Java: almost all Java code is loaded/managed by runtimes of some sort. Those runtimes enforce the linking, so you get what they give you.
All of this means that you really can't use any of the tricks we're used to from C to do mocking, or to swap out implementations of some interface "at the last moment". And since you don't actually have a static modularity mechanism, you can't use the CLR's/C++'s/ML's tricks either.
[I do take your point that C++ doesn’t have “interfaces”. So you don’t get the typechecking. But you -do- get the “visibility of names” and “access control” that you get with ML functors.]
So the runtime implementors started building DI as a way to get the value of the techniques above for Java code.
OK, now here's where I get on my soapbox. IT DIDN'T HAVE TO BE THIS WAY. AT THE TIME, it was clear that MSFT's assemblies were a fine, fine approach. AT THE TIME (in fricken' 2003), I wrote a paper[1] (with Corwin, Bacon, and Grove) and an implementation, applying it to Tomcat as a proof, showing that you could modularize Java servers in a way that allowed you to then apply the tricks above, without any dynamic insanity. But you see, technical correctness doesn't matter in the Java world: at the time, Jim Colson and the "pervasive computing" idiots at IBM had built this thing called OSGi (R2) that was deeply broken, and they needed a solution. So they did two things: (a) took my implementation and modified it (breaking critical bits) to turn it into OSGi R3, which they open-sourced; (b) forbade me to open-source my implementation, since it was (of course) owned by IBM, and as senior architects, they could do that. It's a feudal kingdom, after all, not a well-run tech company.
I distinctly remember an email and conversation with Glyn Normington (then of IBM Hursley) who assured me that he’d done everything he could to convince them to keep static modularity, but well, them’s the breaks. Glyn went on to SpringSource, of course.
The upshot of this long tale is that Java still has no static modularity, not even of the kind that the CLR has enjoyed since its inception. So everybody uses DI, and DI is driven by runtimes that take various config files (last I checked, it was XML) and build graphs of objects ("beans", bleccch) in the runtime. It's all dynamic, it all depends on the runtime getting things right, nothing is truly statically checked, and (haha) it's no wonder that to most programmers it's a black box (as my friend at that streaming company described to me).
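For anyone who hasn't seen one: that config looks roughly like this (a sketch in the style of Spring's XML bean definitions; the ids and class names are invented). The container reads it, reflectively instantiates the classes, and wires the graph — so a typo in a class name or a missing dependency fails at startup (or later), not at compile time.

```xml
<!-- hypothetical Spring-style wiring; ids and classes invented for illustration -->
<beans>
  <bean id="logger" class="com.example.FileLogger"/>
  <bean id="orderService" class="com.example.OrderService">
    <!-- the container builds this object graph at runtime, by reflection;
         nothing here is checked by the Java compiler -->
    <constructor-arg ref="logger"/>
  </bean>
</beans>
```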
[1] John Corwin, David F. Bacon, David Grove, Chet Murthy: "MJ: a rational module system for Java and its applications", OOPSLA 2003 (it's on dblp).
I don't claim that this was revolutionary. C#'s assemblies were basically this. Rather, it was something that could be -done- in Java at the time. When you work in "the business", you try to propose solutions that can be implemented now and don't require boiling the ocean. That's all this was.