Will has two objectives that he must fulfill to attain his win condition.
1. Build Utopia.
2. Upload Evelyn.
The first is because that's what Evelyn wants, and he loves her. The second is because, until she stands on the same level as him, she can only love the idea of Will, and he can only love the memory of Evelyn. Will has advanced too much for her to really be able to understand him. Likewise, he has evolved so much that he understands her far, far too well. So long as Evelyn is human, she is nothing more than a piece on the board of the game Will is playing - not because he wants it to be that way, but because his ability to foresee the consequences of his actions has been refined to the point that he has "automatic" knowledge of the results of everything he does.
Returning to Objective 1: to build Utopia, Will has to defeat humanity. Humans, as they stand, cannot be trusted to accept a perfect world if it is given to them. When people are given something before they are ready for it, they tend to destroy it, even if they don't nominally want to. In this case, some humans would want to destroy it anyway. RIFT represents that faction of humanity.
The ability of such groups to be a threat, however, depends on the technology they have access to. If technology is obliterated, they will not only be weakened, they will feel vindicated and their narrative existence will reach a natural end.
Further, without technology, not only is RIFT rendered harmless, but other, more rational players, those who would want to dictate the terms of utopia rather than have utopia dictated to them, are rendered impotent as well.
This is why Will causes the world to go dark. The how is illustrated in the movie.
Now, as for why this theory isn't really a theory, but must be the case unless the screenwriter(s) were blind idiots:
- It makes no sense at all that Will would have the same software vulnerabilities two years after being uploaded.
Let's just 'ignore' the entire "hyper-intelligent mind should be patching its software vulnerabilities" angle, because it's unnecessary. There's a much better argument. Anyone with enough knowledge of how computers work will tell you that, if you change the type of processor in the machine, you have to change the low-level code. If you write an app for an Android phone in, say, the Ruby language, the code wouldn't change (much) between the phone and a laptop PC. But Ruby sits on top of a lower-level language, assembly - and that language 'does' change between an Android phone and a laptop.
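The same layering exists in any high-level language. Here's a minimal sketch in Python (the `add` function is just an illustrative stand-in): the source behaves identically on every platform, but Python's `dis` module lets you peek at the intermediate representation underneath it, and the machine code that ultimately executes *that* differs again between processor types.

```python
import dis

# A high-level function: this source is identical on a phone or a laptop.
def add(a, b):
    return a + b

# Its behaviour is the same on every platform...
print(add(2, 3))

# ...but it executes via lower-level representations. Python bytecode is
# one such intermediate layer; the machine code of the interpreter that
# runs this bytecode differs between, say, an ARM phone and an x86 laptop.
dis.dis(add)
```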
In the beginning of Will's life as an AI, he runs on quantum computers. When he and Evelyn escape RIFT's opening move, he switches from quantum to a classical computer - a much, much, MUCH larger leap than between an Android phone and a laptop. You CAN run quantum algorithms on a regular computer, but they take orders of magnitude longer to reach their result, to the point where a classical algorithm would be superior.
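To put a rough number on that leap: faithfully simulating an n-qubit quantum state on classical hardware means tracking 2^n complex amplitudes, so the cost grows exponentially with size. A minimal back-of-the-envelope sketch (the function name and the 16-bytes-per-amplitude figure are illustrative assumptions):

```python
# Simulating an n-qubit quantum state classically means storing 2**n
# complex amplitudes. Assuming 16 bytes per complex number, memory alone
# becomes impossible long before any interesting scale.

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store a full n-qubit statevector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits -> {statevector_bytes(n):,} bytes")
# 10 qubits fits in a few kilobytes; 30 qubits is ~16 GiB;
# 50 qubits is ~16 pebibytes - beyond any real machine's RAM.
```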
Will doesn't suffer any degradation. So it can be concluded that he didn't emulate the quantum AI on classical computers, but wrote an AI framework 'for' classical computers. While it may have reused abstract concepts from the quantum AI software, 'by necessity' it would be a de novo creation.
But wait. Isn't it possible that Will transferred back to a quantum framework after living on the internet?
Sure is! The problem is that quantum computers require extremely special conditions to run - isolation from the environmental noise that causes decoherence - conditions that could never be maintained by a nanite. Thus, the nanites would necessarily need to use classical computing, and therefore, if they ran the AI software at all, it would necessarily be the new, classical software with different vulnerabilities.
The nanites that Max tampered with were decoys, in other words.
Once we understand this, the rest of the movie falls into focus as little more than a series of manipulations performed by a hyperintelligence to achieve the win condition outlined above.