Why is it an industry standard to have a window automatically grab focus, and how do we change it (in Windows, at least)?

From a usability point of view, why is it considered "good practice" for a desktop application to grab (more like hijack) the window focus when opening, or whenever the program decides to do so? In other words, why is it even an option for a window to "violently" grab focus like that? I'm assuming it's considered good practice because it has been prevalent on every Windows version I can remember. I can't comment on Linux or Mac, but maybe someone else can.

I understand that if I open a program that requires input (a password field, say), I might want that element to receive focus so that I can start typing the required info ASAP.

However, this only works (in my opinion) if I open one window at a time. The problem arises when I open several programs in succession and one or more of them request auto-focus.

For example (and this happens to me quite frequently), I open Outlook, Pidgin, and SublimeText in succession. When Outlook gets focus, I start typing my password to log in, but in the middle of typing SublimeText gets auto-focused, and then half my password is sitting in plain view for anyone to see.

Why is this deemed good practice? Considering that most people have to look at the keyboard while typing, you don't realize the focus has changed to something that might expose your password, potentially while someone is standing right next to you.

And even if no passwords or other sensitive information are involved, it's very annoying to be typing a long document, look up at the screen after several seconds, and find that half of what you typed isn't there because some window just hijacked the focus.

So my question is: what are the usability considerations for doing auto-focus, regardless of what the user wants? If I have a focused window, it's because I chose it to be that way; why should the computer have the ability to decide otherwise? What would be a better alternative? For example, if programs didn't take the focus and just flashed in the taskbar (like they already do), wouldn't that be sufficient notice to the user? And finally, is there ANY way to disable this annoying behavior in Windows? I'm not sure about the details on Linux or Mac, so I can't comment on those.
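For context on that last question: the one workaround I've come across is the `ForegroundLockTimeout` registry value, which tells Windows how long (in milliseconds) after user input to block other applications from forcing themselves to the foreground; blocked windows are supposed to flash in the taskbar instead. A sketch of the tweak, assuming this value still behaves this way on current Windows versions (I haven't verified it everywhere, and apps using low-level tricks may still bypass it):

```shell
:: Raise the foreground lock timeout to 200000 ms (the value TweakUI used to set),
:: so newly started programs flash in the taskbar instead of stealing focus.
:: Assumption: behavior may vary by Windows version; some apps can still bypass it.
reg add "HKCU\Control Panel\Desktop" /v ForegroundLockTimeout /t REG_DWORD /d 200000 /f

:: Log off and back on (or reboot) for the change to take effect.
```

Even with this set, I'd still like to understand the usability reasoning behind allowing focus stealing at all.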