Sometimes you end up using the enhanced capabilities more than other times, and the time frame can be the major consideration. Maybe it's just a faster processor you need for your own work, or OTOH a hundred new PCs for an office building, and those are just computing examples.
Usually the owner won't even explore all the advantages of the new hardware, so long as the original need at least barely justifies the purchase. The faster-moving situations are the ones where the fewest of the available new possibilities ever get experimented with. IOW the hardware gets replaced before anybody actually learns how to get the most out of it in any way that wasn't foreseen before purchase.
Talk about scaling: there's real, massive momentum when it's literally tonnes of electronics.
It's like how some people will buy a new car without ever having used all the features of their previous one, while others take the time to learn the new internals each time so they get the most out of the vehicle while they have it. Both approaches are popular, and the hardware is engineered so that both are satisfying. But only one of them is "research".
So whether you're just getting a new home entertainment center that's your most powerful yet, or kilos of additional PCs that would theoretically let you do more of what you're already doing (if nothing else), it's easy to purchase more than you'll be able to technically master, or sometimes even fully deploy.
Anybody know the feeling?
The root problem can be that the purchasing gets too far ahead of the research needed to make the most of the purchase :\
And if the time & effort you can put in are at a premium, there will be more waste than necessary, and it will be many times more costly. Plus, if borrowed money is involved, you could end up with debts that aren't just technical.
Scale a little too far, and you've got some research to catch up on :)
The history of modern ML is just fascinating, and as far as I can tell it's utterly unprecedented.
1945-2012 Let's figure out how we can build smart machines
2012-2017 Wait, what the hell... we just needed more gates?
2017-? Let's figure out how this machine we built actually works
Far from unprecedented; it's just radically different from what people are used to in computing.
There are a lot of other domains whose history (and present) consists of semi-informed stumbling into something effective and then spending decades (or lifetimes) trying to reverse-engineer how it works, when it doesn't work, what the peripheral consequences are, and how impactful those consequences are.
Metallurgy and materials science, agriculture, chemistry, pharmaceuticals, psychology, etc. etc.
20th century discrete/digital computing and computer science, having hewn close to mathematics and logic for most of its life, is actually the more unprecedented history as far as practical sciences go.
The flipside of all this is that those other practical sciences have come with very messy histories in terms of unintended consequences and premature applications, and (for better or worse) we can anticipate the same here.
Great points re: pharmacology and psychology, certainly. I was thinking more in terms of technological applications. Normally the science comes first, followed by the tech, but AI has flipped the paradigm.