Recommendations For Noob In Data Degree

By Asura.Saevel 2024-10-14 16:28:45
 
As to the whole question on programming, everyone in IT should learn some level of basic programming. This is because the concepts that you learn can be applied to literally everything in the entire industry.

All programming of every language on every platform boils down to a few key concepts:

Data collection (input)
Logical analysis (what should I do based on that input)
Action execution (how should I do the action I decided to do)

Everything else is just different ways of going about those three steps. Once someone understands that, they can pick up any language very quickly and have functional code in hours. Honestly, getting the build chain to work right (f*ck you MSVC) takes longer than banging out a functional program.
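A bare-bones Python sketch of those three steps (the prompt and the 30-degree threshold are just illustrative):
Code:
# Data collection (input)
raw = input("Temperature in Celsius: ")

# Logical analysis (what should I do based on that input?)
try:
    celsius = float(raw)
except ValueError:
    print("That wasn't a number.")
else:
    # Action execution (how should I do the action I decided on?)
    if celsius > 30:
        print("Hot day, stay hydrated.")
    else:
        print(f"{celsius:.1f} C is {celsius * 9 / 5 + 32:.1f} F")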
By Lili 2024-10-15 07:31:53
 
K123 said: »
You can literally find thousands of such posts from artists 2-3 years ago, substituting CLIP or GAN for LLM in that statement. Oh how that did not age well.

You keep saying this, but this isn't the sort of gotcha that you think it is.

Current "AI" are Large Language Models, they're basically predictive text on immense amounts of steroids. The distant cousin of your phone ability to suggest "are" after you type "How". They're getting better and better, but have two huge limitations:
- they cannot come up with something new by any definition of the word (yeah yeah, everything is derived from something else, blabla, that's philosophy and not useful in this context)
- they are imprecise.

The latter is the bigger issue when it comes to coding. Image generation works because the human brain is amazing at finding patterns where it wants to find them, so an imperfect image (a little too blurry, a little out of proportion, a little off-shade, etc.) still registers correctly to us. Same with text: it registers as correct, but have you noticed how easy it is to spot text written by an AI? They overuse certain terms and have a weird over-eager "vibe" to them, no matter how much you filter/process them before or after.

Well, programming languages do not work like that. Code is exact. A misplaced comma, an extra space, a missing }, a word that is capitalized here but not there, and the entire program falls apart and stops working. Current development "AIs" are good at providing small snippets of highly specialized code, but they are not good at generalizing entire programs, and they won't be until the underlying design changes. Which is possible eventually, but not with the current design principles. LLMs are also amazing at writing code documentation, because the internet contains billions of lines of commented code, and it's easy to extrapolate.
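To see how unforgiving that is, take a one-character difference in Python (a made-up example):
Code:
data = {"name": "Lili"}

print(data["name"])  # fine: prints Lili
print(data["Name"])  # KeyError -- one capital letter and the program dies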

Now, this does not mean that a programmer won't ever be able to be replaced - heck, 95% of "web developers" are just iterating on the same WordPress theme over and over, a million times, with a slightly different shade of green or font size.
But it does mean that the current design of "AIs" won't ever get to the point of replacing an actual developer, never mind a team of them. And the current design is the only thing we have that has given us any semblance of reasoning so far.

Now, research is hard at work on this, so maybe in two weeks we'll get ChatGPT[*] version 16 that is able to actually think quadrimensionally. But we're not there, by far.

[*] someone pointed out to me how "ChatGPT" sounds like "Chat, j'ai pété" ("Cat, I farted") in French, and now I cannot think of anything else.
By Asura.Saevel 2024-10-15 08:31:43
 
Lili said: »
Now, this does not mean that a programmer won't ever be able to be replaced - heck, 95% of "web developers" are just iterating on the same WordPress theme over and over, a million times, with a slightly different shade of green or font size.
But it does mean that the current design of "AIs" won't ever get to the point of replacing an actual developer, never mind a team of them. And the current design is the only thing we have that has given us any semblance of reasoning so far.

Everything that ChatGPT / CoPilot does can be done with Google searches, because that is basically how they got their training data to begin with. People who have never built anything imagine coding is like writing a novel or something, where you lock yourself in a room until inspiration hits you.

How it really works is you take a problem and break it down into sub-components like "retrieve authentication credentials", "use credentials to obtain header token", "connect to REST API with header token and retrieve data", "parse data to extract values", "make decision based on those values", and so forth. We then write out small blocks for each of those and then stitch them together.
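As a skeleton, that decomposition is literally just a handful of small functions waiting to be filled in (all names here are invented for illustration):
Code:
# Hypothetical decomposition -- each stub becomes one small block of real code.
def get_credentials():
    """Retrieve authentication credentials (e.g. from a config file)."""

def get_token(credentials):
    """Use credentials to obtain a header token."""

def fetch_data(token):
    """Connect to the REST API with the header token and retrieve data."""

def extract_values(data):
    """Parse data to extract the values we care about."""

def decide(values):
    """Make a decision based on those values."""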

If I google "Python config info from file", I easily get several methods of doing it, can immediately see that configparser would be a way to go, and find StackOverflow comments with various ways to use it to retrieve values from a file.

https://stackoverflow.com/questions/19379120/how-to-read-a-config-file-using-python
https://docs.python.org/3/library/configparser.html
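A minimal version of what those links show (the file name and section/key names are made up):
Code:
import configparser

# settings.ini might contain:
# [api]
# user = myuser
# secret = mysecret
config = configparser.ConfigParser()
config.read("settings.ini")
user = config["api"]["user"]
secret = config["api"]["secret"]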

Then I google "Python REST to get header token" and see that requests is a good library for it, along with a bunch of examples of how to do it. And our friends StackOverflow and python.org pop up again.

https://stackoverflow.com/questions/19069701/python-requests-library-how-to-pass-authorization-header-with-single-token
https://discuss.python.org/t/obtain-api-data-token-using-requests/54430
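And a minimal sketch along the lines of those answers (the URL and the shape of the token endpoint are invented; real APIs differ):
Code:
import requests

# Hypothetical token endpoint -- substitute the real API's auth flow.
resp = requests.post("https://api.example.com/token",
                     data={"user": "myuser", "secret": "mysecret"})
resp.raise_for_status()
token = resp.json()["token"]

# Pass the token in the Authorization header on subsequent calls.
data = requests.get("https://api.example.com/v1/items",
                    headers={"Authorization": f"Bearer {token}"}).json()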

And we just keep doing this for every step until we have a Notepad++ page full of code snippets and library references. We then put it together and go from there; once it works, we store those snippets in our tool bucket for future use. Experienced people have massive personal tool buckets full of code snippets in various languages that they just copy, paste, and edit when building stuff.

This only works because the developer is aware of context; they know the bigger picture, which is something stochastic parrots can never understand. At best, ChatGPT / CoPilot are just replacing the googling aspect, fetching stored procedures they learned from StackOverflow during training.
By Dodik 2024-10-15 10:12:28
 
Idk if it has been brought up already, but "data science" is not a real thing. There is no (useful) degree in "data science".

Anyone working in anything called "data science" is one of two things:

* Computer science background. These are the developers implementing logic in code.
* Mathematics and/or finance background. These are the brains trying to analyze data and come up with logic.

The former needs a comp sci degree.

The latter needs some basic understanding of how to implement formulas in code. Usually this is done in Python; R is not used much anymore.
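For a sense of what "implementing formulas in code" looks like in practice, here's a textbook net-present-value formula in Python (the rate and cash flows are made up):
Code:
def npv(rate, cashflows):
    # NPV = sum over t of cashflow_t / (1 + rate)^t
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# -1000 invested today, 400 back each year for three years, discounted at 8%
print(round(npv(0.08, [-1000, 400, 400, 400]), 2))  # 30.84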

The former cannot do the latter's job nor vice versa. They are different skill sets and need different backgrounds and experience.

If you want to get into "data science", that tells me neither you nor your advisors have any experience with, or knowledge of, what these jobs actually are.

You should be asking yourself if you want to do comp sci or mathematics/finance. One is more practical, the other more theoretical.

FYI, anyone calling themselves a "data scientist" is an immediate red flag for anyone interviewing them.
By K123 2024-10-15 10:36:40
 
Lili said: »
K123 said: »
You can literally find thousands of such posts from artists 2-3 years ago, substituting CLIP or GAN for LLM in that statement. Oh how that did not age well.

You keep saying this, but this isn't the sort of gotcha that you think it is.
[...]
- they cannot come up with something new by any definition of the word
- they are imprecise.
[...]
Well, programming languages do not work like that. Code is exact.
[...]
But it does mean that the current design of "AIs" won't ever get to the point of replacing an actual developer, never mind a team of them.
1. I'm not saying they're the same thing in the way you note the differences between image and code generation. I'm saying they're both going to be impacted hugely, and people were ignorant about both.
2. No, you can't just selectively dismiss the point that everything new or novel is someone combining things in ways that haven't been combined before, that everything is a slight progression of what already exists, in order to bias the argument.
3. Define "new". Once you accept that most things out there are pretty damn simple, just slight tweaks on what already exists, you get into the domain of designing. There's only one expert in design here. I believe AI (both LLMs and image generators) can already "design" new and novel things to a limited extent. I'll let you know what I conclude in my PhD.
By K123 2024-10-15 10:40:11
 
Also, maths and numbers are absolute, and code could ultimately be understood by an LLM in binary. It wouldn't necessarily need to understand why code does or does not work based on something being capitalised or there being an additional or missing bracket. To limit yourself to thinking that an LLM even needs to understand code as you use it is only limiting your comprehension of how it could be achieved.
By Shiva.Thorny 2024-10-15 11:57:23
 
K123 said: »
code could ultimately be understood by an LLM in binary. It wouldn't necessarily need to understand why code does or does not work based on something being capitalised or there being an additional or missing bracket.
'Code' does not exist in binary. Code can be compiled to assembly, which is processor-specific. One of the benefits of Python, the language you made this argument about, is that it is cross-platform and typically run by an interpreter. So 'binary' (assembly) Python is not a thing. While some methods exist to compile Python to assembly in a limited form for performance, if you had an LLM work with that, the result could not be reverted back to Python syntax for use on other platforms. It defeats the purpose of using Python in the first place.
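You can see this for yourself: CPython compiles source to its own portable bytecode, not to machine code. The standard dis module shows what the interpreter actually executes:
Code:
import dis

def add(a, b):
    return a + b

# Prints CPython bytecode (LOAD_FAST, BINARY_ADD or BINARY_OP, RETURN_VALUE),
# which the interpreter runs the same way on Windows, Linux, or macOS.
dis.dis(add)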

K123 said: »
To limit yourself to thinking that an LLM even needs to understand code as you use it is only limiting your comprehension of how it could be achieved.
It needs to understand the code, because without code you don't have cross-platform support. You cannot simply recognize existing patterns to fill in the missing code for anything complex enough to require paying developers.

A major limitation of current models is also the amount of active memory. A real developer has to understand all the parts being connected in order to connect them. The current generation of LLMs cannot handle that much data at once, so it is incapable of simultaneously processing all of the different parts. Even if it could evaluate the intent of the code well enough to handle the task, it cannot hold enough data simultaneously to do that evaluation.

Moore's Law is dying; there's no reason to think we're going to continue the level of exponential growth in processing capacity we've seen in the past. The size of model required to expand working memory that many times is obscene. Our current-generation platforms/CPUs are considerably more unstable than prior generations because we can't maintain the manufacturing tolerances as it is (see 13th/14th-gen Intel and recent Ryzen).

I cannot say without a doubt that LLMs will not replace Python developers. I do not think it is likely that they will be better than any human within 2 years, 5 years, or even 15 years. I don't even think they'll be better than a slightly above average human in 15 years. But you're obviously coming from a place where you have personally been impacted, and your bias is showing.

Keep in mind that shipping groups just settled the port strike in the US by giving an immense raise and a 6-year guarantee against automation. If AI isn't sufficient to move a box from one fixed location to another, and it won't be for 6 years, why do you think a task exponentially more complicated will be possible? How about automated trucking routes? Go back 6 years and everyone was crying about how truckers would be outdated. Didn't pan out. Decades away.

The final few % of successfully automating a task is more work than the first 97%. Writing complete and functional software is well within the final few % of language models. Unfortunately, design is not.
By Asura.Saevel 2024-10-15 12:03:25
 
Evangelist still trying to sell snake oil? Damn AI bros just don't give up.
By K123 2024-10-15 13:04:14
 
You're limiting everything down to Python. If AI were to write a new language, it wouldn't need to be Python. I was talking theoretically. This is separate from the point about the OP and Python.

How have you concluded that I've been negatively impacted by AI?

Other than getting me funding for a PhD and even more esteem in my field, AI hasn't impacted me negatively yet. It probably will in the future, but I'm also at the forefront of it, so...
By K123 2024-10-15 13:06:43
 
Also, I agree about Moore's law and x86 being over, but it was never actually a law, and we will move away from silicon, which changes that.
By Afania 2024-10-15 13:12:07
 
K123 said: »
You're limiting everything down to Python. If AI were to write a new language, it wouldn't need to be Python. I was talking theoretically. This is separate from the point about the OP and Python.

How have you concluded that I've been negatively impacted by AI?

Other than getting me funding for a PhD and even more esteem in my field, AI hasn't impacted me negatively yet. It probably will in the future, but I'm also at the forefront of it, so...


Just curious: if AI hasn't impacted you irl, why do you feel the need to post about its importance over and over?

You haven't experienced such a thing yet. Plenty of people in this thread haven't experienced it either. Plenty of people here, including myself, have tried AI in the past or studied how it works, and came to the conclusion that AI replacing our jobs isn't happening soon, based on our real experience using AI and learning how it works.

Why do you keep saying the opposite, even though your life isn't impacted by it either?

To me it feels like your conclusions are based on theory, not real experience. Which is why plenty of people disagree with them: your theory and real-life experience don't match.
By K123 2024-10-15 13:17:22
 
Afania said: »
Just curious: if AI hasn't impacted you irl, why do you feel the need to post about its importance over and over?
[...]
To me it feels like your conclusions are based on theory, not real experience. Which is why plenty of people disagree with them: your theory and real-life experience don't match.
I heavily suggest you try rereading my post.

AI has impacted me, massively beneficially thus far.
By Afania 2024-10-15 13:23:05
 
K123 said: »
I heavily suggest you try rereading my post.

AI has impacted me, massively beneficially thus far.


I did. You said AI got you funding, but it didn't take your job. So how did you reach the conclusion you posted, when it doesn't match the real-life experience you described?
By K123 2024-10-15 13:23:10
 
Also, I don't recall using the word "replace". I said I believe it will completely gut out the bottom end of most knowledge work, and then the middle, leaving only the top.

My point was, quite clearly and specifically, that I don't believe the OP can learn faster than AI will gain proficiency. We can assume he's over 35. His brain will almost certainly lack the plasticity he had in his early 20s, which is when most people gain their expertise at the latest.
By Afania 2024-10-15 13:43:28
 
K123 said: »
Also, I don't recall using the word "replace". I said I believe it will completely gut out the bottom end of most knowledge work, and then the middle, leaving only the top.

My point was, quite clearly and specifically, that I don't believe the OP can learn faster than AI will gain proficiency. We can assume he's over 35. His brain will almost certainly lack the plasticity he had in his early 20s, which is when most people gain their expertise at the latest.


That's a lot of speculation here....


You don't know the OP personally. And you don't know when AI will "completely gut out the bottom end of most knowledge work, and then the middle, leaving only the top". Nobody can say that for sure.

Which means your conclusion and advice are based on speculation, without solid evidence.

I am not saying your advice has no value at all, but don't you think you pushed it a little too hard?
By K123 2024-10-15 13:51:44
 
No, I don't think I'm being too bullish. Like I said two posts ago: let's talk reality in 2 years instead of opining for pages.

The gutting of the bottom end is already happening. Look at Klarna: cutting employees from 5,000 to 1,500, explicitly because of AI.
By K123 2024-10-15 13:58:24
 
https://www.google.com/amp/s/www.bbc.co.uk/news/articles/c80e1gp9m9zo.amp

That article says they're aiming for 2,000, but I've seen other articles saying they're aiming for 1,500.
By Afania 2024-10-15 15:16:52
 
K123 said: »
https://www.google.com/amp/s/www.bbc.co.uk/news/articles/c80e1gp9m9zo.amp

That article says they're aiming for 2,000, but I've seen other articles saying they're aiming for 1,500.

Yeah, an article. So I wasn't wrong: your conclusion didn't come from real-life experience, but from articles written by journalists.

I'll share a real experience. I am over 35. I don't have a STEM degree, and I failed high school math. So my background is far from a good starting position for learning programming fast.

Some time ago I needed to build and hook up a system for a piece of software. AI was zero help here; like others said, the code quality was terrible.

So I took an online programming lesson for a month, then spent a few months copying code from Google and asking people questions when errors happened, and got all the core features built. The job is done, happy ending.

Meanwhile, AI still can't get the same job done atm. And I'm glad that I didn't sit around waiting for AI technology to mature to solve an immediate problem.

Not only is the job done, but I feel my ability to understand "systems" and implement features in software improved greatly, which helped my planning and management skills.

From this experience, I drew the conclusion that Saev's advice here is 100% correct.

Asura.Saevel said: »
As to the whole question on programming, everyone in IT should learn some level of basic programming. This is because the concepts that you learn can be applied to literally everything in the entire industry.

I have zero interest in becoming a programmer as a career, nor do I have talent in STEM. And even then, I find great value in learning programming, in a way that AI can't replace.


And this is all from real-life experience, not some random internet article saying XYZ skill is useless because of AI.

So I find your conclusion on the value of learning programming (even as a non-programmer) a little bit harsh. It doesn't align with real-life experience at all. I am more inclined to agree with what Saev said: even if you are over 35 in the AI era, the knowledge still has value. It depends on what you do.
By K123 2024-10-15 15:48:48
 
The article quotes the company... how did you deduce that it was the opinion of a journalist? This is a major international company openly proving it is already happening.

Meanwhile, you use anecdotal experience to try to prove that the factual reality I've evidenced is somehow untrue?

Here's my anecdotal experience: I'm not a programmer, and I don't care to learn on top of the hundreds of other skills I already have expertise in (including the use of generative AI in design), skills that students literally travel from all around the world to learn. I have used AI (personally, I always start with ChatGPT, then Claude) to write basic plugins for Blender (Python), make geometry directly and set up geometry nodes in Blender, make geometry directly in Rhino, and write Grasshopper scripts for Rhino. Where ChatGPT fails, Claude usually fixes it. Sometimes you can generate once in ChatGPT and it fails, try again and it fails (from scratch, not even asking it to check that code), and a third time it works. Yes, this is more efficient for my use case than bothering to learn to code. No, this isn't as complex as designing and writing a program from scratch, but yes I believe that will be possible. Even if it's only 97% done and requires tweaking, yes, that will still cull millions of basic *** programmers.
By Afania 2024-10-15 16:22:29
 
K123 said: »
The article quotes the company... how did you deduce that it was the opinion of a journalist?


...as someone who continues to educate me on reading, you are the one who can't read here.

I said the "article is written by a journalist", not "the article is an opinion of a journalist". My sentence is 100% correct.

The article even has the journalist's name on it: Tom Gerken, technology reporter, who summarized what the company's representative said and wrote an article about it.

It's fine to read it wrong, just please don't lecture other people about not reading if you can't do it yourself.


Quote:
but yes I believe that will be possible.

We are talking in circles and repeating the same *** here. No one here is trying to say that full replacement of jobs will never happen. We just don't know when, so until that happens the knowledge still has value. It depends on what you do.
By Afania 2024-10-15 16:34:49
 
K123 said: »
Meanwhile, you use anecdotal experience to try to prove that the factual reality I've evidenced is somehow untrue?


I didn't say what you said is "untrue"; I only said everyone has different circumstances. No single piece of advice can apply to everyone in the world.

Maybe someone doesn't need to learn programming; maybe someone else benefits greatly from it. It depends.

Your advice sounds like there is only one possibility, leaving no room for any others.


Quote:
My point was, quite clearly and specifically, that I don't believe the OP can learn faster than AI will gain proficiency. We can assume he's over 35. His brain will almost certainly lack the plasticity he had in his early 20s, which is when most people gain their expertise at the latest.



How do you know that the OP can't learn past 35? How do you know they would gain zero benefit from this knowledge? My real-life experience pretty much proved all of that wrong. So how do you know that only your experience applies to the OP's situation, and not mine?


If you had said something more neutral, like "whether learning programming now is useful or not depends on your age and professional needs", it would have been way less controversial than what you said above.
By Bahamut.Senaki 2024-10-15 17:53:23
 
Honestly, this has kinda gotten to the point of unproductivity. I'd be OK with Rooks locking or purging this thread.
By Bahamut.Senaki 2024-10-15 17:54:14
 
@ those that provided constructive advice.

Thank you, I appreciate it profoundly.
By K123 2024-10-15 18:18:13
 
Nothing is productive when Afania starts posting. I'm not wasting time responding to those.

Good luck with whatever you decide to do.
By Lili 2024-10-15 18:20:47
 
Bahamut.Senaki said: »
Honestly, this has kinda gotten to the point of unproductivity.

I agree. Dude made me like a post by Afania. Truly outrageous.

(kidding, when someone's right they're right)

K123 said: »
[This content was removed by CutGPT]

The more you talk, the more you come across as someone who has never done the barest minimum of development work in an actual real-life work environment, and thus has no idea what programmers actually do.

There are about a dozen different "flavors" of development work[1]: the workflow of someone who produces mobile shovelware[2] looks extremely different from that of, dunno, someone who writes and maintains diagnostic tools for the central and peripheral processors of a manufacturing line, which is in itself unfathomably different from the work done by the person(s) who write and maintain the software that drives the manufacturing line, not to mention the people who write the software that makes the two systems communicate. And there's a universe besides these four examples. Game dev, anyone?

At this point I am deciding that your convictions are merely dogmatic, which is typical of people in certain academic fields who become convinced that they can derive an entire system's mechanisms from a few basic first principles, forgetting that experience is of extreme importance: it gives insight, and insight lets one see where the system diverges from the necessarily oversimplified first principles. But you think you know better, so feel free to happily go back to your PhD and have fun smashing your face on the ineluctable wall of Reality Does Not Give A Crap About Your Strongly Opinionated Statements.

[1] or more, but I stopped counting after 12 in my head as I ran out of allocated memory for this reply.
[2] which I could actually see being replaced by AI within a year or so, if it hasn't been already
[3] there's no three.
By K123 2024-10-15 18:25:35
 
As salty and overly confident as artists were 2 years ago.

Also, this is entirely irrelevant to the OP, as basically all but Thorny's posts have been. We're talking about someone who said they have a "basket weaving degree".
By K123 2024-10-15 18:33:50
 
For those who think AI is a bubble and a gamble: don't you think Google and Microsoft would be pretty confident before spending $5bn+ each to fund nuclear power plants to produce electricity for training?

Also, on the topic of processing, which Thorny raised earlier: GPU compute already gets us past Moore's law in a way. My understanding is that Nvidia is building an ARM SoC with unified memory and a GPU with CUDA cores.