Is this the end of programming?

11:14

Now don't get me wrong, there are still a lot of issues with AI coding. I obviously think coders are necessary right now, and I actually think coders will be necessary in the future as well. We do run into a lot of issues when it comes to writing code with AI. As somebody who is not a very proficient coder myself, I run into a lot of issues trying to generate code using AI.

For instance, when I try to have something like ChatGPT write code for me, it's usually pretty buggy on the first try. I'll tell ChatGPT what the bug is, ChatGPT will try to fix that bug, and oftentimes it breaks something else in the process. And as you continue to code more and more, a lot of these tools sort of lose the memory of the things you talked about earlier in your chat. So if you're going back and forth trying to code something up, some of that early conversation about what you wanted the code to do might actually get lost, and ChatGPT or whatever coding assistant you're using will start to accidentally remove features that it originally put in.

As of right now, it's still not great. You also have the issue of context windows. Most of these chatbots have context windows that aren't suitable for very large chunks of code. You can upload a large document with code in it, but if it's beyond the context window of the chatbot, it's not going to be able to read a large chunk of that code.
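To make the context-window problem concrete, here's a minimal sketch of checking whether a source file is likely to fit. It uses the common rough rule of thumb of about 4 characters per token; the 128,000-token window and the 25% reserve for the prompt and reply are illustrative assumptions, not the spec of any particular chatbot.

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text and code."""
    return max(1, len(text) // 4)

def fits_in_context(text: str, context_window: int = 128_000) -> bool:
    """True if the estimated token count leaves room for instructions and a reply."""
    # Reserve ~25% of the window for the prompt and the model's answer (assumption).
    budget = int(context_window * 0.75)
    return estimate_tokens(text) <= budget

# A small file fits easily; a million-character codebase likely won't.
small = "def add(a, b):\n    return a + b\n"
print(fits_in_context(small))            # small snippet
print(fits_in_context("x" * 1_000_000))  # huge blob of code
```

Real tokenizers vary by model, so for anything serious you'd count tokens with the model's own tokenizer rather than this heuristic.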

You also have the issue of stuff getting lost in the middle of the code. A lot of times, when you feed these chatbots long documents, they're really good at reading what's at the beginning and the end of the document, but stuff in the middle tends to get lost for some reason, which isn't great for code, because you need it to use all of that code for context.

12:57

However, things like that are getting better with models like Gemini 1.5 coming out, which is capable of working with 1,000,000 tokens, or roughly 750,000 words, making it more and more likely that you're going to be able to plug in huge chunks of code and have it read it all. They also ran what's called a needle-in-a-haystack test on Gemini, where they gave it a huge amount of text, hid a little sentence somewhere in the middle, and then asked it a question about that sentence to see if it could find it in that massive amount of text. It performed at 99%: it was able to find the embedded text 99% of the time, which means that issue of things getting lost in the middle is going to be a thing of the past pretty soon.
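The shape of a needle-in-a-haystack test is simple enough to sketch. In the code below, `ask_model` is a hypothetical stand-in for whatever LLM API you'd call; the `toy_model` here just string-searches the prompt so the sketch is runnable on its own, but the structure of the test is the same: bury one sentence in filler, then check whether the model can answer a question about it.

```python
import random

def build_haystack(needle: str, filler_sentence: str, n_sentences: int, seed: int = 0) -> str:
    """Hide `needle` at a random position inside a long run of filler sentences."""
    random.seed(seed)
    sentences = [filler_sentence] * n_sentences
    position = random.randrange(len(sentences))
    sentences.insert(position, needle)
    return " ".join(sentences)

def run_trial(ask_model, needle: str, question: str, expected: str) -> bool:
    """One trial: build a haystack, ask the model, check for the expected answer."""
    haystack = build_haystack(needle, "The sky was grey that day.", 500)
    answer = ask_model(f"{haystack}\n\nQuestion: {question}")
    return expected.lower() in answer.lower()

# Toy "model" that just searches the prompt, so this file runs without any API.
def toy_model(prompt: str) -> str:
    return "blue" if "the secret word is blue" in prompt.lower() else "unknown"

print(run_trial(toy_model, "The secret word is blue.", "What is the secret word?", "blue"))
```

A real benchmark repeats this over many needle positions and context lengths and reports the retrieval rate, which is where headline figures like "99%" come from.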

13:41

Where do we stand today? AI is a great coding assistant. It's going to help you write a lot better code, it's going to help you debug your code, and it's going to help write a lot of the monotonous code that's been written over and over again and can be found on places like GitHub and Stack Overflow. But it's not really great at creating a huge piece of software from scratch for you yet. I do think it's going to get there, though, and a lot sooner than most people realize.

Now, does that mean I think nobody should learn coding ever? Absolutely not! I have a 9-year-old son. He's actually in coding classes right now, and I've been pushing him to continue to learn to code, and he really loves it. In a similar way, if you love art and you love painting or drawing, you shouldn't give up painting or drawing just because AI art can do it as well. I think a lot of people learn to code because it's enjoyable to understand what's going on underneath and to feel like they built something using their own brain as opposed to an AI's. I think we're going to get to a point where AI is really damn good at coding and will be able to code the majority of a piece of software for you, but I still think there will probably need to be humans in the loop to help debug code. You know, if you're a game developer, you still need humans to create a fun game loop. If you want a good user interface and user experience, I still think humans are going to be the best at determining what those look like.

My view of the future of coding is this: yes, I do believe AI is going to do the majority of the coding work, but I think humans will still need to have the ideas for what to code. I think humans are still going to need to help problem-solve and guide the AI to fix issues with the code. I think humans are still going to steer the UI and the user experience, and I think there's going to be almost a craft element to it as well. I think people are going to value things that humans coded over things that AI coded, in the same way that seems to be playing out in AI art: people are more impressed by an amazing piece of work created by a human than by one created by AI. I think coding will be the same.

15:53

When Jensen Huang made his comment about how he thinks people should stop learning to code, John Carmack, the co-founder of id Software, a programmer himself, and one of the creators of the Oculus, had this to say about it. I think he nails it; it's a spot-on representation of where coding is heading.

16:11

He says:

“Coding” was never the source of value, and people should never get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won't be a barrier to entry anymore.

Many times over the years I have thought about a great programmer I knew that loved assembly language to the point of not wanting to move to C. I have to fight some similar feelings of my own around using existing massive codebases and ineffective languages, but I pushed through.

I had somewhat resigned myself to the fact that I might be missing out on the “final abstraction”, where you realize that managing people is more powerful than any personal tool. I just don't like it, and I can live with the limitations that puts on me.

I suspect that I will enjoy managing AIs more, even if they wind up being better programmers than I am.

16:59

So, when he's referring to that "final abstraction", he's talking about what that next layer on top of code is, because over history, we've always added new layers to make things easier and easier.

And all the people who are super resistant to letting AI help them with code, or letting AI write that code for them, are essentially saying: yes, there's a new layer of abstraction that makes it easier, but I'm going to ignore that and stay at the layer before it. And those are my thoughts on AI coding; that's where coding is going. I think it's the next layer of abstraction for coding, and I think it's always kind of been headed in that direction.

If it wasn't AI that added that next layer of abstraction to make it easier for humans to write code, something else would have been that next layer, and people would have been just as frustrated, upset, and resistant to whatever that was. But again, I'm not a coder myself; this is just the perspective of somebody who pays very close attention to AI and has some fundamental knowledge of how computers, operating systems, and software work. I just feel like this is the direction things were headed.

18:09

What do you think? Let me know in the comments. I know this is a very hotly debated topic; there's a lot of negativity and a lot of fear, but I do want to know your opinion: where do you think coding is heading, and on what timeline?

I actually believe we're going to get to the point where AI is really proficient at writing good code within the next 18 months, possibly even sooner.

That's where I stand. What are your thoughts? Thanks for tuning in! I'll see you in the next video.

