How useful do you think AI tools are/will be?

AI has been a hot topic here on Devtalk recently, so along that theme: How useful do you think AI dev tools are right now and how useful do you think they might become?

There are multiple polls below, please vote in each one :icon_biggrin:

How useful do you think AI tools are to developers right now?

  • Not useful
  • Somewhat useful
  • Moderately useful
  • Very useful
  • So useful that I consider them a necessity
0 voters

How useful do you think AI tools will be 5 years from now?

  • Not useful
  • Somewhat useful
  • Moderately useful
  • Very useful
  • So useful that they would be considered a necessity
0 voters

How useful do you think AI tools will be 10 years from now?

  • Not useful
  • Somewhat useful
  • Moderately useful
  • Very useful
  • So useful that they would be considered a necessity
0 voters

How useful do you think AI tools will be 15 years from now?

  • Not useful
  • Somewhat useful
  • Moderately useful
  • Very useful
  • So useful that they would be considered a necessity
0 voters

How useful do you think AI tools will be 20+ years from now?

  • Not useful
  • Somewhat useful
  • Moderately useful
  • Very useful
  • So useful that they would be considered a necessity
0 voters

10-20 years out is a long, long guess. It's a small pool of results so far, but I was a bit surprised that, despite being pretty skeptical about the claims, I tended to rate these tools as more useful in the near term and less useful in the long term. I have deep concerns about the ability of these things to learn or keep up with improvements: the current level of usefulness is already producing a deluge of slop that will become the learning pool for future training of statistical models. The long term feels to me like it needs another paradigm breakthrough on the order of the current LLM hype just to keep these things as useful as they were before people started using them.

2 Likes

I think you’re the first person I’ve heard say that AI tools will be less useful in the future, Toby… what kind of paradigm shift or breakthrough do you think we’ll see?

1 Like

Well, I’m not saying anything for sure about the future. What I am saying is that the breakthrough that just happened was a matter of consuming all the content on earth to train on (with questionable respect for who owns said content). So now we have statistical models that output the average of what was available (or, in other words, the bottom 20% in terms of quality, if you’re lucky). Then everyone uses said tools to massively increase the quantity of content at the bottom end of quality. So we are dramatically shrinking the statistical impact of new, valuable information to train on relative to the junk being emitted into the digital world… and we have already consumed what it took to get here.

Let’s just say we keep going and it does get better. Then say tomorrow there is a new security issue, something deeply fundamental, an “everything we do is a SQL injection attack” level crisis on every request. Who is going to get the fix to a statistically significant data point that actually influences the training? Not anyone using AI tools, right? (On this point, Copilot has flagged several hundred PRs I’ve created in work repos because it doesn’t know about new features in languages or frameworks, and it has only been a reviewer on those repos since early this year. To be fair, it has also made a few reasonable suggestions, but those don’t relate to this specific point about it stating random, outdated nonsense that is objectively false.)

Further, how does any new technology or approach become part of the solution? We end up in a world where everyone uses a tool based on the current situation to create the next situation, and how that tool sees the world dominates what it can learn from, to the exclusion of other inputs, unless someone intervenes.

Which brings me to the next point: what happens when these companies no longer want to lose millions per day setting fire to rainforests to power this hugely inefficient monster of mediocrity? The obvious move is to sell training bias, influencing what people use as a form of advertising.

On the other hand, let’s pretend it doesn’t play out like this, and instead it is some exponential curve of god-like power that takes over all need to do any work. Who is going to buy this? Who are their customers? When you create a situation where some 60-70% of the workforce becomes jobless in a five-year window, as the hype sells it, who are you selling your product to? Best case is the collapse of capitalism, but it could approach an extinction-level event if it played out the way Hacker News and Twitter seem to sell it to CEOs.

Either way, I see the current bump as a short-term thing. Either it balances out or we are all living Mad Max style in a few years. No one really wins, because neither the idea nor the advantage is sustainable. (But I’m not saying anything about the future; every day it seems like some significant section of humanity is doing something more stupid than I thought possible… maybe for the second time.)

2 Likes

I really haven’t used any coding assistant, like Cursor, in my work, although I sometimes prompt ChatGPT to ask for some help. :slight_smile:

1 Like

Oh, you should play around with them. Copilot, Cursor, the new JetBrains stuff, etc. are mostly similar: different UX layered over interfaces to similar models (e.g. Claude Sonnet 3.7). Sometimes the workflow, or the context being injected into the model, improves the output, but largely these give you a sense of what is going on across autocomplete, chat, and agent features, each with a different experience around them. Copilot is more on the side of “integrate with your workflow and do small chunks of things”, Cursor is more “vibe-rewrite your project, YOLO”, and other tools have different ideas about how to interact with what you want to do.

It is a powerful and useful tool; I’m just not convinced that vibe coding the future will result in a better iteration, given that in some respects they are already regressing. But seriously, you can try this stuff out for free, and you should get an idea of the situation for yourself. It is truly amazing for what it is if you set aside the insanely overblown hype, and if the hype is less overblown than I’m suggesting, at least you will understand your situation better.

It will help you do a lot of basic things faster: generate sample data for test scenarios, inline parameters into queries from logs, transform data formats, spot differences in large outputs semi-semantically in ways that would be lost in a diff tool, autocomplete patterns in your code (such as completing a set of similar cases in a switch statement), and offer suggestions on approaches when you ask it questions. It is a good tool, and it can help you do everyday things much faster right now. I just don’t see how it gets where they say it is going if people go further into it; I don’t even see how it stays as good at these types of things as it is now if its learning pool is heavily polluted with AI results. It is still a tool you can make use of at the moment, and you should at least get up to speed with it to get a sense of how it might shape future opportunities.

1 Like

I think they are very useful now, and will be almost a necessity in five years. After that, I didn’t vote. I don’t know if or when it might happen, but it is possible that AI will become more than just a tool. It may become a personal assistant. I’m not saying it will become AGI at that point, but a self-enhancing AI agent may become harder to distinguish from that.

1 Like

Thanks. I am probably an old head in the sense that I still want to do most of my coding and thinking, and not just depend on an AI assistant to do most of my work.

2 Likes

10-20 years is a long run, but I think in the near future AI will be used across many spheres and industries, as well as being a helpful tool for devs.

1 Like