HACKER Q&A
📣 fedeb95

Am I the only one not caring about GitHub Copilot?


I see often links about it here but honestly I don't see why I should care.


  👤 armchairhacker Accepted Answer ✓
If I wrote a post for every major discussion or controversy I didn't care about, that would be a lot of posts.

👤 d23
Do you only care about things that directly affect you negatively and personally? There are a lot of things that don’t affect me directly personally that I still care about.

I don’t write open source public software with a restricted license, but I understand why some people do. I respect that, and I probably use a lot of the software without realizing it. I can understand why they’d be bothered by theft of their work, and I could see downstream effects such as them stopping their work that would eventually affect me.


👤 yuvadam
I share your apathy.

The legal and ethical aspects are fascinating, but I have no interest in that debate.

I care much more about my job security as a software engineer and systems architect, and on that front Copilot does nothing to challenge that.


👤 thesketh
No, you are not the only one. There are unfortunately plenty of people who don't care about massive corporations asserting ownership over the work of other people whilst jealously guarding their own IP, and even some apologists for them.

👤 ferrocarraiges
Agreed, it's in the same vein as self-driving cars.

Sure, it could put a lot of programmers out of a job in theory.

In practice, the emergency brake alert goes off for cars waiting to turn in the oncoming lane, the blind spot warnings trigger on cars that are ahead of you, the lane departure warnings are usually wrong, and so on.

I'm not worried, and I prefer to avoid the uncertainty even if it means outsourcing a bit less effort to the magic AI pixies.


👤 lonlazarus
I am ignoring it for the time being, and maybe I am misunderstanding its usefulness. I'm having a hard enough time with less experienced engineers generating large blocks of code and pushing it on me to play find-the-errors; the idea of an AI doing it as part of my workflow doesn't sound very appealing.

👤 lexandstuff
I trialled it. I found that most of the code it wrote was wrong, so I didn't bother paying for a subscription. It was like pairing with a super-enthusiastic developer who is eager to write code at every opportunity but doesn't understand what we're doing.

👤 fswd
My problem is that the Copilot (actually Codex) training data and model aren't available. If it was derived from open-source code, that derived work must be released.

I'm bewildered why nobody else has brought this up.


👤 cjoelrun
Why should I care if you care? If you value your time you'll maybe be interested in having it write unit tests for you (it worked for me™). If you value protecting intellectual property you might care that it's sucking up a lot of it and making it available to people who do care. I think it's a losing game because there are enough people willing to give it their intellectual property for free that by the time you get your stuff removed it's already found something similar somewhere else (Drainage my boy!)

👤 sergiotapia
I care, very much. I will no longer host any new personal projects on GitHub, and I will not use GitHub at companies I lead. I'm looking for alternatives, the major one at the moment being GitLab.

👤 bcrosby95
I care enough to add a comment to your question, but I don't care enough to post an Ask HN about it.

👤 JackC
For me the most interesting thing about using large language models is they offer a kind of conversation with the average of the human data they were trained on. They surprise me by telling me when I'm doing something boring.

When Copilot guesses the next method name or comment I was going to type, it's doing that by saying "this would be the most boring, average string of tokens to come next, so here you go," and it's fascinating how often that's right -- how often I'm wrong about how surprising the next line was. It's like how terrible humans are at generating unique passwords, except for everything I type. Copilot doesn't help by knowing things I don't, because it doesn't know anything, but it does help by guessing what I was obviously going to do next without me having to call out to memory.
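
Mechanically, that's just greedy next-token prediction. Here's a toy sketch of the idea in Python, with GPT-2 standing in for whatever model Copilot actually runs (which isn't public):

    # Pick the "most boring, average" next token for a prompt.
    # GPT-2 is only a stand-in; Copilot's real model and prompting aren't public.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    prompt = "def add_numbers(a, b):\n    "
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits  # a score for every possible next token

    # Greedy choice: the single most probable continuation, i.e. the least surprising one.
    next_token_id = logits[0, -1].argmax().item()
    print(tokenizer.decode([next_token_id]))

Run that on a familiar prompt and the completion is rarely surprising, which is exactly the point.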

Once I've had access to that average-of-humanity information for a while, I start to want it for the rest of my life too. OK, fine, that's the next method name I was going to write. [tab, autocomplete]. OK, fine, that's how I was going to close out my email. [tab, autocomplete]. Well, huh, I wonder if it knew what I was going to type next on the command line? [yes, probably]. I wonder if it knew which things I was going to buy in the grocery store? [yes, probably]. It starts to feel limiting to not have access to what the average next step in the sequence would be.

And then it turns out that average-of-humanity models have all kinds of potential impacts on political power and labor and property law and so on, so all of that is pretty interesting too. But for me it starts with just poking at the model and going, oh, hey, it's ... everyone, how are you all doing?


👤 _dwt
I find the legal and ethical implications more interesting than the technology as it stands. Maybe I just don't do the kind of work it's suited to, but I didn't find it terribly useful when I gave it a go. I had an experience that went "oh wow, that turned out a lot of code fast and it looks pretty good" followed by "oh, it made the exact subtle mistakes everyone makes when they write this kind of thing". No surprise given the nature of ML.

On the other hand, I find myself torn between "information wants to be free and this is one more nail in the coffin of our odd and ahistorical concepts of 'intellectual property' and 'plagiarism'" and "oh great, another way for giant corporations to reap all the benefits from work done by individuals and smaller businesses".

I don't think I'll ever use the thing, and I have ~0 power over the societal implications, so overall - yeah, can't get that exercised over it either.


👤 kbelder
I have no desire to use it, and no objection to being used by it.

👤 bugfix-66
Systems like Copilot and Dall-E and so on turn their training data into anonymous common property. Your work becomes my work. This may appeal to naive people (students, hippies, etc.), for whom socialist/communist ideas are attractive, but it's poison in the real world.

These systems are a mechanism that can regurgitate (digest, remix, emit) without attribution all of the world's open code and all of the world's art.

With these systems, you're giving everyone the ability to plagiarize everything, effortlessly and unknowingly. No skill, no effort, no time required. No awareness of the sources of the derivative work.

My work is now your work. Everyone and his 10-year-old brother can "write" my code (and derivatives), without ever knowing I wrote it, without ever knowing I existed. Everyone can use my hard work, regurgitated anonymously, stripped of all credit, stripped of all attribution, stripped of all identity and ancestry and citation.

It's a new kind of use not known (or imagined?) when the copyright laws were written.

Training must be opt in, not opt out.

Every artist, every creative individual, must EXPLICITLY OPT IN to having their hard work regurgitated anonymously by Copilot or Dall-E or whatever.

If you want to donate your code or your painting or your music so it can easily be "written" or "painted", in whole or in part, by everyone else, without attribution, then go ahead and opt in.

But if an author or artist does not EXPLICITLY OPT IN, you can't use their creative work to train these systems.

All these code/art washing systems, that absorb and mix and regurgitate the hard work of creative people must be strictly opt in.

If you don't care about this, it's naivete, or a lack of foresight, or apathy as these companies pillage the commons. Not something to be proud of.

Microsoft and OpenAI (and others) are robbing us and you should care.


👤 _fat_santa
At least from my perspective as a web developer, I just never found it all that useful. When I finally got an invite to it a few months ago, I used it for a bit but it always just felt like more of a toy than a real tool. If you're someone that's learning to code then Copilot looks like this fantastical tool that can write all your code for you, and that's why I think it gets so much attention. But from my perspective, Copilot just completely breaks down inside of a large established codebase. It's great if you're just standing up an app for the first time but once you lay down standards, common services and components, etc, Copilot just starts getting in your way.

I was excited when it first came out but I'm just over it now.


👤 mrtranscendence
I tried it out on the kind of work I do (Spark data pipelines, machine learning), and found it thoroughly useless. Getting it to generate code that was remotely close to what I wanted was pointlessly difficult. I tried throwing it at a parsing function in a date library I maintain and it spit out nonsense that, I suppose, looked like it was doing something reasonable (but wasn't, which is worse).

Maybe it's good for some things, but I wasn't impressed. My job's safe for a little while at least.


👤 incomingpain
I gave it a try in PyCharm about a year ago. It literally did not once give me good suggestions. It gave lots of suggestions, to be sure, but they were all worthless.

The way it felt to me, it samples all kinds of other people's code (probably from GitHub; who owns the copyrights here?) and pastes their code. Except, what's the quality of that code? I'm by no means a top developer, but the recommendations were always trash and not what I wanted.


👤 bilsbie
I wouldn’t mind an AI that does regular expressions for me.

👤 jemmyw
I think if you write code for a living then you should care and see what it does. Some folks might find it gives them a productivity boost. Some might have serious misgivings about the output quality / IP / security aspects. It might do nothing for you, as it does nothing for me, and you move on. But I guess I think it's something one should keep an eye on.

👤 zitterbewegung
Yeah, it's basically concerning if you have code on GitHub or you want to play with Copilot. I'd like to hear from anyone who uses Copilot in a business setting, because to me it seems like a nightmare to use given the probability of it being a liability.

👤 ChrisMarshallNY
I don't have any opinion on the matter.

I don't actually see myself using it, and all my source is out there (for the most part, anyway), as MIT-licensed.


👤 gardenhedge
Haven't looked into it at all. Can it be used as a VS Code plugin? I guess I couldn't use it at work. Is it even available to everyone?

👤 kkfx
You are not the only one. I do not care about Copilot (or GitHub services generally, even if they're good in many respects as a whole)...

👤 bergenty
One of my directs does basic application development. His productivity has at least 20x'd. Keep missing out.

👤 cyanydeez
Caring in the "they're profiting off my work" sense, or the "this seems like a useful tool" sense?

👤 Supermancho
It's common for young people to not care what Microsoft does, when they haven't experienced a different world. Apple doesn't look much different from MS of today, so it seems perfectly reasonable to see software slowly weaponized against developers. Microsoft has viciously abused their marketshare in the past and it's likely they will do so again. How does that affect you? Doesn't seem to right now (unless you're unlucky, right?). Check back in 20 years.

👤 jasebell
Not looked, can't see me looking at this rate.

👤 yrgulation
I have no interest in it either. I am fascinated, though, by how software developers have played themselves again. No wonder management and corporations treat most of them like children.

👤 CelticBard
Yep