partycoder 6 years ago

A safer way in my opinion is to:

0. read the module's license

1. check out the module to be patched

2. modify it

3. do "npm pack" to save a tarball

4. set the dependency version in your package.json to the tarball, and make sure the tarball is available at build time (see the sketch just below this list)

5. report the issue in their issue tracker for a long-term fix.
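
A rough sketch of steps 3 and 4 (the module name and the vendor/ directory are made up):

    # inside your local checkout of the patched module
    npm pack                        # produces e.g. some-module-1.2.3.tgz
    mkdir -p ../my-app/vendor
    mv some-module-1.2.3.tgz ../my-app/vendor/

and then in your project's package.json:

    "dependencies": {
      "some-module": "file:./vendor/some-module-1.2.3.tgz"
    }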

In contrast:

With this "quick fix", the dependency can change, or its transitive dependencies can change, and the quickfix patch would no longer apply.

The only way to guarantee that it won't break is to save the module and all of its dependencies, i.e. something equivalent to npm pack.

Finally... don't get yourself into a licensing problem. Take a good look at the license before modifying a project.

  • charlesetc 6 years ago

    Does npm even allow you to use projects that don't allow you to make changes to their code?

    • drostie 6 years ago

      No, because how the hell would they know?

      I author a module for my employer, describe it with their developers@ email, and mark it as UNLICENSED; how would NPM discover whether I am legally authorized to use and modify this package?

    • jogjayr 6 years ago

      As in it checks the license in the source repo and prevents installation of dependencies with licenses that prohibit source modification?

andreineculau 6 years ago

Leaving aside the comedic (albeit problem-solving) factor, if you want a sane alternative, check out pnpm. It will not only make installing dependencies faster, it will also let you override a dependency at any level of nesting, e.g. a dependency of a dependency.

A dependency published a patch version with a breaking change and the author is on vacation? No problem, you can lock it within your project to the previous version.

Did you fork a dependency and want to use your version? You can do that too, anywhere in your dependency chain. You can even change the version specifier to a git URL, so there is no need to publish your fork to the npm registry.

I wholeheartedly recommend that anyone have a look at pnpm. As a Unix dev, I find a lot in there that is well thought out and kept simple, yet still this fast.

https://github.com/pnpm/pnpm/blob/master/README.md#hooks
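
For example, pnpm's readPackage hook lets you rewrite any dependency's package.json before installation. A minimal sketch (the hook file has been called pnpmfile.js or .pnpmfile.cjs depending on the pnpm version; the package names and versions here are made up):

    // pnpmfile.js in the project root
    module.exports = {
      hooks: {
        readPackage(pkg) {
          // pin a broken transitive dependency back to the last good release
          if (pkg.dependencies && pkg.dependencies['some-broken-dep']) {
            pkg.dependencies['some-broken-dep'] = '1.2.3';
          }
          // or point it at your fork: 'github:yourname/some-broken-dep#fixed'
          return pkg;
        }
      }
    };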

  • exogen 6 years ago

    That’s an interesting approach to it (modifying how the package.json is processed).

    If anyone is wondering if their current package manager supports this, it does if you are using Yarn. Yarn has a `resolutions` field to override subdeps (either globally or in a targeted way): https://yarnpkg.com/lang/en/docs/selective-version-resolutio...

    It’s not as low-level and flexible as pnpm’s approach, but is perfect for the use case described here.
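
    For reference, a resolutions entry in package.json looks something like this (the package name is made up); the "**/" prefix forces that version for every package that depends on it:

        "resolutions": {
          "**/some-broken-dep": "1.2.3"
        }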

    • andreineculau 6 years ago

      Correct. The pnpm approach is more versatile: you can do anything to a dependency's package.json. I recently used it to debug and then monkey-patch "scripts" (property) invocations without even forking the dependency.

fiatjaf 6 years ago

Oh, that's terrible, awful, the worst idea ever! You're promoting bad practices, the quality of code will only get worse from now on! Someone needs to delete this thread! You must have quality guidelines, code review, follow best practices! Don't you dare modify the contents of node_modules!

[secretly uses quickfix when no one is looking]

  • maxchehab 6 years ago

    As the creator, I shame you :)

Benjamin_Dobell 6 years ago

One of the really nice things about NPM (and other modern source-level dependency managers) is that it's trivial to fork dependencies and reference commits in remote Git repositories. This workflow makes it really straightforward to patch third-party dependencies. Combine it with a lock file for reproducible builds and you're good to go with minimal fuss.
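
For anyone who hasn't done it before, pointing a dependency at a fork's commit is a one-line change in package.json (the fork name and commit hash here are invented):

    "dependencies": {
      "some-dep": "github:yourname/some-dep#1a2b3c4"
    }

npm records the resolved commit in the lock file, so builds stay reproducible.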

  • Sujan 6 years ago

    How (well) does this work in a dependency of a dependency of a dependency?

    • Benjamin_Dobell 6 years ago

      You have three options:

      1. Fork every dependency in the chain, which typically isn't more than two or three dependencies.

      2. Fork the distant dependency, and specify it as a direct dependency.

      3. Fork the distant dependency, and only update your lock file.

      Option 1 is probably the "correct" approach, and sometimes necessary anyway if you've made contract/API changes to a dependency of a dependency. It will also make it easier to contribute your changes upstream, which presumably you want to do to avoid the burden of maintaining forks.

      Option 2 is the "easiest", but probably least correct as you're adding a hard dependency where one shouldn't exist.

      Option 3 works fine until you try to regenerate your lock file.

      Quickfix doesn't seem to take itself too seriously, which is nice, but I'd happily argue that any one of the 3 approaches given above is better than the approach Quickfix is taking.

nautical 6 years ago

We had the same issue and we used diff and patch.
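
For anyone curious, the manual version looks roughly like this (the package name, paths, and -p level will vary):

    # keep a pristine copy and an edited copy of the dependency
    cp -r node_modules/some-dep /tmp/a
    cp -r node_modules/some-dep /tmp/b
    # ...edit files under /tmp/b, then copy them back over node_modules/some-dep...
    (cd /tmp && diff -ruN a b) > some-dep.patch

    # after any later fresh install, re-apply the fix
    patch -p1 -d node_modules/some-dep < some-dep.patch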

  • zbentley 6 years ago

    That's basically all this is, but with fewer commands to remember and modify. I think that's quite useful, especially when under lots of time pressure.

arthurcolle 6 years ago

This is a terrible idea and I encourage you to consider the slipperiness of this slope.

  • skrebbel 6 years ago

    Pragmatism is on a slippery slope almost by definition.

    • zbentley 6 years ago

      That is a very concise way of saying something I've struggled to articulate well in the past.

      Sometimes stuff like this is a horrible idea. Sometimes you need it. If you lock out the entirety of the "sometimes you need it" use cases as "too dangerous if applied en masse", you cripple yourself. Remember: not everyone has the time to do it right (the time they have might be measured in minutes or seconds before a push or bugfix), or the wherewithal or ability to even learn how. You can lament that reality if you like, but tools that accept it will be useful anyway.

rendall 6 years ago

Clever and funny. I might find this useful for broken TypeScript typings that somehow got included in the package.

limsup 6 years ago

Check in your node_modules. You are badly correcting the wrong problem: the npm registry should not be hit for every checkout, deploy, and CI test. Just commit your dependencies, and if you need to hack them, you can float a patch with git.
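
Floating the patch is then just ordinary git work; something like this, with a made-up package name:

    # hack the dependency in place and record it as its own commit
    $EDITOR node_modules/some-dep/lib/index.js
    git add node_modules/some-dep
    git commit -m "Patch some-dep to work around an upstream bug"

When you later upgrade some-dep, cherry-pick or rebase that commit onto the new version (or drop it once the fix lands upstream).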

  • dissent 6 years ago

    You're also correcting the wrong problem. You can keep your deps outside of your repo and still not go out to the network each time: cache or mirror the registry and build from that.

    At some point the convenience of public repos made people forget this.

  • denisw 6 years ago

    One important argument against this practice is that node_modules is usually not cross-platform. It's likely that somewhere within your dependency graph, there is a package that contains native code, which results in platform-specific binaries in your git repository.

  • shabbyrobe 6 years ago

    Watch out! The "pure and perfect git history is more important than any other consideration" crowd are going to take you apart for this heretical notion!

    • orf 6 years ago

      More like the "we don't want a 10gb git repo" crowd to be honest.

      Checking in not only your code but the entire history of every dependency, and every change within that dependency graph, is an amazingly bad idea when frontend apps often have 600 MB+ of dependencies that change fairly frequently, and git keeps every version of every file in its history.

      • mattwad 6 years ago

        But you're going to have to install them either way. We only update dependencies when they are needed, so in practice this is rarely an issue. We have also gotten into the habit of making separate branches for dependency updates to keep pull requests sane.

        • orf 6 years ago

          Part of my current project has a node_modules folder that is 400 MB and contains 37,853 files. Seeing as updating a dependency can have a cascading effect that touches a lot of different files, it's not a good idea to store them in git. It's akin to the reason you don't store binary files in git.

          If installs are too slow, switch to a caching yarn/npm mirror. Cloudflare has one. Or run a local caching proxy for your team.
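
          Running one locally is not much work either; for example, verdaccio is one such caching proxy, and 4873 is just its default port:

              npx verdaccio &                                  # start a local caching registry
              npm config set registry http://localhost:4873/   # point npm at it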

        • zbentley 6 years ago

          The repo doesn't get big because the dependencies are huge at a point in time (though they may be); it gets big because they change over time, and history gets huge and intensely messy in this situation. Shallow clones, branching, and fetch tuning are band-aids on this, which, over enough time (which in practice can be as short as months on a big project being iterated on by tons of people) will still cause clone/deployment problems, repo storage problems, and lost time. And don't start with the "well then just break it up into multiple git repos!" stuff. That's not only incredibly hard to do in practice/in the face of legacy, but it's fundamentally just another constant-factor mitigation on the problem. Sometimes, checking in all the deps makes sense. Sometimes it doesn't. Big node projects are a situation where it often doesn't.

jwilk 6 years ago

README would be 42% better without megabytes of pointless animated GIFs.

  • zbentley 6 years ago

    Chill out. It's a project that is at least partially (if not entirely) comedic in nature. It doesn't need a 100% professional rapidly-loading README.

    And if you don't like that, just view it via a "raw" link (there are userscripts that even do this for you whenever you're on github/gitlab), or a text browser, or "wget" for all I care.

  • jchw 6 years ago

    I'm not that amused either, but to each their own. It's only 2 megabytes, which is probably a lot less than your average `yarn`/`npm install` session.

    • jwilk 6 years ago

      2 megabytes is at least 30 seconds of hogging my Internet connection.

      I never use npm on the same machine that I browse web, so for me the two things aren't comparable.