mirror of https://github.com/denoland/deno.git synced 2025-01-21 04:52:26 -05:00

Compare commits


84 commits

Author SHA1 Message Date
Nayeem Rahman
971ec9b194 Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-20 16:57:12 +00:00
Nayeem Rahman
36fcff8670 coverage fixture 2025-01-20 16:56:57 +00:00
Luca Casonato
5e9b3712de
feat(unstable): add basic support for otel trace links (#27727)
Currently only supports links with no attributes.
2025-01-20 15:39:59 +01:00
Bartek Iwańczuk
395628026f
fix(ext/os): pass SignalState to web worker (#27741)
Closes https://github.com/denoland/deno/issues/27717

Made a mistake in https://github.com/denoland/deno/pull/27655 and
didn't add the `SignalState` for web workers.
2025-01-20 19:43:15 +05:30
Divy Srivastava
4f27d7cdc0
fix(ext/node): GCM auth tag check on DechiperIv#final (#27733) 2025-01-20 18:16:44 +05:30
Nayeem Rahman
798f733c09 workspace_files 2025-01-20 10:13:42 +00:00
Nayeem Rahman
1dd361492d remove CliFactoryWithWorkspaceFiles::initial_cwd() 2025-01-20 09:35:28 +00:00
Nayeem Rahman
1e073ee1d6 don't store specifier info 2025-01-20 09:26:08 +00:00
Nayeem Rahman
766452fca4 single factory 2025-01-20 08:15:23 +00:00
Nayeem Rahman
9d9a88f7c3 Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-20 05:13:32 +00:00
ryu
e4a16e91fa
docs(readme): update redirected links (#27726) 2025-01-20 03:01:25 +00:00
David Sherret
9aa02769c8
perf(compile): remove swc from denort (#27721)
This is achieved by storing CJS export analysis ahead of time in the
executable, which should also improve the performance of `denort` since
that analysis never needs to run anymore (I haven't benchmarked this
yet, but it will be significant for some programs).
2025-01-19 14:23:07 -05:00
David Sherret
b962b87cfe
chore: fix canary version (#27723)
Broken by
57dd66ec3d

Closes https://github.com/denoland/deno/issues/27719
2025-01-19 11:19:47 +01:00
Nayeem Rahman
0ae3090fa0 fix merge 2025-01-18 06:08:17 +00:00
Nayeem Rahman
8d5fb5afca Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-18 06:01:12 +00:00
David Sherret
57dd66ec3d
refactor: move denort to separate crate (#27688)
This slightly degrades the performance of CJS export analysis on
subsequent runs because I changed it to no longer cache in the DENO_DIR
with this PR (denort now properly has no idea about the DENO_DIR). We'll
have to change it to embed this data in the binary and that will also
allow us to get rid of swc in denort (will do that in a follow-up PR).
2025-01-17 20:39:29 +00:00
Leo Kettmeir
054075730c
refactor: update deno_core and use more concrete errors (#27620)
waiting for https://github.com/denoland/deno_core/pull/1043

Fixes #27672
2025-01-17 09:41:52 -08:00
Yoshiya Hinosawa
b55451b178
fix(ext/node): tls.connect regression (#27707)
The TLS start sequence has been broken since #26661 because of the way
we wrap the TCP handle to create the TLS handle.

#26661 introduced the happy-eyeballs algorithm, under which a connection
can be dropped because a happy-eyeballs attempt times out. The previous
implementation didn't account for that case and could start the TLS
handshake on a timed-out TCP connection. That caused #27652.

This PR fixes it by changing the initialization steps. Now `wrapHandle`
of TLSSocket sets up an `afterConnectTls` callback on the TCP handle, and
`afterConnect` of the TCP handle calls it at `connect` event time if it
exists. This avoids starting a TLS session on a timed-out connection.

closes #27652
2025-01-18 00:10:26 +09:00
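The deferral pattern described in the commit above can be sketched as follows. The names `Handle`, `wrapHandle`, and `afterConnect` mirror the description but are illustrative, not Deno's actual internals: the TLS side registers a callback on the TCP handle, and the handshake starts only when a winning connection fires its `connect` event.

```typescript
// Illustrative sketch of the initialization-order fix: defer the TLS
// handshake by storing a callback on the TCP handle, and invoke it only
// once a (non-timed-out) connection is actually established.
type Handle = { afterConnectTls?: () => void };

function wrapHandle(tcp: Handle, startTls: () => void): void {
  // Don't start the handshake here; a happy-eyeballs attempt may still lose.
  tcp.afterConnectTls = startTls;
}

function afterConnect(tcp: Handle): void {
  // Runs at the `connect` event of the winning TCP connection.
  tcp.afterConnectTls?.();
}
```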
Bartek Iwańczuk
342ccbb99d
ci: use self-hosted mac arm runner on tags (#27708) 2025-01-17 14:31:58 +01:00
Bartek Iwańczuk
0050857f51
refactor: add 'deno_process' crate (#27680)
Untangled the whole `runtime/ops/process.rs` from `ext/node/` and moved
it to a separate `ext/process` crate.
2025-01-17 13:30:14 +01:00
Yoshiya Hinosawa
339bc44c58
fix(ext/node): propagate socket error to client request object (#27678)
Co-authored-by: Satya Rohith <me@satyarohith.com>
2025-01-17 12:30:00 +09:00
denobot
94dc5b16f5
chore: forward v2.1.6 release commit to main (#27705)
This is the release commit being forwarded back to main for 2.1.6

Co-authored-by: bartlomieju <bartlomieju@users.noreply.github.com>
Co-authored-by: Bartek Iwańczuk <biwanczuk@gmail.com>
2025-01-17 02:09:13 +01:00
Nathan Whitaker
a5ba198b9a
fix(outdated): Use latest tag even when it's the same as the current version (#27699)
Fixes https://github.com/denoland/deno/issues/27696.

Just a `>` that should've been a `>=`. Also made sure to filter out
deprecated versions.
2025-01-16 20:03:25 +00:00
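The off-by-one described above (`>` where `>=` was needed) can be illustrated with a minimal version comparison. `cmpSemver` and `latestIsUsable` are made-up names for illustration, not the actual `deno outdated` code:

```typescript
// A three-part semver comparison: negative if a < b, zero if equal,
// positive if a > b. (Illustrative; ignores prerelease tags.)
function cmpSemver(a: string, b: string): number {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] - pb[i];
  }
  return 0;
}

// The buggy check was `latest > current`; the fix accepts the `latest`
// tag even when it equals the current version.
function latestIsUsable(latest: string, current: string): boolean {
  return cmpSemver(latest, current) >= 0;
}
```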
Nathan Whitaker
256950ddb6
fix(outdated): retain strict semver specifier when updating (#27701)
Fixes https://github.com/denoland/deno/issues/27697

If it's a strict bound (e.g. `1.0.0` as opposed to `^1.0.0` or other),
retain the strictness when we update
2025-01-16 19:33:38 +00:00
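The behavior described above can be sketched with a small helper that preserves the range operator when rewriting a version requirement; `updateSpecifier` is a hypothetical name, not the actual implementation:

```typescript
// Preserve the range prefix when updating: a strict pin ("1.0.0") stays
// a pin, "^1.0.0" keeps its caret, "~1.0.0" keeps its tilde.
function updateSpecifier(current: string, newVersion: string): string {
  const m = current.match(/^([\^~]?)/);
  const prefix = m ? m[1] : "";
  return prefix + newVersion;
}
```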
Nathan Whitaker
464ee9155e
fix(check/lsp): fix bugs with tsc type resolution, allow npm packages to augment ImportMeta (#27690)
Fixes #26224.
Fixes #27042.

There were three bugs here:
- we were only resolving `/// <reference types` directives starting with
`npm:`, which meant we failed to resolve bare specifiers (this broke the
`/// <reference types="vite/client">` directive in most of the vite
templates)
- the `$node_modules` workaround caused us to fail to read files for
tsc. For instance tsc would construct new paths based on specifiers
containing `$node_modules`, and since we hadn't created those we weren't
mapping them back to the original (this broke some type resolution
within `vite/client`)
- our separation of `ImportMeta` across node and deno globals in tsc
meant that npm packages couldn't augment `ImportMeta` (this broke
`vite/client`'s augmentation to add `import.meta.env` and others)


After this, the only remaining issue in the vanilla vite template is our
error on `/vite.svg` (which is an ambient module), and I'll look into
that next.
2025-01-16 19:20:04 +00:00
Bartek Iwańczuk
2debe9c8dd
fix(ext/console): change Temporal color (#27684)
This commit changes output color of `Temporal` instances from
"magenta" to "cyan" to discriminate them from `Date` instances.

Closes https://github.com/denoland/deno/issues/27585
2025-01-16 18:27:54 +00:00
Jo Franchetti
17d6e66ee3
docs: adding jsdocs info for console interface (#27666)
Signed-off-by: Jo Franchetti <jofranchetti@gmail.com>
Co-authored-by: Marvin Hagemeister <marvin@deno.com>
2025-01-16 14:48:13 +00:00
Phil Hawksworth
8d2f76ae36
docs: JSDocs examples for prompt, confirm, and alert (#27695)
Adds examples
2025-01-16 14:20:45 +00:00
Phil Hawksworth
e54d467812
docs: Adds examples in JSDocs for localStorage and sessionStorage (#27668)
Improves docs for:

- http://docs.deno.com/api/web/~/localStorage
- http://docs.deno.com/api/web/~/sessionStorage
2025-01-16 12:33:08 +00:00
Muthuraj Ramalingakumar
e49d6f2d45
chore: add missing internal core_import_map file paths (#27691)
Noted this when working locally, will help with vscode intellisense.

fixes: https://github.com/denoland/deno/issues/27689
2025-01-16 04:38:43 +00:00
Nayeem Rahman
c8a0404848 Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-16 03:08:23 +00:00
Nathan Whitaker
32708213d5
fix(check/lsp): correctly resolve compilerOptions.types (#27686)
Fixes https://github.com/denoland/deno/issues/27062

In the LSP we were passing `npm` specifiers to TSC as roots, but TSC
needs fully resolved specifiers (like the actual file path).

In `deno check` we were often excluding the specifiers entirely from the
roots.

In both cases, we need to resolve the specifiers fully and then pass
them to tsc.
2025-01-15 18:48:10 -08:00
Nayeem Rahman
8719cb7a57 Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-15 22:50:05 +00:00
Bartek Iwańczuk
a02ee7adf9
ci: try to fix caching on Mac ARM (#27685) 2025-01-15 10:09:08 -08:00
David Sherret
05dc69932d
refactor: create deno_lib crate (#27673)
Shifts just some code down for now. I'll do the rest of the refactor in
the next pr, but didn't want to drop a huge refactor.
2025-01-15 09:35:46 -05:00
Bartek Iwańczuk
836a623d99
ci: use self-hosted mac arm runner (#27568) 2025-01-15 11:03:05 +00:00
Masato Yoshioka
b22a50cb0c
fix(ext/node): add chown method to FileHandle class (#27638) 2025-01-15 17:15:07 +09:00
David Sherret
afc23fb2e0
chore: fix ci by removing remote server dependent test (#27674)
This was using the lockfile and esm.sh changed breaking the lockfile. We
could pin to a specific esm.sh version, but ideally we shouldn't have
the test suite dependent on remote servers.
2025-01-15 04:06:57 +00:00
Nayeem Rahman
6128282d6e Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-15 02:53:07 +00:00
Bartek Iwańczuk
974e2f44b2
refactor: add 'deno_os' crate (#27655)
This commit creates "deno_os" extension crate and moves
numerous ops from "runtime/" crate to the new crate.
2025-01-14 17:29:36 +01:00
Yoshiya Hinosawa
c943f56949
fix(ext/node): fix playwright http client (#27662) 2025-01-15 01:00:55 +09:00
David Sherret
0b033140c0
refactor: move CliNpmResolver to deno_resolver::npm::NpmResolver (#27659)
As title. After this PR all npm resolution will be out of the CLI crate.
2025-01-14 10:01:05 -05:00
Marvin Hagemeister
3fb8fc1ba7
feat(unstable): refactor js lint plugin AST (#27615)
This PR changes the underlying buffer backed AST format we use for
JavaScript-based linting plugins. It adds support for various new types,
makes traversal code a lot easier and is more polished compared to
previous iterations.

Here is a quick summary (in no particular order):

- Node prop data is separate from traversal, which makes traversal code
so much easier to reason about. Previously, it was interleaved with node
prop data.
- Spans are in a separate table as well, as they are rarely needed.
- Schema is separate from SWC conversion logic.
- Supports recursive plain objects.
- Supports numbers.
- Supports bigint.
- Supports regex.
- Adds all SWC nodes.

Apologies, this is kinda a big PR, but it's worth it imo.

_Marking as draft because I need to update some tests tomorrow._
2025-01-14 13:31:02 +01:00
David Sherret
1e95c20561
refactor: deno_config 0.45 (#27660) 2025-01-14 13:00:31 +01:00
siaeyy
a1f50a7422
fix(node/fs): add utimes method to the FileHandle class (#27582) 2025-01-14 18:08:22 +09:00
Aaron Ang
9cb089f6db
fix(ext/node): add writev method to FileHandle (#27563)
Part of #25554
2025-01-14 18:01:14 +09:00
David Sherret
7616429436
fix(compile/windows): better handling of deno_dir on different drive letter than code (#27654)
Closes https://github.com/denoland/deno/issues/27651
2025-01-13 22:29:21 -05:00
Nayeem Rahman
5a1bb6b854 Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-14 03:16:38 +00:00
David Sherret
9dbb99a83c
refactor: create NpmInstaller (#27626)
This separates npm resolution code from npm installation (more work
towards moving resolution code out of the CLI and cleaning up this
code).
2025-01-13 17:35:18 -05:00
TateKennington
5a39f2f096
fix(node): Prevent node:child_process from always inheriting the parent environment (#27343) (#27340)
Fixes #27343

Currently the node:child_process polyfill is always passing the full
parent environment to all spawned subprocesses. In the case where
`options.env` is provided those keys are overridden but the rest of the
parent environment is still passed through.

On Node the behaviour is for child processes to only inherit the parent
environment when `options.env` isn't specified. When `options.env` is
specified the child process inherits only those keys.

This PR updates the internal node child_process polyfill so that the
`clearEnv` argument is set to true when spawning the subprocess,
preventing the parent environment from always being inherited by
default. It also fixes an issue where `normalizeSpawnArguments` wasn't
returning the `env` option if `options.env` was unset.
2025-01-13 13:46:56 -08:00
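The Node semantics described above can be summarized in a small sketch (illustrative names, not the actual polyfill): with `clearEnv` set, the child sees exactly the computed environment, which is `options.env` when given and the parent environment otherwise.

```typescript
// Compute the environment a child process should see, per Node semantics:
// inherit the parent environment only when `options.env` is unset; when it
// is set, the child gets exactly those keys and nothing from the parent.
function effectiveEnv(
  parentEnv: Record<string, string>,
  optionsEnv?: Record<string, string>,
): Record<string, string> {
  // With `clearEnv: true` on the underlying spawn, nothing leaks from the
  // parent unless we copy it in explicitly here.
  return optionsEnv !== undefined ? { ...optionsEnv } : { ...parentEnv };
}
```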
David Sherret
2a2b39eb2e
fix(compile): store embedded fs case sensitivity (#27653) 2025-01-13 12:02:37 -05:00
Benjamin Swerdlow
714b40262e
refactor(node_resolver): make conditions_from_resolution_mode configurable (#27596) 2025-01-13 11:34:37 -05:00
Nayeem Rahman
f912aac2cb
fix(lsp): handle pathless untitled URIs (#27637) 2025-01-13 15:31:08 +00:00
Yoshiya Hinosawa
2091691164
fix(ext/node): apply @npmcli/agent workaround to npm-check-updates (#27639)
See the comment
https://github.com/denoland/deno/pull/25470#issuecomment-2435077722 for
the reason why we do this workaround to make `make-fetch-happen` work in
Deno

This PR applies the same workaround to `npm-check-updates` package.
`npm-check-updates` internally uses
[`npm-registry-fetch`](https://www.npmjs.com/package/npm-registry-fetch)
which uses
[`make-fetch-happen`](https://www.npmjs.com/package/make-fetch-happen)
(the problematic package) for making http request to npm registry.

The detection of `make-fetch-happen` doesn't work for
`npm-check-updates` because we use the call stack in the `net.Socket`
constructor to check whether it's called from `make-fetch-happen`, but
`npm-check-updates` bundles its dependencies, so the check doesn't work.

This PR adds a check for the `npm-check-updates` string in the call
stack in the `net.Socket` constructor to trigger the workaround.

closes #27629
2025-01-13 18:11:26 +09:00
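The call-stack detection technique the workaround relies on can be sketched like this. Function names are made up; the real check lives in Deno's `net.Socket` polyfill:

```typescript
// Capture a stack trace at a constructor and look for a marker string
// identifying the caller. Bundlers inline dependencies, which is why the
// original check failed: `make-fetch-happen` no longer appears in the
// stack once npm-check-updates bundles it, so the bundle's own name must
// be matched too.
function calledFrom(marker: string): boolean {
  const stack = new Error().stack ?? "";
  return stack.includes(marker);
}

function looksLikeBundledCaller(stack: string): boolean {
  return stack.includes("make-fetch-happen") ||
    stack.includes("npm-check-updates");
}
```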
Nathan Whitaker
70c822bfe2
fix(lsp/check): don't resolve unknown media types to a .js extension (#27631)
Fixes https://github.com/denoland/deno/issues/25762. Note that some of
the things in that issue are not resolved (vite/client types not working
properly which has other root causes), but the wildcard module
augmentation specifically is fixed by this.

We were telling TSC that files with unknown media types had an extension
of `.js`, so the ambient module declarations weren't applying. Instead,
just don't resolve them, so the ambient declaration applies.
2025-01-11 03:26:01 +00:00
David Sherret
f6dcc13537
fix(regression): show bare-node-builtin hint when using an import map (#27632) 2025-01-11 01:39:43 +00:00
David Sherret
c27248a8f3
refactor: remove CliNpmReqResolver trait in deno_resolver (#27616) 2025-01-10 14:48:43 -05:00
Rajhans Jadhao
1dd5bd667c
fix(ext/node): use primordials in ext/node/polyfills/_fs_common.ts (#27589)
Related to #24236
2025-01-10 13:51:50 +01:00
David Sherret
475793f94d
refactor: implement NpmPackageFolderResolver in deno_resolver (#27614) 2025-01-10 00:01:47 +00:00
David Sherret
34beeb7703
refactor(npm): move SloppyImportsCachedFs to deno_resolver (#27610) 2025-01-09 18:30:48 -05:00
denobot
8bafb182ef
chore: forward v2.1.5 release commit to main (#27613)
Co-authored-by: dsherret <dsherret@users.noreply.github.com>
2025-01-09 17:38:18 -05:00
Ryan Dahl
1d64670f9c
docs: added jsdoc for window.close() (#27608) 2025-01-09 15:05:39 -05:00
David Sherret
966370c908
refactor(npm): move InNpmPackageChecker code to deno_resolver (#27609)
As title. Will allow consumers to create this struct and use our
behaviour.

Closes #27409
2025-01-09 14:04:52 -05:00
Nayeem Rahman
318f524c5c
fix(lsp): use verbatim specifier for URL auto-imports (#27605) 2025-01-09 17:54:14 +00:00
David Sherret
093f3ba565
refactor(npm): extract out some npm fs resolution code from the cli (#27607)
Moves the npm fs resolvers into the deno_resolver crate.

This does not entirely move things out, but is a step in that direction.
2025-01-09 12:10:07 -05:00
David Sherret
ce0968ef3a
refactor(npm): split some resolution from installation (#27595)
This splits away some npm resolution code from installation. It will
allow for more easily extracting out resolution code in the future.
2025-01-08 23:46:37 +00:00
Leo Kettmeir
ea30e188a8
refactor: update deno_core for error refactor (#26867)
Closes #26171

---------

Co-authored-by: David Sherret <dsherret@gmail.com>
2025-01-08 14:52:32 -08:00
Tatsuya Kawano
814da49dff
fix(ext/net): update moka cache to avoid potential panic in Deno.resolveDns on some laptops with Ryzen CPU (#27572) 2025-01-08 16:48:23 -05:00
David Sherret
fc2788bfd7
fix(jsr): Wasm imports fail to load (#27594)
* https://github.com/denoland/deno_graph/pull/562

Closes https://github.com/denoland/deno/issues/27593
2025-01-08 19:46:15 +00:00
Divy Srivastava
fffa3804aa
fix(ext/node): Fix os.cpus() on Linux (#27592)
Populate `speed` using the current scaling frequency and fix the times
multiplier.

Fixes https://github.com/denoland/deno/issues/27555

Node.js:

```
> os.cpus()
[
  {
    model: 'AMD Ryzen 5 7530U with Radeon Graphics',
    speed: 1396,
    times: {
      user: 1769930,
      nice: 20,
      sys: 525630,
      idle: 41325700,
      irq: 110060
    }
  },
```

Deno:

```
> os.cpus()
[
  {
    model: "AMD Ryzen 5 7530U with Radeon Graphics",
    speed: 1630,
    times: [Object: null prototype] {
      user: 1795620,
      nice: 20,
      sys: 537840,
      idle: 41589390,
      irq: 111230
    }
  },
```
2025-01-08 22:09:55 +05:30
Divy Srivastava
e233173653
fix(ext/websocket): Fix close code without reason (#27578)
Fixes https://github.com/denoland/deno/issues/27566

The close code wasn't sent if reason was None, defaulting to 1005. This
patch allows sending close code without reason.
2025-01-08 20:07:47 +05:30
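Per the WebSocket protocol, a close frame's payload starts with a 2-byte big-endian status code, and an empty payload is reported to the peer as 1005 ("no status received"). A minimal sketch of the fixed encoding follows; `closePayload` is an illustrative helper, not Deno's actual code, and the 1000 fallback is an assumption:

```typescript
// Build a close-frame payload: 2-byte big-endian status code followed by
// the UTF-8 reason. The fix is to emit the 2-byte code even when there is
// no reason text, so the peer doesn't see 1005.
function closePayload(code?: number, reason = ""): Uint8Array {
  if (code === undefined && reason === "") return new Uint8Array(0);
  const reasonBytes = new TextEncoder().encode(reason);
  const buf = new Uint8Array(2 + reasonBytes.length);
  // Assumed fallback: 1000 (normal closure) when a reason is given without a code.
  new DataView(buf.buffer).setUint16(0, code ?? 1000, false); // big-endian
  buf.set(reasonBytes, 2);
  return buf;
}
```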
Yusuke Tanaka
1661ddd9ca
fix(ext/node): have process global available in Node context (#27562)
This commit makes the `process` global always available in a Node context.

The `process` global was previously provided explicitly by `deno_node`, but
#25291 removed that and made it globally available regardless of whether
code runs in a Deno or Node context, so this commit has no effect on the
Deno CLI. However, for users who want to use the `deno_node` ext on its own,
it makes sense to have `process` available to simulate the Node environment.

This change may bring some negative performance impact. To measure how large the
impact would be, a very simple benchmark was performed whose results can be
found at https://github.com/magurotuna/process_global_bench.
2025-01-08 13:14:57 +09:00
Marvin Hagemeister
cabdfa8c2d
fix(lint): fix single char selectors being ignored (#27576)
The selector splitting code that's used for JS linting plugins didn't
properly account for selectors being a single character. This can happen
in the case of `*`.

Instead of comparing against the length, we'll now check if the
remaining string portion is not empty, which is more robust. It also
allows us to detect trailing whitespace, which we didn't before.
2025-01-08 00:21:50 +01:00
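The robustness point above (check that the remaining string portion is non-empty rather than comparing an index against the length) can be illustrated with a simplified splitter; this is not the actual lint-plugin code:

```typescript
// Split a comma-separated selector list, keeping only non-empty parts.
// The non-empty check (rather than a length comparison) means a selector
// that is a single character, like "*", survives, and stray empty parts
// from trailing commas or whitespace are dropped.
function splitSelectors(input: string): string[] {
  return input
    .split(",")
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}
```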
David Sherret
3f5cad38aa
fix(no-slow-types): handle rest param with internal assignments (#27581)
Closes #27575
2025-01-07 12:34:34 -08:00
Nayeem Rahman
b5e4a303d5
fix(lsp): don't skip dirs with enabled subdirs (#27580) 2025-01-07 19:04:06 +00:00
Nayeem Rahman
b0cbae7486 Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-07 17:35:47 +00:00
Nikolay Karadzhov
8cda4cf53d
feat(node/fs): Add a chmod method to the FileHandle class (#27522)
Add the chmod method to the FileHandle class in node compat as part of
#25554
2025-01-07 14:58:14 +01:00
Bartek Iwańczuk
b7fb5a5547
Revert "perf: build denort with panic = "abort" for releases (#27507)" (#27573)
Also reverts #27518

The reason is that it takes too long to build these two
binaries on Mac ARM runners as it stands.

We're gonna try to reland this next week, after sorting out the
situation with these runners.
2025-01-07 02:32:51 +00:00
Bartek Iwańczuk
b6f2646c1c
refactor: make IncrementalCache accept a CacheDBHash (#27570) 2025-01-06 23:56:36 +00:00
Nayeem Rahman
2eae9a99ed Merge remote-tracking branch 'upstream/main' into check-workspace-member-compiler-options 2025-01-06 23:07:10 +00:00
Bartek Iwańczuk
6750aa61eb
ci: increase timeout to 240 minutes (#27571)
With https://github.com/denoland/deno/pull/27507 landed we need to do
two builds that don't really share many codegen units.

This should be reverted once we move to self-hosted runner on `main`
(https://github.com/denoland/deno/pull/27568)
2025-01-06 22:33:52 +00:00
Yoshiya Hinosawa
888ab9f4f7
test(ext/node): disable flaky dgram tests (#27549)
Closes #27316
2025-01-06 23:18:45 +01:00
Luca Casonato
f483996658
feat(unstable): no config npm:@opentelemetry/api integration (#27541)
After this PR, one does not need to import `jsr:@deno/otel` anymore.
2025-01-06 17:00:32 +01:00
snek
ccd375802a
refactor(quic): introduce endpoint, 0rtt, cleanup (#27444)
A QUIC endpoint is a UDP socket which multiplexes QUIC sessions, which
may be initiated in either direction. This PR exposes endpoints and
moves things around as needed.

Now that endpoints can be reused between client connections, we have a
way to share tls tickets between them and allow 0rtt. This interface
currently works by conditionally returning a promise.

Also cleaned up the rust op names, fixed some lingering problems in the
data transmission, and switched to explicit error types.
2025-01-06 15:24:59 +01:00
474 changed files with 33321 additions and 18750 deletions

.github/workflows/ci.generate.ts

@@ -5,7 +5,7 @@ import { stringify } from "jsr:@std/yaml@^0.221/stringify";
 // Bump this number when you want to purge the cache.
 // Note: the tools/release/01_bump_crate_versions.ts script will update this version
 // automatically via regex, so ensure that this line maintains this format.
-const cacheVersion = 32;
+const cacheVersion = 36;
 const ubuntuX86Runner = "ubuntu-24.04";
 const ubuntuX86XlRunner = "ubuntu-24.04-xl";
@@ -14,7 +14,7 @@ const windowsX86Runner = "windows-2022";
 const windowsX86XlRunner = "windows-2022-xl";
 const macosX86Runner = "macos-13";
 const macosArmRunner = "macos-14";
-const selfHostedMacosArmRunner = "self-hosted";
+const selfHostedMacosArmRunner = "ghcr.io/cirruslabs/macos-runner:sonoma";
 const Runners = {
 linuxX86: {
@@ -41,8 +41,14 @@ const Runners = {
 macosArm: {
 os: "macos",
 arch: "aarch64",
+runner: macosArmRunner,
+},
+macosArmSelfHosted: {
+os: "macos",
+arch: "aarch64",
 // Actually use self-hosted runner only in denoland/deno on `main` branch and for tags (release) builds.
 runner:
-`\${{ github.repository == 'denoland/deno' && startsWith(github.ref, 'refs/tags/') && '${selfHostedMacosArmRunner}' || '${macosArmRunner}' }}`,
+`\${{ github.repository == 'denoland/deno' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/')) && '${selfHostedMacosArmRunner}' || '${macosArmRunner}' }}`,
 },
 windowsX86: {
 os: "windows",
@@ -360,7 +366,7 @@ const ci = {
 needs: ["pre_build"],
 if: "${{ needs.pre_build.outputs.skip_build != 'true' }}",
 "runs-on": "${{ matrix.runner }}",
-"timeout-minutes": 180,
+"timeout-minutes": 240,
 defaults: {
 run: {
 // GH actions does not fail fast by default on
@@ -384,7 +390,7 @@ const ci = {
 job: "test",
 profile: "debug",
 }, {
-...Runners.macosArm,
+...Runners.macosArmSelfHosted,
 job: "test",
 profile: "release",
 skip_pr: true,
@@ -486,7 +492,7 @@ const ci = {
 },
 {
 name: "Cache Cargo home",
-uses: "actions/cache@v4",
+uses: "cirruslabs/cache@v4",
 with: {
 // See https://doc.rust-lang.org/cargo/guide/cargo-home.html#caching-the-cargo-home-in-ci
 // Note that with the new sparse registry format, we no longer have to cache a `.git` dir
@@ -716,19 +722,6 @@ const ci = {
 "df -h",
 ].join("\n"),
 },
-{
-name: "Build denort release",
-if: [
-"matrix.job == 'test' &&",
-"matrix.profile == 'release' &&",
-"github.repository == 'denoland/deno'",
-].join("\n"),
-run: [
-"df -h",
-"cargo build --profile=release-slim --locked --bin denort",
-"df -h",
-].join("\n"),
-},
 {
 // Run a minimal check to ensure that binary is not corrupted, regardless
 // of our build mode
@@ -775,11 +768,10 @@ const ci = {
 "cd target/release",
 "zip -r deno-${{ matrix.arch }}-unknown-linux-gnu.zip deno",
 "shasum -a 256 deno-${{ matrix.arch }}-unknown-linux-gnu.zip > deno-${{ matrix.arch }}-unknown-linux-gnu.zip.sha256sum",
-"./deno types > lib.deno.d.ts",
-"cd ../release-slim",
-"zip -r ../release/denort-${{ matrix.arch }}-unknown-linux-gnu.zip denort",
-"cd ../release",
+"strip denort",
+"zip -r denort-${{ matrix.arch }}-unknown-linux-gnu.zip denort",
 "shasum -a 256 denort-${{ matrix.arch }}-unknown-linux-gnu.zip > denort-${{ matrix.arch }}-unknown-linux-gnu.zip.sha256sum",
+"./deno types > lib.deno.d.ts",
 ].join("\n"),
 },
 {
@@ -804,9 +796,8 @@ const ci = {
 "cd target/release",
 "zip -r deno-${{ matrix.arch }}-apple-darwin.zip deno",
 "shasum -a 256 deno-${{ matrix.arch }}-apple-darwin.zip > deno-${{ matrix.arch }}-apple-darwin.zip.sha256sum",
-"cd ../release-slim",
-"zip -r ../release/denort-${{ matrix.arch }}-apple-darwin.zip denort",
-"cd ../release",
+"strip denort",
+"zip -r denort-${{ matrix.arch }}-apple-darwin.zip denort",
 "shasum -a 256 denort-${{ matrix.arch }}-apple-darwin.zip > denort-${{ matrix.arch }}-apple-darwin.zip.sha256sum",
 ]
 .join("\n"),
@@ -823,8 +814,7 @@ const ci = {
 run: [
 "Compress-Archive -CompressionLevel Optimal -Force -Path target/release/deno.exe -DestinationPath target/release/deno-${{ matrix.arch }}-pc-windows-msvc.zip",
 "Get-FileHash target/release/deno-${{ matrix.arch }}-pc-windows-msvc.zip -Algorithm SHA256 | Format-List > target/release/deno-${{ matrix.arch }}-pc-windows-msvc.zip.sha256sum",
-"Compress-Archive -CompressionLevel Optimal -Force -Path target/release-slim/denort.exe -DestinationPath target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip",
+"Compress-Archive -CompressionLevel Optimal -Force -Path target/release/denort.exe -DestinationPath target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip",
 "Get-FileHash target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip -Algorithm SHA256 | Format-List > target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip.sha256sum",
 ].join("\n"),
 },

.github/workflows/ci.yml

@@ -48,7 +48,7 @@ jobs:
 - pre_build
 if: '${{ needs.pre_build.outputs.skip_build != ''true'' }}'
 runs-on: '${{ matrix.runner }}'
-timeout-minutes: 180
+timeout-minutes: 240
 defaults:
 run:
 shell: bash
@@ -68,12 +68,12 @@ jobs:
 skip: '${{ !contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'') }}'
 - os: macos
 arch: aarch64
-runner: '${{ github.repository == ''denoland/deno'' && startsWith(github.ref, ''refs/tags/'') && ''self-hosted'' || ''macos-14'' }}'
+runner: macos-14
 job: test
 profile: debug
 - os: macos
 arch: aarch64
-runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-24.04'' || github.repository == ''denoland/deno'' && startsWith(github.ref, ''refs/tags/'') && ''self-hosted'' || ''macos-14'' }}'
+runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-24.04'' || github.repository == ''denoland/deno'' && (github.ref == ''refs/heads/main'' || startsWith(github.ref, ''refs/tags/'')) && ''ghcr.io/cirruslabs/macos-runner:sonoma'' || ''macos-14'' }}'
 job: test
 profile: release
 skip: '${{ !contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'') }}'
@@ -175,7 +175,7 @@ jobs:
 tar --exclude=".git*" --exclude=target --exclude=third_party/prebuilt \
 -czvf target/release/deno_src.tar.gz -C .. deno
 - name: Cache Cargo home
-uses: actions/cache@v4
+uses: cirruslabs/cache@v4
 with:
 path: |-
 ~/.cargo/.crates.toml
@@ -184,8 +184,8 @@ jobs:
 ~/.cargo/registry/index
 ~/.cargo/registry/cache
 ~/.cargo/git/db
-key: '32-cargo-home-${{ matrix.os }}-${{ matrix.arch }}-${{ hashFiles(''Cargo.lock'') }}'
-restore-keys: '32-cargo-home-${{ matrix.os }}-${{ matrix.arch }}-'
+key: '36-cargo-home-${{ matrix.os }}-${{ matrix.arch }}-${{ hashFiles(''Cargo.lock'') }}'
+restore-keys: '36-cargo-home-${{ matrix.os }}-${{ matrix.arch }}-'
 if: '!(matrix.skip)'
 - uses: dsherret/rust-toolchain-file@v1
 if: '!(matrix.skip)'
@@ -379,7 +379,7 @@ jobs:
 !./target/*/*.zip
 !./target/*/*.tar.gz
 key: never_saved
-restore-keys: '32-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-'
+restore-keys: '36-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-'
 - name: Apply and update mtime cache
 if: '!(matrix.skip) && (!startsWith(github.ref, ''refs/tags/''))'
 uses: ./.github/mtime_cache
@@ -419,15 +419,6 @@ jobs:
 df -h
 cargo build --release --locked --all-targets
 df -h
-- name: Build denort release
-if: |-
-!(matrix.skip) && (matrix.job == 'test' &&
-matrix.profile == 'release' &&
-github.repository == 'denoland/deno')
-run: |-
-df -h
-cargo build --profile=release-slim --locked --bin denort
-df -h
 - name: Check deno binary
 if: '!(matrix.skip) && (matrix.job == ''test'')'
 run: 'target/${{ matrix.profile }}/deno eval "console.log(1+2)" | grep 3'
@@ -457,11 +448,10 @@ jobs:
 cd target/release
 zip -r deno-${{ matrix.arch }}-unknown-linux-gnu.zip deno
 shasum -a 256 deno-${{ matrix.arch }}-unknown-linux-gnu.zip > deno-${{ matrix.arch }}-unknown-linux-gnu.zip.sha256sum
-./deno types > lib.deno.d.ts
-cd ../release-slim
-zip -r ../release/denort-${{ matrix.arch }}-unknown-linux-gnu.zip denort
-cd ../release
+strip denort
+zip -r denort-${{ matrix.arch }}-unknown-linux-gnu.zip denort
 shasum -a 256 denort-${{ matrix.arch }}-unknown-linux-gnu.zip > denort-${{ matrix.arch }}-unknown-linux-gnu.zip.sha256sum
+./deno types > lib.deno.d.ts
 - name: Pre-release (mac)
 if: |-
 !(matrix.skip) && (matrix.os == 'macos' &&
@@ -477,9 +467,8 @@ jobs:
 cd target/release
 zip -r deno-${{ matrix.arch }}-apple-darwin.zip deno
 shasum -a 256 deno-${{ matrix.arch }}-apple-darwin.zip > deno-${{ matrix.arch }}-apple-darwin.zip.sha256sum
-cd ../release-slim
-zip -r ../release/denort-${{ matrix.arch }}-apple-darwin.zip denort
-cd ../release
+strip denort
+zip -r denort-${{ matrix.arch }}-apple-darwin.zip denort
 shasum -a 256 denort-${{ matrix.arch }}-apple-darwin.zip > denort-${{ matrix.arch }}-apple-darwin.zip.sha256sum
 - name: Pre-release (windows)
 if: |-
@@ -491,7 +480,7 @@ jobs:
 run: |-
 Compress-Archive -CompressionLevel Optimal -Force -Path target/release/deno.exe -DestinationPath target/release/deno-${{ matrix.arch }}-pc-windows-msvc.zip
 Get-FileHash target/release/deno-${{ matrix.arch }}-pc-windows-msvc.zip -Algorithm SHA256 | Format-List > target/release/deno-${{ matrix.arch }}-pc-windows-msvc.zip.sha256sum
-Compress-Archive -CompressionLevel Optimal -Force -Path target/release-slim/denort.exe -DestinationPath target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip
+Compress-Archive -CompressionLevel Optimal -Force -Path target/release/denort.exe -DestinationPath target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip
 Get-FileHash target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip -Algorithm SHA256 | Format-List > target/release/denort-${{ matrix.arch }}-pc-windows-msvc.zip.sha256sum
 - name: Upload canary to dl.deno.land
 if: |-
@@ -700,7 +689,7 @@ jobs:
 !./target/*/gn_root
 !./target/*/*.zip
 !./target/*/*.tar.gz
-key: '32-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-${{ github.sha }}'
+key: '36-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-${{ github.sha }}'
 publish-canary:
 name: publish canary
 runs-on: ubuntu-24.04

Cargo.lock (generated)

File diff suppressed because it is too large.

Cargo.toml

@@ -5,6 +5,9 @@ resolver = "2"
 members = [
 "bench_util",
 "cli",
+"cli/lib",
+"cli/rt",
+"cli/snapshot",
 "ext/broadcast_channel",
 "ext/cache",
 "ext/canvas",
@@ -48,56 +51,60 @@ repository = "https://github.com/denoland/deno"
 [workspace.dependencies]
 deno_ast = { version = "=0.44.0", features = ["transpiling"] }
-deno_core = { version = "0.327.0" }
+deno_core = { version = "0.331.0" }
-deno_bench_util = { version = "0.178.0", path = "./bench_util" }
+deno_bench_util = { version = "0.180.0", path = "./bench_util" }
 # TODO(nayeemrmn): Use proper version when https://github.com/denoland/deno_config/pull/143 lands!
-deno_config = { git = "https://github.com/denoland/deno_config.git", rev = "4cbb63704442a7834dc6bed2e7e310a0d46ade09", features = ["workspace", "sync"] }
+deno_config = { git = "https://github.com/denoland/deno_config.git", rev = "39be71a5936221bf23e438c11cfcffe56ce54690", features = ["workspace", "sync"] }
 deno_lockfile = "=0.24.0"
-deno_media_type = { version = "0.2.3", features = ["module_specifier"] }
-deno_npm = "=0.27.0"
+deno_media_type = { version = "0.2.4", features = ["module_specifier"] }
+deno_npm = "=0.27.2"
 deno_path_util = "=0.3.0"
-deno_permissions = { version = "0.43.0", path = "./runtime/permissions" }
-deno_runtime = { version = "0.192.0", path = "./runtime" }
+deno_permissions = { version = "0.45.0", path = "./runtime/permissions" }
+deno_runtime = { version = "0.194.0", path = "./runtime" }
 deno_semver = "=0.7.1"
 deno_terminal = "0.2.0"
-napi_sym = { version = "0.114.0", path = "./ext/napi/sym" }
+napi_sym = { version = "0.116.0", path = "./ext/napi/sym" }
 test_util = { package = "test_server", path = "./tests/util/server" }
-denokv_proto = "0.8.4"
-denokv_remote = "0.8.4"
+denokv_proto = "0.9.0"
+denokv_remote = "0.9.0"
 # denokv_sqlite brings in bundled sqlite if we don't disable the default features
-denokv_sqlite = { default-features = false, version = "0.8.4" }
+denokv_sqlite = { default-features = false, version = "0.9.0" }
 # exts
-deno_broadcast_channel = { version = "0.178.0", path = "./ext/broadcast_channel" }
-deno_cache = { version = "0.116.0", path = "./ext/cache" }
-deno_canvas = { version = "0.53.0", path = "./ext/canvas" }
-deno_console = { version = "0.184.0", path = "./ext/console" }
-deno_cron = { version = "0.64.0", path = "./ext/cron" }
-deno_crypto = { version = "0.198.0", path = "./ext/crypto" }
-deno_fetch = { version = "0.208.0", path = "./ext/fetch" }
-deno_ffi = { version = "0.171.0", path = "./ext/ffi" }
-deno_fs = { version = "0.94.0", path = "./ext/fs" }
-deno_http = { version = "0.182.0", path = "./ext/http" }
-deno_io = { version = "0.94.0", path = "./ext/io" }
-deno_kv = { version = "0.92.0", path = "./ext/kv" }
-deno_napi = { version = "0.115.0", path = "./ext/napi" }
-deno_net = { version = "0.176.0", path = "./ext/net" }
deno_node = { version = "0.122.0", path = "./ext/node" }
deno_telemetry = { version = "0.6.0", path = "./ext/telemetry" }
deno_tls = { version = "0.171.0", path = "./ext/tls" }
deno_url = { version = "0.184.0", path = "./ext/url" }
deno_web = { version = "0.215.0", path = "./ext/web" }
deno_webgpu = { version = "0.151.0", path = "./ext/webgpu" }
deno_webidl = { version = "0.184.0", path = "./ext/webidl" }
deno_websocket = { version = "0.189.0", path = "./ext/websocket" }
deno_webstorage = { version = "0.179.0", path = "./ext/webstorage" }
deno_broadcast_channel = { version = "0.180.0", path = "./ext/broadcast_channel" }
deno_cache = { version = "0.118.0", path = "./ext/cache" }
deno_canvas = { version = "0.55.0", path = "./ext/canvas" }
deno_console = { version = "0.186.0", path = "./ext/console" }
deno_cron = { version = "0.66.0", path = "./ext/cron" }
deno_crypto = { version = "0.200.0", path = "./ext/crypto" }
deno_fetch = { version = "0.210.0", path = "./ext/fetch" }
deno_ffi = { version = "0.173.0", path = "./ext/ffi" }
deno_fs = { version = "0.96.0", path = "./ext/fs" }
deno_http = { version = "0.184.0", path = "./ext/http" }
deno_io = { version = "0.96.0", path = "./ext/io" }
deno_kv = { version = "0.94.0", path = "./ext/kv" }
deno_napi = { version = "0.117.0", path = "./ext/napi" }
deno_net = { version = "0.178.0", path = "./ext/net" }
deno_node = { version = "0.124.0", path = "./ext/node" }
deno_os = { version = "0.3.0", path = "./ext/os" }
deno_process = { version = "0.1.0", path = "./ext/process" }
deno_telemetry = { version = "0.8.0", path = "./ext/telemetry" }
deno_tls = { version = "0.173.0", path = "./ext/tls" }
deno_url = { version = "0.186.0", path = "./ext/url" }
deno_web = { version = "0.217.0", path = "./ext/web" }
deno_webgpu = { version = "0.153.0", path = "./ext/webgpu" }
deno_webidl = { version = "0.186.0", path = "./ext/webidl" }
deno_websocket = { version = "0.191.0", path = "./ext/websocket" }
deno_webstorage = { version = "0.181.0", path = "./ext/webstorage" }
# resolvers
deno_npm_cache = { version = "0.3.0", path = "./resolvers/npm_cache" }
deno_resolver = { version = "0.15.0", path = "./resolvers/deno" }
node_resolver = { version = "0.22.0", path = "./resolvers/node" }
# workspace libraries
deno_lib = { version = "0.2.0", path = "./cli/lib" }
deno_npm_cache = { version = "0.5.0", path = "./resolvers/npm_cache" }
deno_resolver = { version = "0.17.0", path = "./resolvers/deno" }
deno_snapshots = { version = "0.1.0", path = "./cli/snapshot" }
node_resolver = { version = "0.24.0", path = "./resolvers/node" }
aes = "=0.8.3"
anyhow = "1.0.57"
@@ -120,7 +127,7 @@ dashmap = "5.5.3"
data-encoding = "2.3.3"
data-url = "=0.3.1"
deno_cache_dir = "=0.16.0"
deno_error = "=0.5.2"
deno_error = "=0.5.5"
deno_package_json = { version = "0.4.0", default-features = false }
deno_unsync = "0.4.2"
dlopen2 = "0.6.1"
@@ -151,6 +158,7 @@ ipnet = "2.3"
jsonc-parser = { version = "=0.26.2", features = ["serde"] }
lazy-regex = "3"
libc = "0.2.168"
libsui = "0.5.0"
libz-sys = { version = "1.1.20", default-features = false }
log = { version = "0.4.20", features = ["kv"] }
lsp-types = "=0.97.0" # used by tower-lsp and "proposed" feature is unstable in patch releases
@@ -194,7 +202,7 @@ slab = "0.4"
smallvec = "1.8"
socket2 = { version = "0.5.3", features = ["all"] }
spki = "0.7.2"
sys_traits = "=0.1.6"
sys_traits = "=0.1.7"
tar = "=0.4.40"
tempfile = "3.4.0"
termcolor = "1.1.3"
@@ -252,11 +260,6 @@ incremental = true
lto = true
opt-level = 'z' # Optimize for size
[profile.release-slim]
inherits = "release"
panic = "abort"
strip = "symbols"
# Build release with debug symbols: cargo build --profile=release-with-debug
[profile.release-with-debug]
inherits = "release"


@@ -6,8 +6,8 @@
<img align="right" src="https://deno.land/logo.svg" height="150px" alt="the deno mascot dinosaur standing in the rain">
[Deno](https://www.deno.com)
([/ˈdiːnoʊ/](http://ipa-reader.xyz/?text=%CB%88di%CB%90no%CA%8A), pronounced
[Deno](https://deno.com)
([/ˈdiːnoʊ/](https://ipa-reader.com/?text=%CB%88di%CB%90no%CA%8A), pronounced
`dee-no`) is a JavaScript, TypeScript, and WebAssembly runtime with secure
defaults and a great developer experience. It's built on [V8](https://v8.dev/),
[Rust](https://www.rust-lang.org/), and [Tokio](https://tokio.rs/).


@@ -6,6 +6,111 @@ https://github.com/denoland/deno/releases
We also have one-line install commands at:
https://github.com/denoland/deno_install
### 2.1.6 / 2025.01.16
- fix(check/lsp): correctly resolve compilerOptions.types (#27686)
- fix(check/lsp): fix bugs with tsc type resolution, allow npm packages to
augment `ImportMeta` (#27690)
- fix(compile): store embedded fs case sensitivity (#27653)
- fix(compile/windows): better handling of deno_dir on different drive letter
than code (#27654)
- fix(ext/console): change Temporal color (#27684)
- fix(ext/node): add `writev` method to `FileHandle` (#27563)
- fix(ext/node): add chown method to FileHandle class (#27638)
- fix(ext/node): apply `@npmcli/agent` workaround to `npm-check-updates`
(#27639)
- fix(ext/node): fix playwright http client (#27662)
- fix(ext/node): show bare-node-builtin hint when using an import map (#27632)
- fix(ext/node): use primordials in `ext/node/polyfills/_fs_common.ts` (#27589)
- fix(lsp): handle pathless untitled URIs (#27637)
- fix(lsp/check): don't resolve unknown media types to a `.js` extension
(#27631)
- fix(node): Prevent node:child_process from always inheriting the parent
environment (#27343) (#27340)
- fix(node/fs): add utimes method to the FileHandle class (#27582)
- fix(outdated): Use `latest` tag even when it's the same as the current version
(#27699)
- fix(outdated): retain strict semver specifier when updating (#27701)
### 2.1.5 / 2025.01.09
- feat(unstable): implement QUIC (#21942)
- feat(unstable): add JS linting plugin infrastructure (#27416)
- feat(unstable): add OTEL MeterProvider (#27240)
- feat(unstable): no config npm:@opentelemetry/api integration (#27541)
- feat(unstable): replace SpanExporter with TracerProvider (#27473)
- feat(unstable): support selectors in JS lint plugins (#27452)
- fix(check): line-break between diagnostic message chain entries (#27543)
- fix(check): move module not found errors to typescript diagnostics (#27533)
- fix(compile): analyze modules in directory specified in --include (#27296)
- fix(compile): be more deterministic when compiling the same code in different
directories (#27395)
- fix(compile): display embedded file sizes and total (#27360)
- fix(compile): output contents of embedded file system (#27302)
- fix(ext/fetch): better error message when body resource is unavailable
(#27429)
- fix(ext/fetch): retry some http/2 errors (#27417)
- fix(ext/fs): do not throw for bigint ctime/mtime/atime (#27453)
- fix(ext/http): improve error message when underlying resource of request body
unavailable (#27463)
- fix(ext/net): update moka cache to avoid potential panic in `Deno.resolveDns`
on some laptops with Ryzen CPU (#27572)
- fix(ext/node): fix `fs.access`/`fs.promises.access` with `X_OK` mode parameter
on Windows (#27407)
- fix(ext/node): fix `os.cpus()` on Linux (#27592)
- fix(ext/node): RangeError timingSafeEqual with different byteLength (#27470)
- fix(ext/node): add `truncate` method to the `FileHandle` class (#27389)
- fix(ext/node): add support of any length IV for aes-(128|256)-gcm ciphers
(#27476)
- fix(ext/node): convert brotli chunks with proper byte offset (#27455)
- fix(ext/node): do not exit worker thread when there is pending async op
(#27378)
- fix(ext/node): have `process` global available in Node context (#27562)
- fix(ext/node): make getCiphers return supported ciphers (#27466)
- fix(ext/node): sort list of built-in modules alphabetically (#27410)
- fix(ext/node): support createConnection option in node:http.request() (#25470)
- fix(ext/node): support private key export in JWK format (#27325)
- fix(ext/web): add `[[ErrorData]]` slot to `DOMException` (#27342)
- fix(ext/websocket): Fix close code without reason (#27578)
- fix(jsr): Wasm imports fail to load (#27594)
- fix(kv): improve backoff error message and inline documentation (#27537)
- fix(lint): fix single char selectors being ignored (#27576)
- fix(lockfile): include dependencies listed in external import map in lockfile
(#27337)
- fix(lsp): css preprocessor formatting (#27526)
- fix(lsp): don't skip dirs with enabled subdirs (#27580)
- fix(lsp): include "node:" prefix for node builtin auto-imports (#27404)
- fix(lsp): respect "typescript.suggestionActions.enabled" setting (#27373)
- fix(lsp): rewrite imports for 'Move to a new file' action (#27427)
- fix(lsp): sql and component file formatting (#27350)
- fix(lsp): use verbatim specifier for URL auto-imports (#27605)
- fix(no-slow-types): handle rest param with internal assignments (#27581)
- fix(node/fs): add a chmod method to the FileHandle class (#27522)
- fix(node): add missing `inspector/promises` (#27491)
- fix(node): handle cjs exports with escaped chars (#27438)
- fix(npm): deterministically output tags to initialized file (#27514)
- fix(npm): search node_modules folder for package matching npm specifier
(#27345)
- fix(outdated): ensure "Latest" version is greater than "Update" version
(#27390)
- fix(outdated): support updating dependencies in external import maps (#27339)
- fix(permissions): implicit `--allow-import` when using `--cached-only`
(#27530)
- fix(publish): infer literal types in const contexts (#27425)
- fix(task): properly handle task name wildcards with --recursive (#27396)
- fix(task): support tasks without commands (#27191)
- fix(unstable): don't error on non-existing attrs or type attr (#27456)
- fix: FastString v8_string() should error when it cannot allocate (#27375)
- fix: deno_resolver crate without 'sync' feature (#27403)
- fix: incorrect memory info free/available bytes on mac (#27460)
- fix: upgrade deno_doc to 0.161.3 (#27377)
- perf(fs/windows): stat - only open file once (#27487)
- perf(node/fs/copy): reduce metadata lookups copying directory (#27495)
- perf: don't store duplicate info for ops in the snapshot (#27430)
- perf: remove now needless canonicalization getting closest package.json
(#27437)
- perf: upgrade to deno_semver 0.7 (#27426)
### 2.1.4 / 2024.12.11
- feat(unstable): support caching npm dependencies only as they're needed


@@ -2,7 +2,7 @@
[package]
name = "deno_bench_util"
version = "0.178.0"
version = "0.180.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@@ -2,7 +2,7 @@
[package]
name = "deno"
version = "2.1.4"
version = "2.1.6"
authors.workspace = true
default-run = "deno"
edition.workspace = true
@@ -16,11 +16,6 @@ name = "deno"
path = "main.rs"
doc = false
[[bin]]
name = "denort"
path = "mainrt.rs"
doc = false
[[test]]
name = "integration"
path = "integration_tests_runner.rs"
@@ -49,7 +44,7 @@ dhat-heap = ["dhat"]
upgrade = []
# A dev feature to disable creations and loading of snapshots in favor of
# loading JS sources at runtime.
hmr = ["deno_runtime/hmr"]
hmr = ["deno_runtime/hmr", "deno_snapshots/disable"]
# Vendor zlib as zlib-ng
__vendored_zlib_ng = ["flate2/zlib-ng-compat", "libz-sys/zlib-ng"]
@@ -60,8 +55,11 @@ lazy-regex.workspace = true
serde.workspace = true
serde_json.workspace = true
zstd.workspace = true
glibc_version = "0.1.2"
flate2 = { workspace = true, features = ["default"] }
deno_error.workspace = true
[target.'cfg(unix)'.build-dependencies]
glibc_version = "0.1.2"
[target.'cfg(windows)'.build-dependencies]
winapi.workspace = true
@@ -72,9 +70,10 @@ deno_ast = { workspace = true, features = ["bundler", "cjs", "codegen", "proposa
deno_cache_dir.workspace = true
deno_config.workspace = true
deno_core = { workspace = true, features = ["include_js_files_for_snapshotting"] }
deno_doc = { version = "=0.161.3", features = ["rust", "comrak"] }
deno_doc = { version = "=0.164.0", features = ["rust", "comrak"] }
deno_error.workspace = true
deno_graph = { version = "=0.86.7" }
deno_graph = { version = "=0.87.0" }
deno_lib.workspace = true
deno_lint = { version = "=0.68.2", features = ["docs"] }
deno_lockfile.workspace = true
deno_npm.workspace = true
@@ -84,10 +83,11 @@ deno_path_util.workspace = true
deno_resolver = { workspace = true, features = ["sync"] }
deno_runtime = { workspace = true, features = ["include_js_files_for_snapshotting"] }
deno_semver.workspace = true
deno_snapshots = { workspace = true }
deno_task_shell = "=0.20.2"
deno_telemetry.workspace = true
deno_terminal.workspace = true
libsui = "0.5.0"
libsui.workspace = true
node_resolver.workspace = true
anstream = "0.6.14"
@@ -113,7 +113,6 @@ dprint-plugin-json = "=0.19.4"
dprint-plugin-jupyter = "=0.1.5"
dprint-plugin-markdown = "=0.17.8"
dprint-plugin-typescript = "=0.93.3"
env_logger = "=0.10.0"
fancy-regex = "=0.10.0"
faster-hex.workspace = true
# If you disable the default __vendored_zlib_ng feature above, you _must_ be able to link against `-lz`.
@@ -124,7 +123,7 @@ http.workspace = true
http-body.workspace = true
http-body-util.workspace = true
hyper-util.workspace = true
import_map = { version = "=0.20.1", features = ["ext"] }
import_map = { version = "=0.21.0", features = ["ext"] }
indexmap.workspace = true
jsonc-parser = { workspace = true, features = ["cst", "serde"] }
jupyter_runtime = { package = "runtimelib", version = "=0.19.0", features = ["tokio-runtime"] }
@@ -154,7 +153,6 @@ rustyline-derive = "=0.7.0"
serde.workspace = true
serde_repr.workspace = true
sha2.workspace = true
shell-escape = "=0.1.5"
spki = { version = "0.7", features = ["pem"] }
sqlformat = "=0.3.2"
strsim = "0.11.1"
@@ -183,6 +181,7 @@ winapi = { workspace = true, features = ["knownfolders", "mswsock", "objbase", "
[target.'cfg(unix)'.dependencies]
nix.workspace = true
shell-escape = "=0.1.5"
[dev-dependencies]
deno_bench_util.workspace = true


@@ -31,6 +31,9 @@ use deno_core::error::AnyError;
use deno_core::resolve_url_or_path;
use deno_core::url::Url;
use deno_graph::GraphKind;
use deno_lib::args::CaData;
use deno_lib::args::UnstableConfig;
use deno_lib::version::DENO_VERSION_INFO;
use deno_path_util::normalize_path;
use deno_path_util::url_to_file_path;
use deno_runtime::deno_permissions::SysDescriptor;
@@ -546,15 +549,6 @@ impl Default for TypeCheckMode {
}
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum CaData {
/// The string is a file path
File(String),
/// This variant is not exposed as an option in the CLI, it is used internally
/// for standalone binaries.
Bytes(Vec<u8>),
}
// Info needed to run NPM lifecycle scripts
#[derive(Clone, Debug, Eq, PartialEq, Default)]
pub struct LifecycleScriptsConfig {
@@ -582,19 +576,6 @@ fn parse_packages_allowed_scripts(s: &str) -> Result<String, AnyError> {
}
}
#[derive(
Clone, Default, Debug, Eq, PartialEq, serde::Serialize, serde::Deserialize,
)]
pub struct UnstableConfig {
// TODO(bartlomieju): remove in Deno 2.5
pub legacy_flag_enabled: bool, // --unstable
pub bare_node_builtins: bool,
pub detect_cjs: bool,
pub sloppy_imports: bool,
pub npm_lazy_caching: bool,
pub features: Vec<String>, // --unstable-kv --unstable-cron
}
#[derive(Clone, Debug, Eq, PartialEq, Default)]
pub struct InternalFlags {
/// Used when the language server is configured with an
@@ -1492,14 +1473,15 @@ fn handle_repl_flags(flags: &mut Flags, repl_flags: ReplFlags) {
}
pub fn clap_root() -> Command {
debug_assert_eq!(DENO_VERSION_INFO.typescript, deno_snapshots::TS_VERSION);
let long_version = format!(
"{} ({}, {}, {})\nv8 {}\ntypescript {}",
crate::version::DENO_VERSION_INFO.deno,
crate::version::DENO_VERSION_INFO.release_channel.name(),
DENO_VERSION_INFO.deno,
DENO_VERSION_INFO.release_channel.name(),
env!("PROFILE"),
env!("TARGET"),
deno_core::v8::VERSION_STRING,
crate::version::DENO_VERSION_INFO.typescript
DENO_VERSION_INFO.typescript
);
run_args(Command::new("deno"), true)
@@ -1515,7 +1497,7 @@ pub fn clap_root() -> Command {
)
.color(ColorChoice::Auto)
.term_width(800)
.version(crate::version::DENO_VERSION_INFO.deno)
.version(DENO_VERSION_INFO.deno)
.long_version(long_version)
.disable_version_flag(true)
.disable_help_flag(true)


@@ -10,6 +10,7 @@ use deno_core::error::AnyError;
use deno_core::parking_lot::Mutex;
use deno_core::parking_lot::MutexGuard;
use deno_core::serde_json;
use deno_error::JsErrorBox;
use deno_lockfile::Lockfile;
use deno_lockfile::WorkspaceMemberConfig;
use deno_package_json::PackageJsonDepValue;
@@ -59,6 +60,16 @@ impl<'a, T> std::ops::DerefMut for Guard<'a, T> {
}
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum AtomicWriteFileWithRetriesError {
#[class(inherit)]
#[error(transparent)]
Changed(JsErrorBox),
#[class(inherit)]
#[error("Failed writing lockfile")]
Io(#[source] std::io::Error),
}
impl CliLockfile {
/// Get the inner deno_lockfile::Lockfile.
pub fn lock(&self) -> Guard<Lockfile> {
@@ -78,12 +89,16 @@ impl CliLockfile {
self.lockfile.lock().overwrite
}
pub fn write_if_changed(&self) -> Result<(), AnyError> {
pub fn write_if_changed(
&self,
) -> Result<(), AtomicWriteFileWithRetriesError> {
if self.skip_write {
return Ok(());
}
self.error_if_changed()?;
self
.error_if_changed()
.map_err(AtomicWriteFileWithRetriesError::Changed)?;
let mut lockfile = self.lockfile.lock();
let Some(bytes) = lockfile.resolve_write_bytes() else {
return Ok(()); // nothing to do
@@ -96,7 +111,7 @@
&bytes,
cache::CACHE_PERM,
)
.context("Failed writing lockfile.")?;
.map_err(AtomicWriteFileWithRetriesError::Io)?;
lockfile.has_content_changed = false;
Ok(())
}
@@ -255,7 +270,7 @@
})
}
pub fn error_if_changed(&self) -> Result<(), AnyError> {
pub fn error_if_changed(&self) -> Result<(), JsErrorBox> {
if !self.frozen {
return Ok(());
}
@@ -267,9 +282,7 @@
let diff = crate::util::diff::diff(&contents, &new_contents);
// has an extra newline at the end
let diff = diff.trim_end();
Err(deno_core::anyhow::anyhow!(
"The lockfile is out of date. Run `deno install --frozen=false`, or rerun with `--frozen=false` to update it.\nchanges:\n{diff}"
))
Err(JsErrorBox::generic(format!("The lockfile is out of date. Run `deno install --frozen=false`, or rerun with `--frozen=false` to update it.\nchanges:\n{diff}")))
} else {
Ok(())
}
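
The lockfile hunks above replace `anyhow`-style errors with a typed `AtomicWriteFileWithRetriesError` enum, separating "lockfile changed while frozen" from I/O failures. A minimal std-only sketch of the same two-variant pattern, with simplified hypothetical names and hand-rolled `Display`/`Error` impls in place of the `thiserror`/`deno_error` derives used in the real code:

```rust
use std::fmt;

// Hypothetical stand-in for the typed lockfile write error: one variant for a
// frozen-lockfile mismatch, one wrapping the underlying I/O failure.
#[derive(Debug)]
enum WriteLockfileError {
    Changed(String),
    Io(std::io::Error),
}

impl fmt::Display for WriteLockfileError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            WriteLockfileError::Changed(diff) => {
                write!(f, "The lockfile is out of date.\nchanges:\n{diff}")
            }
            WriteLockfileError::Io(_) => write!(f, "Failed writing lockfile"),
        }
    }
}

impl std::error::Error for WriteLockfileError {
    // Chain the I/O cause so callers can inspect it, mirroring #[source].
    fn source(&self) -> Option<&(dyn std::error::Error + 'static)> {
        match self {
            WriteLockfileError::Changed(_) => None,
            WriteLockfileError::Io(err) => Some(err),
        }
    }
}

// Simplified analogue of write_if_changed(): error on a frozen lockfile
// whose resolved contents differ from what is on disk.
fn write_if_changed(frozen: bool, has_diff: bool) -> Result<(), WriteLockfileError> {
    if frozen && has_diff {
        return Err(WriteLockfileError::Changed("+ new-dep@1.0.0".to_string()));
    }
    Ok(())
}
```

Callers that previously matched on a generic `AnyError` can now match on the variants directly, which is what motivates the `.map_err(AtomicWriteFileWithRetriesError::Changed)` / `::Io` calls in the hunk above.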


@@ -8,13 +8,9 @@ mod lockfile;
mod package_json;
use std::borrow::Cow;
use std::collections::BTreeSet;
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::env;
use std::io::BufReader;
use std::io::Cursor;
use std::io::Read;
use std::io::Seek;
use std::net::SocketAddr;
use std::path::Path;
use std::path::PathBuf;
@@ -26,6 +22,7 @@ use deno_ast::SourceMapOption;
use deno_cache_dir::file_fetcher::CacheSetting;
pub use deno_config::deno_json::BenchConfig;
pub use deno_config::deno_json::ConfigFile;
use deno_config::deno_json::ConfigFileError;
use deno_config::deno_json::FmtConfig;
pub use deno_config::deno_json::FmtOptionsConfig;
use deno_config::deno_json::LintConfig;
@@ -57,6 +54,13 @@ use deno_core::serde_json;
use deno_core::url::Url;
use deno_graph::GraphKind;
pub use deno_json::check_warn_tsconfig;
use deno_lib::args::has_flag_env_var;
use deno_lib::args::npm_pkg_req_ref_to_binary_command;
use deno_lib::args::CaData;
use deno_lib::args::NpmProcessStateKind;
use deno_lib::args::NPM_PROCESS_STATE;
use deno_lib::version::DENO_VERSION_INFO;
use deno_lib::worker::StorageKeyResolver;
use deno_lint::linter::LintConfig as DenoLintConfig;
use deno_npm::npm_rc::NpmRc;
use deno_npm::npm_rc::ResolvedNpmRc;
@@ -64,27 +68,20 @@ use deno_npm::resolution::ValidSerializedNpmResolutionSnapshot;
use deno_npm::NpmSystemInfo;
use deno_path_util::normalize_path;
use deno_runtime::deno_permissions::PermissionsOptions;
use deno_runtime::deno_tls::deno_native_certs::load_native_certs;
use deno_runtime::deno_tls::rustls;
use deno_runtime::deno_tls::rustls::RootCertStore;
use deno_runtime::deno_tls::rustls_pemfile;
use deno_runtime::deno_tls::webpki_roots;
use deno_runtime::inspector_server::InspectorServer;
use deno_semver::npm::NpmPackageReqReference;
use deno_semver::StackString;
use deno_telemetry::OtelConfig;
use deno_telemetry::OtelRuntimeConfig;
use deno_terminal::colors;
use dotenvy::from_filename;
pub use flags::*;
use import_map::resolve_import_map_value_from_specifier;
pub use lockfile::AtomicWriteFileWithRetriesError;
pub use lockfile::CliLockfile;
pub use lockfile::CliLockfileReadFromPathOptions;
use once_cell::sync::Lazy;
pub use package_json::NpmInstallDepsProvider;
pub use package_json::PackageJsonDepValueParseWithLocationError;
use serde::Deserialize;
use serde::Serialize;
use sys_traits::EnvHomeDir;
use thiserror::Error;
@@ -92,7 +89,6 @@ use crate::cache::DenoDirProvider;
use crate::file_fetcher::CliFileFetcher;
use crate::sys::CliSys;
use crate::util::fs::canonicalize_path_maybe_not_exists;
use crate::version;
pub fn npm_registry_url() -> &'static Url {
static NPM_REGISTRY_DEFAULT_URL: Lazy<Url> = Lazy::new(|| {
@@ -553,146 +549,6 @@ pub fn create_default_npmrc() -> Arc<ResolvedNpmRc> {
})
}
#[derive(Error, Debug, Clone)]
pub enum RootCertStoreLoadError {
#[error(
"Unknown certificate store \"{0}\" specified (allowed: \"system,mozilla\")"
)]
UnknownStore(String),
#[error("Unable to add pem file to certificate store: {0}")]
FailedAddPemFile(String),
#[error("Failed opening CA file: {0}")]
CaFileOpenError(String),
}
/// Create and populate a root cert store based on the passed options and
/// environment.
pub fn get_root_cert_store(
maybe_root_path: Option<PathBuf>,
maybe_ca_stores: Option<Vec<String>>,
maybe_ca_data: Option<CaData>,
) -> Result<RootCertStore, RootCertStoreLoadError> {
let mut root_cert_store = RootCertStore::empty();
let ca_stores: Vec<String> = maybe_ca_stores
.or_else(|| {
let env_ca_store = env::var("DENO_TLS_CA_STORE").ok()?;
Some(
env_ca_store
.split(',')
.map(|s| s.trim().to_string())
.filter(|s| !s.is_empty())
.collect(),
)
})
.unwrap_or_else(|| vec!["mozilla".to_string()]);
for store in ca_stores.iter() {
match store.as_str() {
"mozilla" => {
root_cert_store.extend(webpki_roots::TLS_SERVER_ROOTS.to_vec());
}
"system" => {
let roots = load_native_certs().expect("could not load platform certs");
for root in roots {
if let Err(err) = root_cert_store
.add(rustls::pki_types::CertificateDer::from(root.0.clone()))
{
log::error!(
"{}",
colors::yellow(&format!(
"Unable to add system certificate to certificate store: {:?}",
err
))
);
let hex_encoded_root = faster_hex::hex_string(&root.0);
log::error!("{}", colors::gray(&hex_encoded_root));
}
}
}
_ => {
return Err(RootCertStoreLoadError::UnknownStore(store.clone()));
}
}
}
let ca_data =
maybe_ca_data.or_else(|| env::var("DENO_CERT").ok().map(CaData::File));
if let Some(ca_data) = ca_data {
let result = match ca_data {
CaData::File(ca_file) => {
let ca_file = if let Some(root) = &maybe_root_path {
root.join(&ca_file)
} else {
PathBuf::from(ca_file)
};
let certfile = std::fs::File::open(ca_file).map_err(|err| {
RootCertStoreLoadError::CaFileOpenError(err.to_string())
})?;
let mut reader = BufReader::new(certfile);
rustls_pemfile::certs(&mut reader).collect::<Result<Vec<_>, _>>()
}
CaData::Bytes(data) => {
let mut reader = BufReader::new(Cursor::new(data));
rustls_pemfile::certs(&mut reader).collect::<Result<Vec<_>, _>>()
}
};
match result {
Ok(certs) => {
root_cert_store.add_parsable_certificates(certs);
}
Err(e) => {
return Err(RootCertStoreLoadError::FailedAddPemFile(e.to_string()));
}
}
}
Ok(root_cert_store)
}
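
The removed `get_root_cert_store` above (relocated out of the CLI crate in this diff) derives its store list from an explicit option, falling back to the comma-separated `DENO_TLS_CA_STORE` env var and finally to `"mozilla"`. A small sketch of just that selection step, with the env lookup passed in as a parameter (a hypothetical simplification) so it is testable without touching the process environment:

```rust
// Sketch of the store-selection fallback: explicit flag value wins, else the
// comma-separated env var (trimmed, empty entries dropped), else ["mozilla"].
fn resolve_ca_stores(
    flag_stores: Option<Vec<String>>,
    env_value: Option<&str>,
) -> Vec<String> {
    flag_stores
        .or_else(|| {
            let env = env_value?;
            Some(
                env.split(',')
                    .map(|s| s.trim().to_string())
                    .filter(|s| !s.is_empty())
                    .collect(),
            )
        })
        .unwrap_or_else(|| vec!["mozilla".to_string()])
}
```

Each resulting entry is then dispatched on (`"mozilla"` loads the bundled webpki roots, `"system"` loads platform certs, anything else is `RootCertStoreLoadError::UnknownStore`), as in the removed code above.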
/// State provided to the process via an environment variable.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct NpmProcessState {
pub kind: NpmProcessStateKind,
pub local_node_modules_path: Option<String>,
}
#[derive(Clone, Debug, Serialize, Deserialize)]
pub enum NpmProcessStateKind {
Snapshot(deno_npm::resolution::SerializedNpmResolutionSnapshot),
Byonm,
}
static NPM_PROCESS_STATE: Lazy<Option<NpmProcessState>> = Lazy::new(|| {
use deno_runtime::ops::process::NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME;
let fd = std::env::var(NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME).ok()?;
std::env::remove_var(NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME);
let fd = fd.parse::<usize>().ok()?;
let mut file = {
use deno_runtime::deno_io::FromRawIoHandle;
unsafe { std::fs::File::from_raw_io_handle(fd as _) }
};
let mut buf = Vec::new();
// seek to beginning. after the file is written the position will be inherited by this subprocess,
// and also this file might have been read before
file.seek(std::io::SeekFrom::Start(0)).unwrap();
file
.read_to_end(&mut buf)
.inspect_err(|e| {
log::error!("failed to read npm process state from fd {fd}: {e}");
})
.ok()?;
let state: NpmProcessState = serde_json::from_slice(&buf)
.inspect_err(|e| {
log::error!(
"failed to deserialize npm process state: {e} {}",
String::from_utf8_lossy(&buf)
)
})
.ok()?;
Some(state)
});
/// Overrides for the options below that when set will
/// use these values over the values derived from the
/// CLI flags or config file.
@@ -701,15 +557,6 @@ struct CliOptionOverrides {
import_map_specifier: Option<Option<ModuleSpecifier>>,
}
/// Overrides for the options below that when set will
/// use these values over the values derived from the
/// CLI flags or config file.
#[derive(Debug, Clone)]
pub struct ScopeOptions {
pub scope: Option<Arc<ModuleSpecifier>>,
pub all_scopes: Arc<BTreeSet<Arc<ModuleSpecifier>>>,
}
fn load_external_import_map(
deno_json: &ConfigFile,
) -> Result<Option<(PathBuf, serde_json::Value)>, AnyError> {
@@ -737,11 +584,10 @@ pub struct CliOptions {
npmrc: Arc<ResolvedNpmRc>,
maybe_lockfile: Option<Arc<CliLockfile>>,
maybe_external_import_map: Option<(PathBuf, serde_json::Value)>,
sys: CliSys,
overrides: CliOptionOverrides,
pub start_dir: Arc<WorkspaceDirectory>,
pub all_dirs: BTreeMap<Arc<Url>, Arc<WorkspaceDirectory>>,
pub deno_dir_provider: Arc<DenoDirProvider>,
pub scope_options: Option<Arc<ScopeOptions>>,
}
impl CliOptions {
@@ -755,7 +601,6 @@ start_dir: Arc<WorkspaceDirectory>,
start_dir: Arc<WorkspaceDirectory>,
force_global_cache: bool,
maybe_external_import_map: Option<(PathBuf, serde_json::Value)>,
scope_options: Option<Arc<ScopeOptions>>,
) -> Result<Self, AnyError> {
if let Some(insecure_allowlist) =
flags.unsafely_ignore_certificate_errors.as_ref()
@@ -787,6 +632,9 @@
load_env_variables_from_env_file(flags.env_file.as_ref());
let all_dirs = [(start_dir.dir_url().clone(), start_dir.clone())]
.into_iter()
.collect();
Ok(Self {
flags,
initial_cwd,
@@ -797,9 +645,8 @@
main_module_cell: std::sync::OnceLock::new(),
maybe_external_import_map,
start_dir,
all_dirs,
deno_dir_provider,
sys: sys.clone(),
scope_options,
})
}
@@ -817,8 +664,6 @@
} else {
&[]
};
let config_parse_options =
deno_config::deno_json::ConfigParseOptions::default();
let discover_pkg_json = flags.config_flag != ConfigFlag::Disabled
&& !flags.no_npm
&& !has_flag_env_var("DENO_NO_PACKAGE_JSON");
@@ -829,7 +674,6 @@
deno_json_cache: None,
pkg_json_cache: Some(&node_resolver::PackageJsonThreadLocalCache),
workspace_cache: None,
config_parse_options,
additional_config_file_names,
discover_pkg_json,
maybe_vendor_override,
@@ -899,39 +743,18 @@
Arc::new(start_dir),
false,
external_import_map,
None,
)
}
pub fn with_new_start_dir_and_scope_options(
&self,
start_dir: Arc<WorkspaceDirectory>,
scope_options: Option<ScopeOptions>,
) -> Result<Self, AnyError> {
let (npmrc, _) = discover_npmrc_from_workspace(&start_dir.workspace)?;
let external_import_map =
if let Some(deno_json) = start_dir.workspace.root_deno_json() {
load_external_import_map(deno_json)?
} else {
None
};
let lockfile = CliLockfile::discover(
&self.sys,
&self.flags,
&start_dir.workspace,
external_import_map.as_ref().map(|(_, v)| v),
)?;
Self::new(
&self.sys,
self.flags.clone(),
self.initial_cwd().to_path_buf(),
lockfile.map(Arc::new),
npmrc,
start_dir,
false,
external_import_map,
scope_options.map(Arc::new),
)
pub fn with_all_dirs(
self,
all_dirs: impl IntoIterator<Item = Arc<WorkspaceDirectory>>,
) -> Self {
let all_dirs = all_dirs
.into_iter()
.map(|d| (d.dir_url().clone(), d))
.collect();
Self { all_dirs, ..self }
}
/// This method is purposefully verbose to discourage its use. Do not use it
@@ -1095,11 +918,11 @@
}
};
Ok(self.workspace().create_resolver(
&CliSys::default(),
CreateResolverOptions {
pkg_json_dep_resolution,
specified_import_map: cli_arg_specified_import_map,
},
|path| Ok(std::fs::read_to_string(path)?),
)?)
}
@@ -1223,6 +1046,16 @@
}
}
pub fn resolve_storage_key_resolver(&self) -> StorageKeyResolver {
if let Some(location) = &self.flags.location {
StorageKeyResolver::from_flag(location)
} else if let Some(deno_json) = self.start_dir.maybe_deno_json() {
StorageKeyResolver::from_config_file_url(&deno_json.specifier)
} else {
StorageKeyResolver::new_use_main_module()
}
}
// If the main module should be treated as being in an npm package.
// This is triggered via a secret environment variable which is used
// for functionality like child_process.fork. Users should NOT depend
@@ -1241,11 +1074,14 @@
pub fn node_modules_dir(
&self,
) -> Result<Option<NodeModulesDirMode>, AnyError> {
) -> Result<
Option<NodeModulesDirMode>,
deno_config::deno_json::NodeModulesDirParseError,
> {
if let Some(flag) = self.flags.node_modules_dir {
return Ok(Some(flag));
}
self.workspace().node_modules_dir().map_err(Into::into)
self.workspace().node_modules_dir()
}
pub fn vendor_dir_path(&self) -> Option<&PathBuf> {
@@ -1255,7 +1091,7 @@
pub fn resolve_ts_config_for_emit(
&self,
config_type: TsConfigType,
) -> Result<TsConfigForEmit, AnyError> {
) -> Result<TsConfigForEmit, ConfigFileError> {
self.start_dir.to_ts_config_for_emit(config_type)
}
@@ -1274,7 +1110,7 @@
Ok(Some(InspectorServer::new(
host,
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
)?))
}
@@ -1284,7 +1120,7 @@
pub fn to_compiler_option_types(
&self,
) -> Result<Vec<deno_graph::ReferrerImports>, AnyError> {
) -> Result<Vec<deno_graph::ReferrerImports>, serde_json::Error> {
self
.start_dir
.to_compiler_option_types()
@@ -2011,72 +1847,11 @@ fn resolve_import_map_specifier(
}
}
pub struct StorageKeyResolver(Option<Option<String>>);
impl StorageKeyResolver {
pub fn from_options(options: &CliOptions) -> Self {
Self(if let Some(location) = &options.flags.location {
// if a location is set, then the ascii serialization of the location is
// used, unless the origin is opaque, and then no storage origin is set, as
// we can't expect the origin to be reproducible
let storage_origin = location.origin();
if storage_origin.is_tuple() {
Some(Some(storage_origin.ascii_serialization()))
} else {
Some(None)
}
} else {
// otherwise we will use the path to the config file or None to
// fall back to using the main module's path
options
.start_dir
.maybe_deno_json()
.map(|config_file| Some(config_file.specifier.to_string()))
})
}
/// Creates a storage key resolver that will always resolve to being empty.
pub fn empty() -> Self {
Self(Some(None))
}
/// Resolves the storage key to use based on the current flags, config, or main module.
pub fn resolve_storage_key(
&self,
main_module: &ModuleSpecifier,
) -> Option<String> {
// use the stored value or fall back to using the path of the main module.
if let Some(maybe_value) = &self.0 {
maybe_value.clone()
} else {
Some(main_module.to_string())
}
}
}
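The removed `StorageKeyResolver` above encodes its state as a double `Option`: `None` falls back to the main module, `Some(None)` means an opaque origin with no storage key, and `Some(Some(key))` is an explicit key from a location or config file. A minimal sketch of that resolution step, using a plain string in place of the real `ModuleSpecifier` (names here are illustrative, not from the codebase):

```rust
/// Illustrative stand-in for the double-Option state in StorageKeyResolver:
/// - None            => fall back to the main module's path
/// - Some(None)      => opaque origin, no storage key
/// - Some(Some(key)) => explicit key from a --location flag or config file
fn resolve_storage_key(
    stored: &Option<Option<String>>,
    main_module: &str,
) -> Option<String> {
    match stored {
        // use the stored value when one was computed up front
        Some(maybe_value) => maybe_value.clone(),
        // otherwise fall back to the path of the main module
        None => Some(main_module.to_string()),
    }
}
```

This mirrors the three cases exercised by the removed `storage_key_resolver_test` further down in this diff.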
/// Resolves the no_prompt value based on the cli flags and environment.
pub fn resolve_no_prompt(flags: &PermissionFlags) -> bool {
flags.no_prompt || has_flag_env_var("DENO_NO_PROMPT")
}
pub fn has_trace_permissions_enabled() -> bool {
has_flag_env_var("DENO_TRACE_PERMISSIONS")
}
pub fn has_flag_env_var(name: &str) -> bool {
let value = env::var(name);
matches!(value.as_ref().map(|s| s.as_str()), Ok("1"))
}
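As `has_flag_env_var` above shows, boolean environment flags are enabled only by the exact value `"1"`. A tiny sketch of that check, decoupled from the process environment so it is easy to exercise:

```rust
// Mirrors the has_flag_env_var predicate: only the literal "1" turns a flag
// on; an unset variable or any other value (e.g. "true", "0") is off.
fn flag_enabled(value: Option<&str>) -> bool {
    matches!(value, Some("1"))
}
```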
pub fn npm_pkg_req_ref_to_binary_command(
req_ref: &NpmPackageReqReference,
) -> String {
req_ref
.sub_path()
.map(|s| s.to_string())
.unwrap_or_else(|| req_ref.req().name.to_string())
}
pub fn config_to_deno_graph_workspace_member(
config: &ConfigFile,
) -> Result<deno_graph::WorkspaceMember, AnyError> {
@@ -2137,13 +1912,6 @@ pub enum NpmCachingStrategy {
Manual,
}
pub fn otel_runtime_config() -> OtelRuntimeConfig {
OtelRuntimeConfig {
runtime_name: Cow::Borrowed("deno"),
runtime_version: Cow::Borrowed(crate::version::DENO_VERSION_INFO.deno),
}
}
#[cfg(test)]
mod test {
use pretty_assertions::assert_eq;
@@ -2158,12 +1926,7 @@ mod test {
let cwd = &std::env::current_dir().unwrap();
let config_specifier =
ModuleSpecifier::parse("file:///deno/deno.jsonc").unwrap();
let config_file = ConfigFile::new(
config_text,
config_specifier,
&deno_config::deno_json::ConfigParseOptions::default(),
)
.unwrap();
let config_file = ConfigFile::new(config_text, config_specifier).unwrap();
let actual = resolve_import_map_specifier(
Some("import-map.json"),
Some(&config_file),
@@ -2182,12 +1945,7 @@ mod test {
let config_text = r#"{}"#;
let config_specifier =
ModuleSpecifier::parse("file:///deno/deno.jsonc").unwrap();
let config_file = ConfigFile::new(
config_text,
config_specifier,
&deno_config::deno_json::ConfigParseOptions::default(),
)
.unwrap();
let config_file = ConfigFile::new(config_text, config_specifier).unwrap();
let actual = resolve_import_map_specifier(
None,
Some(&config_file),
@@ -2206,27 +1964,6 @@ mod test {
assert_eq!(actual, None);
}
#[test]
fn storage_key_resolver_test() {
let resolver = StorageKeyResolver(None);
let specifier = ModuleSpecifier::parse("file:///a.ts").unwrap();
assert_eq!(
resolver.resolve_storage_key(&specifier),
Some(specifier.to_string())
);
let resolver = StorageKeyResolver(Some(None));
assert_eq!(resolver.resolve_storage_key(&specifier), None);
let resolver = StorageKeyResolver(Some(Some("value".to_string())));
assert_eq!(
resolver.resolve_storage_key(&specifier),
Some("value".to_string())
);
// test empty
let resolver = StorageKeyResolver::empty();
assert_eq!(resolver.resolve_storage_key(&specifier), None);
}
#[test]
fn jsr_urls() {
let reg_url = jsr_url();


@@ -5,7 +5,6 @@ use std::path::PathBuf;
use deno_core::snapshot::*;
use deno_runtime::*;
mod shared;
mod ts {
use std::collections::HashMap;
@@ -13,10 +12,9 @@ mod ts {
use std::path::Path;
use std::path::PathBuf;
use deno_core::error::custom_error;
use deno_core::error::AnyError;
use deno_core::op2;
use deno_core::OpState;
use deno_error::JsErrorBox;
use serde::Serialize;
use super::*;
@@ -53,7 +51,7 @@ mod ts {
fn op_script_version(
_state: &mut OpState,
#[string] _arg: &str,
) -> Result<Option<String>, AnyError> {
) -> Result<Option<String>, JsErrorBox> {
Ok(Some("1".to_string()))
}
@@ -72,7 +70,7 @@ mod ts {
fn op_load(
state: &mut OpState,
#[string] load_specifier: &str,
) -> Result<LoadResponse, AnyError> {
) -> Result<LoadResponse, JsErrorBox> {
let op_crate_libs = state.borrow::<HashMap<&str, PathBuf>>();
let path_dts = state.borrow::<PathBuf>();
let re_asset = lazy_regex::regex!(r"asset:/{3}lib\.(\S+)\.d\.ts");
@@ -93,12 +91,15 @@ mod ts {
// if it comes from an op crate, we were supplied with the path to the
// file.
let path = if let Some(op_crate_lib) = op_crate_libs.get(lib) {
PathBuf::from(op_crate_lib).canonicalize()?
PathBuf::from(op_crate_lib)
.canonicalize()
.map_err(JsErrorBox::from_err)?
// otherwise we will generate the path ourself
} else {
path_dts.join(format!("lib.{lib}.d.ts"))
};
let data = std::fs::read_to_string(path)?;
let data =
std::fs::read_to_string(path).map_err(JsErrorBox::from_err)?;
Ok(LoadResponse {
data,
version: "1".to_string(),
@@ -106,13 +107,13 @@ mod ts {
script_kind: 3,
})
} else {
Err(custom_error(
Err(JsErrorBox::new(
"InvalidSpecifier",
format!("An invalid specifier was requested: {}", load_specifier),
))
}
} else {
Err(custom_error(
Err(JsErrorBox::new(
"InvalidSpecifier",
format!("An invalid specifier was requested: {}", load_specifier),
))
@@ -308,57 +309,6 @@ mod ts {
println!("cargo:rerun-if-changed={}", path.display());
}
}
pub(crate) fn version() -> String {
let file_text = std::fs::read_to_string("tsc/00_typescript.js").unwrap();
let version_text = " version = \"";
for line in file_text.lines() {
if let Some(index) = line.find(version_text) {
let remaining_line = &line[index + version_text.len()..];
return remaining_line[..remaining_line.find('"').unwrap()].to_string();
}
}
panic!("Could not find ts version.")
}
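The scan in the removed `version()` above can be exercised in isolation. A sketch of the same logic that returns an `Option` instead of panicking (the sample input in the test is made up, not the real tsc source):

```rust
// Find the first line containing ` version = "` and return the quoted value,
// the same scan ts::version() performs over tsc/00_typescript.js.
fn extract_version(file_text: &str) -> Option<String> {
    let version_text = " version = \"";
    for line in file_text.lines() {
        if let Some(index) = line.find(version_text) {
            let remaining = &line[index + version_text.len()..];
            // take everything up to the closing quote
            return remaining.find('"').map(|end| remaining[..end].to_string());
        }
    }
    None
}
```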
}
#[cfg(not(feature = "hmr"))]
fn create_cli_snapshot(snapshot_path: PathBuf) {
use deno_runtime::ops::bootstrap::SnapshotOptions;
let snapshot_options = SnapshotOptions {
ts_version: ts::version(),
v8_version: deno_core::v8::VERSION_STRING,
target: std::env::var("TARGET").unwrap(),
};
deno_runtime::snapshot::create_runtime_snapshot(
snapshot_path,
snapshot_options,
vec![],
);
}
fn git_commit_hash() -> String {
if let Ok(output) = std::process::Command::new("git")
.arg("rev-list")
.arg("-1")
.arg("HEAD")
.output()
{
if output.status.success() {
std::str::from_utf8(&output.stdout[..40])
.unwrap()
.to_string()
} else {
// When not in git repository
// (e.g. when the user install by `cargo install deno`)
"UNKNOWN".to_string()
}
} else {
// When there is no git command for some reason
"UNKNOWN".to_string()
}
}
fn main() {
@@ -368,7 +318,7 @@ fn main() {
}
deno_napi::print_linker_flags("deno");
deno_napi::print_linker_flags("denort");
deno_webgpu::print_linker_flags("deno");
// Host snapshots won't work when cross compiling.
let target = env::var("TARGET").unwrap();
@@ -387,51 +337,15 @@ fn main() {
}
println!("cargo:rerun-if-env-changed=DENO_CANARY");
println!("cargo:rustc-env=GIT_COMMIT_HASH={}", git_commit_hash());
println!("cargo:rerun-if-env-changed=GIT_COMMIT_HASH");
println!(
"cargo:rustc-env=GIT_COMMIT_HASH_SHORT={}",
&git_commit_hash()[..7]
);
let ts_version = ts::version();
debug_assert_eq!(ts_version, "5.6.2"); // bump this assertion when it changes
println!("cargo:rustc-env=TS_VERSION={}", ts_version);
println!("cargo:rerun-if-env-changed=TS_VERSION");
println!("cargo:rustc-env=TARGET={}", env::var("TARGET").unwrap());
println!("cargo:rustc-env=PROFILE={}", env::var("PROFILE").unwrap());
if cfg!(windows) {
// these dlls load slowly, so delay loading them
let dlls = [
// webgpu
"d3dcompiler_47",
"OPENGL32",
// network related functions
"iphlpapi",
];
for dll in dlls {
println!("cargo:rustc-link-arg-bin=deno=/delayload:{dll}.dll");
println!("cargo:rustc-link-arg-bin=denort=/delayload:{dll}.dll");
}
// enable delay loading
println!("cargo:rustc-link-arg-bin=deno=delayimp.lib");
println!("cargo:rustc-link-arg-bin=denort=delayimp.lib");
}
let c = PathBuf::from(env::var_os("CARGO_MANIFEST_DIR").unwrap());
let o = PathBuf::from(env::var_os("OUT_DIR").unwrap());
let compiler_snapshot_path = o.join("COMPILER_SNAPSHOT.bin");
ts::create_compiler_snapshot(compiler_snapshot_path, &c);
#[cfg(not(feature = "hmr"))]
{
let cli_snapshot_path = o.join("CLI_SNAPSHOT.bin");
create_cli_snapshot(cli_snapshot_path);
}
#[cfg(target_os = "windows")]
{
let mut res = winres::WindowsResource::new();


@@ -9,14 +9,13 @@ use deno_core::error::AnyError;
use deno_core::parking_lot::Mutex;
use deno_core::parking_lot::MutexGuard;
use deno_core::unsync::spawn_blocking;
use deno_lib::util::hash::FastInsecureHasher;
use deno_runtime::deno_webstorage::rusqlite;
use deno_runtime::deno_webstorage::rusqlite::Connection;
use deno_runtime::deno_webstorage::rusqlite::OptionalExtension;
use deno_runtime::deno_webstorage::rusqlite::Params;
use once_cell::sync::OnceCell;
use super::FastInsecureHasher;
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct CacheDBHash(u64);
@@ -25,12 +24,12 @@ impl CacheDBHash {
Self(hash)
}
pub fn from_source(source: impl std::hash::Hash) -> Self {
pub fn from_hashable(hashable: impl std::hash::Hash) -> Self {
Self::new(
// always write in the deno version just in case
// the clearing on deno version change doesn't work
FastInsecureHasher::new_deno_versioned()
.write_hashable(source)
.write_hashable(hashable)
.finish(),
)
}

cli/cache/caches.rs

@@ -3,17 +3,18 @@
use std::path::PathBuf;
use std::sync::Arc;
use deno_lib::version::DENO_VERSION_INFO;
use once_cell::sync::OnceCell;
use super::cache_db::CacheDB;
use super::cache_db::CacheDBConfiguration;
use super::check::TYPE_CHECK_CACHE_DB;
use super::code_cache::CODE_CACHE_DB;
use super::deno_dir::DenoDirProvider;
use super::fast_check::FAST_CHECK_CACHE_DB;
use super::incremental::INCREMENTAL_CACHE_DB;
use super::module_info::MODULE_INFO_CACHE_DB;
use super::node::NODE_ANALYSIS_CACHE_DB;
use crate::cache::DenoDirProvider;
pub struct Caches {
dir_provider: Arc<DenoDirProvider>,
@@ -48,13 +49,9 @@ impl Caches {
cell
.get_or_init(|| {
if let Some(path) = path {
CacheDB::from_path(
config,
path,
crate::version::DENO_VERSION_INFO.deno,
)
CacheDB::from_path(config, path, DENO_VERSION_INFO.deno)
} else {
CacheDB::in_memory(config, crate::version::DENO_VERSION_INFO.deno)
CacheDB::in_memory(config, DENO_VERSION_INFO.deno)
}
})
.clone()


@@ -1,7 +1,5 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::sync::Arc;
use deno_ast::ModuleSpecifier;
use deno_core::error::AnyError;
use deno_runtime::code_cache;
@@ -11,7 +9,6 @@ use super::cache_db::CacheDB;
use super::cache_db::CacheDBConfiguration;
use super::cache_db::CacheDBHash;
use super::cache_db::CacheFailure;
use crate::worker::CliCodeCache;
pub static CODE_CACHE_DB: CacheDBConfiguration = CacheDBConfiguration {
table_initializer: concat!(
@@ -85,12 +82,6 @@ impl CodeCache {
}
}
impl CliCodeCache for CodeCache {
fn as_code_cache(self: Arc<Self>) -> Arc<dyn code_cache::CodeCache> {
self
}
}
impl code_cache::CodeCache for CodeCache {
fn get_sync(
&self,


@@ -4,7 +4,6 @@ use std::env;
use std::path::PathBuf;
use deno_cache_dir::DenoDirResolutionError;
use once_cell::sync::OnceCell;
use super::DiskCache;
use crate::sys::CliSys;
@@ -14,7 +13,7 @@ use crate::sys::CliSys;
pub struct DenoDirProvider {
sys: CliSys,
maybe_custom_root: Option<PathBuf>,
deno_dir: OnceCell<Result<DenoDir, DenoDirResolutionError>>,
deno_dir: std::sync::OnceLock<Result<DenoDir, DenoDirResolutionError>>,
}
impl DenoDirProvider {


@@ -9,11 +9,11 @@ use std::path::Prefix;
use std::str;
use deno_cache_dir::url_to_filename;
use deno_cache_dir::CACHE_PERM;
use deno_core::url::Host;
use deno_core::url::Url;
use deno_path_util::fs::atomic_write_file_with_retries;
use super::CACHE_PERM;
use crate::sys::CliSys;
#[derive(Debug, Clone)]
@@ -130,6 +130,9 @@ impl DiskCache {
#[cfg(test)]
mod tests {
// ok, testing
#[allow(clippy::disallowed_types)]
use sys_traits::impls::RealSys;
use test_util::TempDir;
use super::*;
@@ -138,7 +141,7 @@ mod tests {
fn test_set_get_cache_file() {
let temp_dir = TempDir::new();
let sub_dir = temp_dir.path().join("sub_dir");
let cache = DiskCache::new(CliSys::default(), &sub_dir.to_path_buf());
let cache = DiskCache::new(RealSys, &sub_dir.to_path_buf());
let path = PathBuf::from("foo/bar.txt");
cache.set(&path, b"hello").unwrap();
assert_eq!(cache.get(&path).unwrap(), b"hello");
@@ -152,7 +155,7 @@ mod tests {
PathBuf::from("/deno_dir/")
};
let cache = DiskCache::new(CliSys::default(), &cache_location);
let cache = DiskCache::new(RealSys, &cache_location);
let mut test_cases = vec![
(
@@ -208,7 +211,7 @@ mod tests {
} else {
"/foo"
};
let cache = DiskCache::new(CliSys::default(), &PathBuf::from(p));
let cache = DiskCache::new(RealSys, &PathBuf::from(p));
let mut test_cases = vec![
(
@@ -256,7 +259,7 @@ mod tests {
PathBuf::from("/deno_dir/")
};
let cache = DiskCache::new(CliSys::default(), &cache_location);
let cache = DiskCache::new(RealSys, &cache_location);
let mut test_cases = vec!["unknown://localhost/test.ts"];

cli/cache/emit.rs

@@ -6,6 +6,7 @@ use deno_ast::ModuleSpecifier;
use deno_core::anyhow::anyhow;
use deno_core::error::AnyError;
use deno_core::unsync::sync::AtomicFlag;
use deno_lib::version::DENO_VERSION_INFO;
use super::DiskCache;
@@ -23,7 +24,7 @@ impl EmitCache {
disk_cache,
emit_failed_flag: Default::default(),
file_serializer: EmitFileSerializer {
cli_version: crate::version::DENO_VERSION_INFO.deno,
cli_version: DENO_VERSION_INFO.deno,
},
}
}
@@ -147,7 +148,7 @@ impl EmitFileSerializer {
// it's ok to use an insecure hash here because
// if someone can change the emit source then they
// can also change the version hash
crate::cache::FastInsecureHasher::new_without_deno_version() // use cli_version property instead
deno_lib::util::hash::FastInsecureHasher::new_without_deno_version() // use cli_version property instead
.write(bytes)
// emit should not be re-used between cli versions
.write_str(self.cli_version)


@@ -34,12 +34,16 @@ pub static INCREMENTAL_CACHE_DB: CacheDBConfiguration = CacheDBConfiguration {
pub struct IncrementalCache(IncrementalCacheInner);
impl IncrementalCache {
pub fn new<TState: std::hash::Hash>(
pub fn new(
db: CacheDB,
state: &TState,
state_hash: CacheDBHash,
initial_file_paths: &[PathBuf],
) -> Self {
IncrementalCache(IncrementalCacheInner::new(db, state, initial_file_paths))
IncrementalCache(IncrementalCacheInner::new(
db,
state_hash,
initial_file_paths,
))
}
pub fn is_file_same(&self, file_path: &Path, file_text: &str) -> bool {
@@ -67,12 +71,11 @@ struct IncrementalCacheInner {
}
impl IncrementalCacheInner {
pub fn new<TState: std::hash::Hash>(
pub fn new(
db: CacheDB,
state: &TState,
state_hash: CacheDBHash,
initial_file_paths: &[PathBuf],
) -> Self {
let state_hash = CacheDBHash::from_source(state);
let sql_cache = SqlIncrementalCache::new(db, state_hash);
Self::from_sql_incremental_cache(sql_cache, initial_file_paths)
}
@@ -112,13 +115,13 @@ impl IncrementalCacheInner {
pub fn is_file_same(&self, file_path: &Path, file_text: &str) -> bool {
match self.previous_hashes.get(file_path) {
Some(hash) => *hash == CacheDBHash::from_source(file_text),
Some(hash) => *hash == CacheDBHash::from_hashable(file_text),
None => false,
}
}
pub fn update_file(&self, file_path: &Path, file_text: &str) {
let hash = CacheDBHash::from_source(file_text);
let hash = CacheDBHash::from_hashable(file_text);
if let Some(previous_hash) = self.previous_hashes.get(file_path) {
if *previous_hash == hash {
return; // do not bother updating the db file because nothing has changed
@@ -262,7 +265,7 @@ mod test {
let sql_cache = SqlIncrementalCache::new(conn, CacheDBHash::new(1));
let file_path = PathBuf::from("/mod.ts");
let file_text = "test";
let file_hash = CacheDBHash::from_source(file_text);
let file_hash = CacheDBHash::from_hashable(file_text);
sql_cache.set_source_hash(&file_path, file_hash).unwrap();
let cache = IncrementalCacheInner::from_sql_incremental_cache(
sql_cache,
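The `from_hashable` rename above leaves the incremental cache's core trick intact: compare a stored content hash against a freshly computed one and skip the database write when nothing changed. A minimal sketch with `DefaultHasher` standing in for the real `FastInsecureHasher` (the struct name is illustrative):

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

// Hash the file text, the role CacheDBHash::from_hashable plays above.
fn hash_text(text: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    text.hash(&mut hasher);
    hasher.finish()
}

/// Remembers one hash per file path, like IncrementalCacheInner's
/// previous_hashes map.
#[derive(Default)]
struct IncrementalCacheSketch {
    previous_hashes: HashMap<String, u64>,
}

impl IncrementalCacheSketch {
    fn is_file_same(&self, path: &str, text: &str) -> bool {
        self.previous_hashes.get(path).copied() == Some(hash_text(text))
    }

    fn update_file(&mut self, path: &str, text: &str) {
        let hash = hash_text(text);
        // do not bother updating when nothing has changed
        if self.previous_hashes.get(path).copied() == Some(hash) {
            return;
        }
        self.previous_hashes.insert(path.to_string(), hash);
    }
}
```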

cli/cache/mod.rs

@@ -8,7 +8,6 @@ use deno_ast::MediaType;
use deno_cache_dir::file_fetcher::CacheSetting;
use deno_cache_dir::file_fetcher::FetchNoFollowErrorKind;
use deno_cache_dir::file_fetcher::FileOrRedirect;
use deno_core::error::AnyError;
use deno_core::futures;
use deno_core::futures::FutureExt;
use deno_core::ModuleSpecifier;
@@ -16,6 +15,7 @@ use deno_graph::source::CacheInfo;
use deno_graph::source::LoadFuture;
use deno_graph::source::LoadResponse;
use deno_graph::source::Loader;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_runtime::deno_permissions::PermissionsContainer;
use node_resolver::InNpmPackageChecker;
@@ -30,7 +30,6 @@ mod cache_db;
mod caches;
mod check;
mod code_cache;
mod common;
mod deno_dir;
mod disk_cache;
mod emit;
@@ -44,7 +43,6 @@ pub use cache_db::CacheDBHash;
pub use caches::Caches;
pub use check::TypeCheckCache;
pub use code_cache::CodeCache;
pub use common::FastInsecureHasher;
/// Permissions used to save a file in the disk caches.
pub use deno_cache_dir::CACHE_PERM;
pub use deno_dir::DenoDir;
@@ -62,6 +60,7 @@ pub type GlobalHttpCache = deno_cache_dir::GlobalHttpCache<CliSys>;
pub type LocalHttpCache = deno_cache_dir::LocalHttpCache<CliSys>;
pub type LocalLspHttpCache = deno_cache_dir::LocalLspHttpCache<CliSys>;
pub use deno_cache_dir::HttpCache;
use deno_error::JsErrorBox;
pub struct FetchCacherOptions {
pub file_header_overrides: HashMap<ModuleSpecifier, HashMap<String, String>>,
@@ -76,7 +75,7 @@ pub struct FetchCacher {
pub file_header_overrides: HashMap<ModuleSpecifier, HashMap<String, String>>,
file_fetcher: Arc<CliFileFetcher>,
global_http_cache: Arc<GlobalHttpCache>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
module_info_cache: Arc<ModuleInfoCache>,
permissions: PermissionsContainer,
sys: CliSys,
@@ -88,7 +87,7 @@ impl FetchCacher {
pub fn new(
file_fetcher: Arc<CliFileFetcher>,
global_http_cache: Arc<GlobalHttpCache>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
module_info_cache: Arc<ModuleInfoCache>,
sys: CliSys,
options: FetchCacherOptions,
@@ -194,9 +193,9 @@ impl Loader for FetchCacher {
LoaderCacheSetting::Use => None,
LoaderCacheSetting::Reload => {
if matches!(file_fetcher.cache_setting(), CacheSetting::Only) {
return Err(deno_core::anyhow::anyhow!(
return Err(deno_graph::source::LoadError::Other(Arc::new(JsErrorBox::generic(
"Could not resolve version constraint using only cached data. Try running again without --cached-only"
));
))));
}
Some(CacheSetting::ReloadAll)
}
@@ -262,28 +261,27 @@ impl Loader for FetchCacher {
FetchNoFollowErrorKind::CacheSave { .. } |
FetchNoFollowErrorKind::UnsupportedScheme { .. } |
FetchNoFollowErrorKind::RedirectHeaderParse { .. } |
FetchNoFollowErrorKind::InvalidHeader { .. } => Err(AnyError::from(err)),
FetchNoFollowErrorKind::InvalidHeader { .. } => Err(deno_graph::source::LoadError::Other(Arc::new(JsErrorBox::from_err(err)))),
FetchNoFollowErrorKind::NotCached { .. } => {
if options.cache_setting == LoaderCacheSetting::Only {
Ok(None)
} else {
Err(AnyError::from(err))
Err(deno_graph::source::LoadError::Other(Arc::new(JsErrorBox::from_err(err))))
}
},
FetchNoFollowErrorKind::ChecksumIntegrity(err) => {
// convert to the equivalent deno_graph error so that it
// enhances it if this is passed to deno_graph
Err(
deno_graph::source::ChecksumIntegrityError {
deno_graph::source::LoadError::ChecksumIntegrity(deno_graph::source::ChecksumIntegrityError {
actual: err.actual,
expected: err.expected,
}
.into(),
}),
)
}
}
},
CliFetchNoFollowErrorKind::PermissionCheck(permission_check_error) => Err(AnyError::from(permission_check_error)),
CliFetchNoFollowErrorKind::PermissionCheck(permission_check_error) => Err(deno_graph::source::LoadError::Other(Arc::new(JsErrorBox::from_err(permission_check_error)))),
}
})
}
@@ -298,7 +296,7 @@ impl Loader for FetchCacher {
module_info: &deno_graph::ModuleInfo,
) {
log::debug!("Caching module info for {}", specifier);
let source_hash = CacheDBHash::from_source(source);
let source_hash = CacheDBHash::from_hashable(source);
let result = self.module_info_cache.set_module_info(
specifier,
media_type,


@@ -194,7 +194,7 @@ impl<'a> ModuleInfoCacheModuleAnalyzer<'a> {
source: &Arc<str>,
) -> Result<ModuleInfo, deno_ast::ParseDiagnostic> {
// attempt to load from the cache
let source_hash = CacheDBHash::from_source(source);
let source_hash = CacheDBHash::from_hashable(source);
if let Some(info) =
self.load_cached_module_info(specifier, media_type, source_hash)
{
@@ -228,7 +228,7 @@ impl<'a> deno_graph::ModuleAnalyzer for ModuleInfoCacheModuleAnalyzer<'a> {
media_type: MediaType,
) -> Result<ModuleInfo, deno_ast::ParseDiagnostic> {
// attempt to load from the cache
let source_hash = CacheDBHash::from_source(&source);
let source_hash = CacheDBHash::from_hashable(&source);
if let Some(info) =
self.load_cached_module_info(specifier, media_type, source_hash)
{


@@ -11,22 +11,24 @@ use deno_ast::SourceRangedForSpanned;
use deno_ast::TranspileModuleOptions;
use deno_ast::TranspileResult;
use deno_core::error::AnyError;
use deno_core::error::CoreError;
use deno_core::futures::stream::FuturesUnordered;
use deno_core::futures::FutureExt;
use deno_core::futures::StreamExt;
use deno_core::ModuleSpecifier;
use deno_error::JsErrorBox;
use deno_graph::MediaType;
use deno_graph::Module;
use deno_graph::ModuleGraph;
use deno_lib::util::hash::FastInsecureHasher;
use crate::cache::EmitCache;
use crate::cache::FastInsecureHasher;
use crate::cache::ParsedSourceCache;
use crate::resolver::CjsTracker;
use crate::resolver::CliCjsTracker;
#[derive(Debug)]
pub struct Emitter {
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
emit_cache: Arc<EmitCache>,
parsed_source_cache: Arc<ParsedSourceCache>,
transpile_and_emit_options:
@@ -37,7 +39,7 @@ pub struct Emitter {
impl Emitter {
pub fn new(
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
emit_cache: Arc<EmitCache>,
parsed_source_cache: Arc<ParsedSourceCache>,
transpile_options: deno_ast::TranspileOptions,
@@ -110,9 +112,9 @@ impl Emitter {
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
module_kind: deno_ast::ModuleKind,
module_kind: ModuleKind,
source: &Arc<str>,
) -> Result<String, AnyError> {
) -> Result<String, EmitParsedSourceHelperError> {
// Note: keep this in sync with the sync version below
let helper = EmitParsedSourceHelper(self);
match helper.pre_emit_parsed_source(specifier, module_kind, source) {
@@ -124,7 +126,7 @@ impl Emitter {
let transpiled_source = deno_core::unsync::spawn_blocking({
let specifier = specifier.clone();
let source = source.clone();
move || -> Result<_, AnyError> {
move || {
EmitParsedSourceHelper::transpile(
&parsed_source_cache,
&specifier,
@@ -155,7 +157,7 @@ impl Emitter {
media_type: MediaType,
module_kind: deno_ast::ModuleKind,
source: &Arc<str>,
) -> Result<String, AnyError> {
) -> Result<String, EmitParsedSourceHelperError> {
// Note: keep this in sync with the async version above
let helper = EmitParsedSourceHelper(self);
match helper.pre_emit_parsed_source(specifier, module_kind, source) {
@@ -210,7 +212,7 @@ impl Emitter {
pub async fn load_and_emit_for_hmr(
&self,
specifier: &ModuleSpecifier,
) -> Result<String, AnyError> {
) -> Result<String, CoreError> {
let media_type = MediaType::from_specifier(specifier);
let source_code = tokio::fs::read_to_string(
ModuleSpecifier::to_file_path(specifier).unwrap(),
@@ -225,17 +227,21 @@ impl Emitter {
let source_arc: Arc<str> = source_code.into();
let parsed_source = self
.parsed_source_cache
.remove_or_parse_module(specifier, source_arc, media_type)?;
.remove_or_parse_module(specifier, source_arc, media_type)
.map_err(JsErrorBox::from_err)?;
// HMR doesn't work with embedded source maps for some reason, so set
// the option to not use them (though you should test this out because
// this statement is probably wrong)
let mut options = self.transpile_and_emit_options.1.clone();
options.source_map = SourceMapOption::None;
let is_cjs = self.cjs_tracker.is_cjs_with_known_is_script(
let is_cjs = self
.cjs_tracker
.is_cjs_with_known_is_script(
specifier,
media_type,
parsed_source.compute_is_script(),
)?;
)
.map_err(JsErrorBox::from_err)?;
let transpiled_source = parsed_source
.transpile(
&self.transpile_and_emit_options.0,
@@ -243,7 +249,8 @@ impl Emitter {
module_kind: Some(ModuleKind::from_is_cjs(is_cjs)),
},
&options,
)?
)
.map_err(JsErrorBox::from_err)?
.into_source();
Ok(transpiled_source.text)
}
@@ -282,6 +289,19 @@ enum PreEmitResult {
NotCached { source_hash: u64 },
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum EmitParsedSourceHelperError {
#[class(inherit)]
#[error(transparent)]
ParseDiagnostic(#[from] deno_ast::ParseDiagnostic),
#[class(inherit)]
#[error(transparent)]
Transpile(#[from] deno_ast::TranspileError),
#[class(inherit)]
#[error(transparent)]
Other(#[from] JsErrorBox),
}
/// Helper to share code between async and sync emit_parsed_source methods.
struct EmitParsedSourceHelper<'a>(&'a Emitter);
@@ -311,7 +331,7 @@ impl<'a> EmitParsedSourceHelper<'a> {
source: Arc<str>,
transpile_options: &deno_ast::TranspileOptions,
emit_options: &deno_ast::EmitOptions,
) -> Result<EmittedSourceText, AnyError> {
) -> Result<EmittedSourceText, EmitParsedSourceHelperError> {
// nothing else needs the parsed source at this point, so remove from
// the cache in order to not transpile owned
let parsed_source = parsed_source_cache
@@ -351,7 +371,7 @@ impl<'a> EmitParsedSourceHelper<'a> {
// todo(dsherret): this is a temporary measure until we have swc erroring for this
fn ensure_no_import_assertion(
parsed_source: &deno_ast::ParsedSource,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
fn has_import_assertion(text: &str) -> bool {
// good enough
text.contains(" assert ") && !text.contains(" with ")
@@ -360,7 +380,7 @@ fn ensure_no_import_assertion(
fn create_err(
parsed_source: &deno_ast::ParsedSource,
range: SourceRange,
) -> AnyError {
) -> JsErrorBox {
let text_info = parsed_source.text_info_lazy();
let loc = text_info.line_and_column_display(range.start);
let mut msg = "Import assertions are deprecated. Use `with` keyword, instead of 'assert' keyword.".to_string();
@@ -373,7 +393,7 @@ fn ensure_no_import_assertion(
loc.line_number,
loc.column_number,
));
deno_core::anyhow::anyhow!("{}", msg)
JsErrorBox::generic(msg)
}
let deno_ast::ProgramRef::Module(module) = parsed_source.program_ref() else {


@@ -1,123 +0,0 @@
// Copyright 2018-2025 the Deno authors. MIT license.
//! There are many types of errors in Deno:
//! - AnyError: a generic wrapper that can encapsulate any type of error.
//! - JsError: a container for the error message and stack trace for exceptions
//! thrown in JavaScript code. We use this to pretty-print stack traces.
//! - Diagnostic: these are errors that originate in TypeScript's compiler.
//! They're similar to JsError, in that they have line numbers. But
//! Diagnostics are compile-time type errors, whereas JsErrors are runtime
//! exceptions.
use deno_ast::ParseDiagnostic;
use deno_core::error::AnyError;
use deno_graph::source::ResolveError;
use deno_graph::ModuleError;
use deno_graph::ModuleGraphError;
use deno_graph::ModuleLoadError;
use deno_graph::ResolutionError;
use import_map::ImportMapError;
fn get_import_map_error_class(_: &ImportMapError) -> &'static str {
"URIError"
}
fn get_diagnostic_class(_: &ParseDiagnostic) -> &'static str {
"SyntaxError"
}
pub fn get_module_graph_error_class(err: &ModuleGraphError) -> &'static str {
match err {
ModuleGraphError::ResolutionError(err)
| ModuleGraphError::TypesResolutionError(err) => {
get_resolution_error_class(err)
}
ModuleGraphError::ModuleError(err) => get_module_error_class(err),
}
}
pub fn get_module_error_class(err: &ModuleError) -> &'static str {
use deno_graph::JsrLoadError;
use deno_graph::NpmLoadError;
match err {
ModuleError::InvalidTypeAssertion { .. } => "SyntaxError",
ModuleError::ParseErr(_, diagnostic) => get_diagnostic_class(diagnostic),
ModuleError::WasmParseErr(..) => "SyntaxError",
ModuleError::UnsupportedMediaType { .. }
| ModuleError::UnsupportedImportAttributeType { .. } => "TypeError",
ModuleError::Missing(_, _) | ModuleError::MissingDynamic(_, _) => {
"NotFound"
}
ModuleError::LoadingErr(_, _, err) => match err {
ModuleLoadError::Loader(err) => get_error_class_name(err.as_ref()),
ModuleLoadError::HttpsChecksumIntegrity(_)
| ModuleLoadError::TooManyRedirects => "Error",
ModuleLoadError::NodeUnknownBuiltinModule(_) => "NotFound",
ModuleLoadError::Decode(_) => "TypeError",
ModuleLoadError::Npm(err) => match err {
NpmLoadError::NotSupportedEnvironment
| NpmLoadError::PackageReqResolution(_)
| NpmLoadError::RegistryInfo(_) => "Error",
NpmLoadError::PackageReqReferenceParse(_) => "TypeError",
},
ModuleLoadError::Jsr(err) => match err {
JsrLoadError::UnsupportedManifestChecksum
| JsrLoadError::PackageFormat(_) => "TypeError",
JsrLoadError::ContentLoadExternalSpecifier
| JsrLoadError::ContentLoad(_)
| JsrLoadError::ContentChecksumIntegrity(_)
| JsrLoadError::PackageManifestLoad(_, _)
| JsrLoadError::PackageVersionManifestChecksumIntegrity(..)
| JsrLoadError::PackageVersionManifestLoad(_, _)
| JsrLoadError::RedirectInPackage(_) => "Error",
JsrLoadError::PackageNotFound(_)
| JsrLoadError::PackageReqNotFound(_)
| JsrLoadError::PackageVersionNotFound(_)
| JsrLoadError::UnknownExport { .. } => "NotFound",
},
},
}
}
fn get_resolution_error_class(err: &ResolutionError) -> &'static str {
match err {
ResolutionError::ResolverError { error, .. } => {
use ResolveError::*;
match error.as_ref() {
Specifier(_) => "TypeError",
Other(e) => get_error_class_name(e),
}
}
_ => "TypeError",
}
}
fn get_try_from_int_error_class(_: &std::num::TryFromIntError) -> &'static str {
"TypeError"
}
pub fn get_error_class_name(e: &AnyError) -> &'static str {
deno_runtime::errors::get_error_class_name(e)
.or_else(|| {
e.downcast_ref::<ImportMapError>()
.map(get_import_map_error_class)
})
.or_else(|| {
e.downcast_ref::<ParseDiagnostic>()
.map(get_diagnostic_class)
})
.or_else(|| {
e.downcast_ref::<ModuleGraphError>()
.map(get_module_graph_error_class)
})
.or_else(|| {
e.downcast_ref::<ResolutionError>()
.map(get_resolution_error_class)
})
.or_else(|| {
e.downcast_ref::<std::num::TryFromIntError>()
.map(get_try_from_int_error_class)
})
.unwrap_or("Error")
}
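The deleted module above is essentially one big mapping from error values to static JavaScript class names, with `"Error"` as the fallback. A reduced sketch of that shape (the enum is illustrative; the real code matches on `deno_graph` error types):

```rust
// Each error category resolves to a static class name, the same shape as the
// removed get_module_error_class / get_error_class_name functions.
enum SketchError {
    InvalidTypeAssertion,
    UnsupportedMediaType,
    Missing,
    TooManyRedirects,
}

fn get_error_class_name(err: &SketchError) -> &'static str {
    match err {
        SketchError::InvalidTypeAssertion => "SyntaxError",
        SketchError::UnsupportedMediaType => "TypeError",
        SketchError::Missing => "NotFound",
        // the catch-all class for remaining load failures
        SketchError::TooManyRedirects => "Error",
    }
}
```

After this PR, the class name travels with the error itself via `deno_error::JsErrorClass`/`JsErrorBox` instead of being recovered by downcasting an `AnyError`.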

File diff suppressed because it is too large


@@ -13,7 +13,6 @@ use deno_runtime::deno_permissions::PermissionsContainer;
use crate::args::CliOptions;
use crate::module_loader::ModuleLoadPreparer;
use crate::tools::check::MaybeDiagnostics;
use crate::util::fs::collect_specifiers;
use crate::util::path::is_script_ext;
@@ -70,7 +69,7 @@ impl MainModuleGraphContainer {
&self,
specifiers: &[ModuleSpecifier],
ext_overwrite: Option<&String>,
) -> Result<(), MaybeDiagnostics> {
) -> Result<(), AnyError> {
let mut graph_permit = self.acquire_update_permit().await;
let graph = graph_permit.graph_mut();
self
@@ -100,7 +99,7 @@ impl MainModuleGraphContainer {
log::warn!("{} No matching files found.", colors::yellow("Warning"));
}
Ok(self.check_specifiers(&specifiers, None).await?)
self.check_specifiers(&specifiers, None).await
}
pub fn collect_specifiers(


@@ -1,17 +1,19 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::error::Error;
use std::ops::Deref;
use std::path::PathBuf;
use std::sync::Arc;
use deno_config::deno_json;
use deno_config::deno_json::JsxImportSourceConfig;
use deno_config::deno_json::NodeModulesDirMode;
use deno_config::workspace::JsrPackageConfig;
use deno_core::anyhow::bail;
use deno_core::error::custom_error;
use deno_core::error::AnyError;
use deno_core::parking_lot::Mutex;
use deno_core::serde_json;
use deno_core::ModuleSpecifier;
use deno_error::JsErrorBox;
use deno_error::JsErrorClass;
use deno_graph::source::Loader;
use deno_graph::source::LoaderChecksum;
use deno_graph::source::ResolutionKind;
@ -26,14 +28,14 @@ use deno_graph::ModuleLoadError;
use deno_graph::ResolutionError;
use deno_graph::SpecifierError;
use deno_graph::WorkspaceFastCheckOption;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_resolver::sloppy_imports::SloppyImportsCachedFs;
use deno_resolver::sloppy_imports::SloppyImportsResolutionKind;
use deno_runtime::deno_node;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_semver::jsr::JsrDepPackageReq;
use deno_semver::package::PackageNv;
use deno_semver::SmallStackString;
use import_map::ImportMapError;
use node_resolver::InNpmPackageChecker;
use crate::args::config_to_deno_graph_workspace_member;
use crate::args::jsr_url;
@ -47,16 +49,17 @@ use crate::cache::GlobalHttpCache;
use crate::cache::ModuleInfoCache;
use crate::cache::ParsedSourceCache;
use crate::colors;
use crate::errors::get_error_class_name;
use crate::errors::get_module_graph_error_class;
use crate::file_fetcher::CliFileFetcher;
use crate::npm::installer::NpmInstaller;
use crate::npm::installer::PackageCaching;
use crate::npm::CliNpmResolver;
use crate::resolver::CjsTracker;
use crate::resolver::CliCjsTracker;
use crate::resolver::CliNpmGraphResolver;
use crate::resolver::CliResolver;
use crate::resolver::CliSloppyImportsResolver;
use crate::resolver::SloppyImportsCachedFs;
use crate::sys::CliSys;
use crate::tools::check;
use crate::tools::check::CheckError;
use crate::tools::check::TypeChecker;
use crate::util::file_watcher::WatcherCommunicator;
@ -82,7 +85,7 @@ pub fn graph_valid(
sys: &CliSys,
roots: &[ModuleSpecifier],
options: GraphValidOptions,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
if options.exit_integrity_errors {
graph_exit_integrity_errors(graph);
}
@ -101,9 +104,9 @@ pub fn graph_valid(
} else {
// finally surface the npm resolution result
if let Err(err) = &graph.npm_dep_graph_result {
return Err(custom_error(
get_error_class_name(err),
format_deno_graph_error(err.as_ref().deref()),
return Err(JsErrorBox::new(
err.get_class(),
format_deno_graph_error(err),
));
}
Ok(())
@ -142,7 +145,7 @@ pub fn graph_walk_errors<'a>(
sys: &'a CliSys,
roots: &'a [ModuleSpecifier],
options: GraphWalkErrorsOptions,
) -> impl Iterator<Item = AnyError> + 'a {
) -> impl Iterator<Item = JsErrorBox> + 'a {
graph
.walk(
roots.iter(),
@ -194,7 +197,7 @@ pub fn graph_walk_errors<'a>(
return None;
}
Some(custom_error(get_module_graph_error_class(&error), message))
Some(JsErrorBox::new(error.get_class(), message))
})
}
@ -261,7 +264,7 @@ pub struct CreateGraphOptions<'a> {
pub struct ModuleGraphCreator {
options: Arc<CliOptions>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_installer: Option<Arc<NpmInstaller>>,
module_graph_builder: Arc<ModuleGraphBuilder>,
type_checker: Arc<TypeChecker>,
}
@ -269,13 +272,13 @@ pub struct ModuleGraphCreator {
impl ModuleGraphCreator {
pub fn new(
options: Arc<CliOptions>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_installer: Option<Arc<NpmInstaller>>,
module_graph_builder: Arc<ModuleGraphBuilder>,
type_checker: Arc<TypeChecker>,
) -> Self {
Self {
options,
npm_resolver,
npm_installer,
module_graph_builder,
type_checker,
}
@ -398,9 +401,9 @@ impl ModuleGraphCreator {
.build_graph_with_npm_resolution(&mut graph, options)
.await?;
if let Some(npm_resolver) = self.npm_resolver.as_managed() {
if let Some(npm_installer) = &self.npm_installer {
if graph.has_node_specifier && self.options.type_check_mode().is_true() {
npm_resolver.inject_synthetic_types_node_package().await?;
npm_installer.inject_synthetic_types_node_package().await?;
}
}
@ -434,14 +437,14 @@ impl ModuleGraphCreator {
}
}
pub fn graph_valid(&self, graph: &ModuleGraph) -> Result<(), AnyError> {
pub fn graph_valid(&self, graph: &ModuleGraph) -> Result<(), JsErrorBox> {
self.module_graph_builder.graph_valid(graph)
}
async fn type_check_graph(
&self,
graph: ModuleGraph,
) -> Result<Arc<ModuleGraph>, AnyError> {
) -> Result<Arc<ModuleGraph>, CheckError> {
self
.type_checker
.check(
@ -455,7 +458,6 @@ impl ModuleGraphCreator {
},
)
.await
.map_err(AnyError::from)
}
}
@ -465,17 +467,40 @@ pub struct BuildFastCheckGraphOptions<'a> {
pub workspace_fast_check: deno_graph::WorkspaceFastCheckOption<'a>,
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum BuildGraphWithNpmResolutionError {
#[class(inherit)]
#[error(transparent)]
SerdeJson(#[from] serde_json::Error),
#[class(inherit)]
#[error(transparent)]
ToMaybeJsxImportSourceConfig(
#[from] deno_json::ToMaybeJsxImportSourceConfigError,
),
#[class(inherit)]
#[error(transparent)]
NodeModulesDirParse(#[from] deno_json::NodeModulesDirParseError),
#[class(inherit)]
#[error(transparent)]
Other(#[from] JsErrorBox),
#[class(generic)]
#[error("Resolving npm specifier entrypoints this way is currently not supported with \"nodeModules\": \"manual\". In the meantime, try with --node-modules-dir=auto instead")]
UnsupportedNpmSpecifierEntrypointResolutionWay,
}
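The new enum leans on `thiserror`'s `#[error(transparent)]`, which forwards `Display` straight to the wrapped error. A std-only sketch of what that expands to, with a `String` standing in for the wrapped `serde_json::Error` and the variant names shortened for illustration:

```rust
use std::fmt;

// Illustrative mirror of BuildGraphWithNpmResolutionError; the String
// variant stands in for a wrapped error whose Display is forwarded.
#[derive(Debug)]
enum BuildGraphError {
    SerdeJson(String),
    UnsupportedNpmEntrypoint,
}

impl fmt::Display for BuildGraphError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            // What #[error(transparent)] generates: delegate to the inner error.
            BuildGraphError::SerdeJson(inner) => write!(f, "{inner}"),
            BuildGraphError::UnsupportedNpmEntrypoint => f.write_str(
                "Resolving npm specifier entrypoints this way is currently not supported",
            ),
        }
    }
}
```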
pub struct ModuleGraphBuilder {
caches: Arc<cache::Caches>,
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
cli_options: Arc<CliOptions>,
file_fetcher: Arc<CliFileFetcher>,
global_http_cache: Arc<GlobalHttpCache>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
lockfile: Option<Arc<CliLockfile>>,
maybe_file_watcher_reporter: Option<FileWatcherReporter>,
module_info_cache: Arc<ModuleInfoCache>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_graph_resolver: Arc<CliNpmGraphResolver>,
npm_installer: Option<Arc<NpmInstaller>>,
npm_resolver: CliNpmResolver,
parsed_source_cache: Arc<ParsedSourceCache>,
resolver: Arc<CliResolver>,
root_permissions_container: PermissionsContainer,
@ -486,15 +511,17 @@ impl ModuleGraphBuilder {
#[allow(clippy::too_many_arguments)]
pub fn new(
caches: Arc<cache::Caches>,
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
cli_options: Arc<CliOptions>,
file_fetcher: Arc<CliFileFetcher>,
global_http_cache: Arc<GlobalHttpCache>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
lockfile: Option<Arc<CliLockfile>>,
maybe_file_watcher_reporter: Option<FileWatcherReporter>,
module_info_cache: Arc<ModuleInfoCache>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_graph_resolver: Arc<CliNpmGraphResolver>,
npm_installer: Option<Arc<NpmInstaller>>,
npm_resolver: CliNpmResolver,
parsed_source_cache: Arc<ParsedSourceCache>,
resolver: Arc<CliResolver>,
root_permissions_container: PermissionsContainer,
@ -510,6 +537,8 @@ impl ModuleGraphBuilder {
lockfile,
maybe_file_watcher_reporter,
module_info_cache,
npm_graph_resolver,
npm_installer,
npm_resolver,
parsed_source_cache,
resolver,
@ -522,7 +551,7 @@ impl ModuleGraphBuilder {
&self,
graph: &mut ModuleGraph,
options: CreateGraphOptions<'a>,
) -> Result<(), AnyError> {
) -> Result<(), BuildGraphWithNpmResolutionError> {
enum MutLoaderRef<'a> {
Borrowed(&'a mut dyn Loader),
Owned(cache::FetchCacher),
@ -608,10 +637,7 @@ impl ModuleGraphBuilder {
Some(loader) => MutLoaderRef::Borrowed(loader),
None => MutLoaderRef::Owned(self.create_graph_loader()),
};
let cli_resolver = &self.resolver;
let graph_resolver = self.create_graph_resolver()?;
let graph_npm_resolver =
cli_resolver.create_graph_npm_resolver(options.npm_caching);
let maybe_file_watcher_reporter = self
.maybe_file_watcher_reporter
.as_ref()
@ -632,7 +658,7 @@ impl ModuleGraphBuilder {
executor: Default::default(),
file_system: &self.sys,
jsr_url_provider: &CliJsrUrlProvider,
npm_resolver: Some(&graph_npm_resolver),
npm_resolver: Some(self.npm_graph_resolver.as_ref()),
module_analyzer: &analyzer,
reporter: maybe_file_watcher_reporter,
resolver: Some(&graph_resolver),
@ -650,22 +676,21 @@ impl ModuleGraphBuilder {
loader: &'a mut dyn deno_graph::source::Loader,
options: deno_graph::BuildOptions<'a>,
npm_caching: NpmCachingStrategy,
) -> Result<(), AnyError> {
) -> Result<(), BuildGraphWithNpmResolutionError> {
// ensure an "npm install" is done if the user has explicitly
// opted into using a node_modules directory
if self
.cli_options
.node_modules_dir()?
.map(|m| m.uses_node_modules_dir())
.map(|m| m == NodeModulesDirMode::Auto)
.unwrap_or(false)
{
if let Some(npm_resolver) = self.npm_resolver.as_managed() {
let already_done =
npm_resolver.ensure_top_level_package_json_install().await?;
if !already_done && matches!(npm_caching, NpmCachingStrategy::Eager) {
npm_resolver
.cache_packages(crate::npm::PackageCaching::All)
if let Some(npm_installer) = &self.npm_installer {
let already_done = npm_installer
.ensure_top_level_package_json_install()
.await?;
if !already_done && matches!(npm_caching, NpmCachingStrategy::Eager) {
npm_installer.cache_packages(PackageCaching::All).await?;
}
}
}
@ -684,10 +709,9 @@ impl ModuleGraphBuilder {
let initial_package_deps_len = graph.packages.package_deps_sum();
let initial_package_mappings_len = graph.packages.mappings().len();
if roots.iter().any(|r| r.scheme() == "npm")
&& self.npm_resolver.as_byonm().is_some()
if roots.iter().any(|r| r.scheme() == "npm") && self.npm_resolver.is_byonm()
{
bail!("Resolving npm specifier entrypoints this way is currently not supported with \"nodeModules\": \"manual\". In the meantime, try with --node-modules-dir=auto instead");
return Err(BuildGraphWithNpmResolutionError::UnsupportedNpmSpecifierEntrypointResolutionWay);
}
graph.build(roots, loader, options).await;
@ -738,7 +762,7 @@ impl ModuleGraphBuilder {
&self,
graph: &mut ModuleGraph,
options: BuildFastCheckGraphOptions,
) -> Result<(), AnyError> {
) -> Result<(), deno_json::ToMaybeJsxImportSourceConfigError> {
if !graph.graph_kind().include_types() {
return Ok(());
}
@ -753,11 +777,7 @@ impl ModuleGraphBuilder {
None
};
let parser = self.parsed_source_cache.as_capturing_parser();
let cli_resolver = &self.resolver;
let graph_resolver = self.create_graph_resolver()?;
let graph_npm_resolver = cli_resolver.create_graph_npm_resolver(
self.cli_options.default_npm_caching_strategy(),
);
graph.build_fast_check_type_graph(
deno_graph::BuildFastCheckTypeGraphOptions {
@ -766,7 +786,7 @@ impl ModuleGraphBuilder {
fast_check_dts: false,
jsr_url_provider: &CliJsrUrlProvider,
resolver: Some(&graph_resolver),
npm_resolver: Some(&graph_npm_resolver),
npm_resolver: Some(self.npm_graph_resolver.as_ref()),
workspace_fast_check: options.workspace_fast_check,
},
);
@ -802,7 +822,7 @@ impl ModuleGraphBuilder {
/// Check if `roots` and their deps are available. Returns `Ok(())` if
/// so. Returns `Err(_)` if there is a known module graph or resolution
/// error statically reachable from `roots` and not a dynamic import.
pub fn graph_valid(&self, graph: &ModuleGraph) -> Result<(), AnyError> {
pub fn graph_valid(&self, graph: &ModuleGraph) -> Result<(), JsErrorBox> {
self.graph_roots_valid(
graph,
&graph.roots.iter().cloned().collect::<Vec<_>>(),
@ -813,7 +833,7 @@ impl ModuleGraphBuilder {
&self,
graph: &ModuleGraph,
roots: &[ModuleSpecifier],
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
graph_valid(
graph,
&self.sys,
@ -830,7 +850,10 @@ impl ModuleGraphBuilder {
)
}
fn create_graph_resolver(&self) -> Result<CliGraphResolver, AnyError> {
fn create_graph_resolver(
&self,
) -> Result<CliGraphResolver, deno_json::ToMaybeJsxImportSourceConfigError>
{
let jsx_import_source_config = self
.cli_options
.start_dir
@ -998,9 +1021,11 @@ fn get_resolution_error_bare_specifier(
{
Some(specifier.as_str())
} else if let ResolutionError::ResolverError { error, .. } = error {
if let ResolveError::Other(error) = (*error).as_ref() {
if let Some(ImportMapError::UnmappedBareSpecifier(specifier, _)) =
error.downcast_ref::<ImportMapError>()
if let ResolveError::ImportMap(error) = (*error).as_ref() {
if let import_map::ImportMapErrorKind::UnmappedBareSpecifier(
specifier,
_,
) = error.as_kind()
{
Some(specifier.as_str())
} else {
@ -1037,11 +1062,12 @@ fn get_import_prefix_missing_error(error: &ResolutionError) -> Option<&str> {
ResolveError::Other(other_error) => {
if let Some(SpecifierError::ImportPrefixMissing {
specifier, ..
}) = other_error.downcast_ref::<SpecifierError>()
}) = other_error.as_any().downcast_ref::<SpecifierError>()
{
maybe_specifier = Some(specifier);
}
}
ResolveError::ImportMap(_) => {}
}
}
}
@ -1165,7 +1191,7 @@ fn format_deno_graph_error(err: &dyn Error) -> String {
#[derive(Debug)]
struct CliGraphResolver<'a> {
cjs_tracker: &'a CjsTracker,
cjs_tracker: &'a CliCjsTracker,
resolver: &'a CliResolver,
jsx_import_source_config: Option<JsxImportSourceConfig>,
}
@ -1261,7 +1287,7 @@ mod test {
let specifier = ModuleSpecifier::parse("file:///file.ts").unwrap();
let err = import_map.resolve(input, &specifier).err().unwrap();
let err = ResolutionError::ResolverError {
error: Arc::new(ResolveError::Other(err.into())),
error: Arc::new(ResolveError::ImportMap(err)),
specifier: input.to_string(),
range: Range {
specifier,


@ -6,13 +6,15 @@ use std::thread::ThreadId;
use boxed_error::Boxed;
use deno_cache_dir::file_fetcher::RedirectHeaderParseError;
use deno_core::error::custom_error;
use deno_core::error::AnyError;
use deno_core::futures::StreamExt;
use deno_core::parking_lot::Mutex;
use deno_core::serde;
use deno_core::serde_json;
use deno_core::url::Url;
use deno_error::JsError;
use deno_error::JsErrorBox;
use deno_lib::version::DENO_VERSION_INFO;
use deno_runtime::deno_fetch;
use deno_runtime::deno_fetch::create_http_client;
use deno_runtime::deno_fetch::CreateHttpClientOptions;
@ -27,7 +29,6 @@ use http_body_util::BodyExt;
use thiserror::Error;
use crate::util::progress_bar::UpdateGuard;
use crate::version;
#[derive(Debug, Error)]
pub enum SendError {
@ -69,7 +70,7 @@ impl HttpClientProvider {
}
}
pub fn get_or_create(&self) -> Result<HttpClient, AnyError> {
pub fn get_or_create(&self) -> Result<HttpClient, JsErrorBox> {
use std::collections::hash_map::Entry;
let thread_id = std::thread::current().id();
let mut clients = self.clients_by_thread_id.lock();
@ -78,7 +79,7 @@ impl HttpClientProvider {
Entry::Occupied(entry) => Ok(HttpClient::new(entry.get().clone())),
Entry::Vacant(entry) => {
let client = create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions {
root_cert_store: match &self.root_cert_store_provider {
Some(provider) => Some(provider.get_or_try_init()?.clone()),
@ -86,7 +87,8 @@ impl HttpClientProvider {
},
..self.options.clone()
},
)?;
)
.map_err(JsErrorBox::from_err)?;
entry.insert(client.clone());
Ok(HttpClient::new(client))
}
@ -94,34 +96,49 @@ impl HttpClientProvider {
}
}
#[derive(Debug, Error)]
#[derive(Debug, Error, JsError)]
#[class(type)]
#[error("Bad response: {:?}{}", .status_code, .response_text.as_ref().map(|s| format!("\n\n{}", s)).unwrap_or_else(String::new))]
pub struct BadResponseError {
pub status_code: StatusCode,
pub response_text: Option<String>,
}
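The `#[error(...)]` attribute above expands to a `Display` impl. A std-only sketch of the same format string, with a `u16` standing in for `http::StatusCode` so it stays dependency-free:

```rust
use std::fmt;

// u16 stands in for http::StatusCode; field names match the struct above.
struct BadResponse {
    status_code: u16,
    response_text: Option<String>,
}

impl fmt::Display for BadResponse {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Same shape as the #[error(...)] format string: the response body,
        // when present, is appended after a blank line.
        write!(
            f,
            "Bad response: {:?}{}",
            self.status_code,
            self.response_text
                .as_ref()
                .map(|s| format!("\n\n{}", s))
                .unwrap_or_else(String::new)
        )
    }
}
```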
#[derive(Debug, Boxed)]
#[derive(Debug, Boxed, JsError)]
pub struct DownloadError(pub Box<DownloadErrorKind>);
#[derive(Debug, Error)]
#[derive(Debug, Error, JsError)]
pub enum DownloadErrorKind {
#[class(inherit)]
#[error(transparent)]
Fetch(AnyError),
Fetch(deno_fetch::ClientSendError),
#[class(inherit)]
#[error(transparent)]
UrlParse(#[from] deno_core::url::ParseError),
#[class(generic)]
#[error(transparent)]
HttpParse(#[from] http::Error),
#[class(inherit)]
#[error(transparent)]
Json(#[from] serde_json::Error),
#[class(generic)]
#[error(transparent)]
ToStr(#[from] http::header::ToStrError),
#[class(inherit)]
#[error(transparent)]
RedirectHeaderParse(RedirectHeaderParseError),
#[class(type)]
#[error("Too many redirects.")]
TooManyRedirects,
#[class(inherit)]
#[error(transparent)]
BadResponse(#[from] BadResponseError),
#[class("Http")]
#[error("Not Found.")]
NotFound,
#[class(inherit)]
#[error(transparent)]
Other(JsErrorBox),
}
#[derive(Debug)]
@ -208,11 +225,11 @@ impl HttpClient {
Ok(String::from_utf8(bytes)?)
}
pub async fn download(&self, url: Url) -> Result<Vec<u8>, AnyError> {
pub async fn download(&self, url: Url) -> Result<Vec<u8>, DownloadError> {
let maybe_bytes = self.download_inner(url, None, None).await?;
match maybe_bytes {
Some(bytes) => Ok(bytes),
None => Err(custom_error("Http", "Not found.")),
None => Err(DownloadErrorKind::NotFound.into_box()),
}
}
@ -276,7 +293,7 @@ impl HttpClient {
get_response_body_with_progress(response, progress_guard)
.await
.map(|(_, body)| Some(body))
.map_err(|err| DownloadErrorKind::Fetch(err).into_box())
.map_err(|err| DownloadErrorKind::Other(err).into_box())
}
async fn get_redirected_response(
@ -293,7 +310,7 @@ impl HttpClient {
.clone()
.send(req)
.await
.map_err(|e| DownloadErrorKind::Fetch(e.into()).into_box())?;
.map_err(|e| DownloadErrorKind::Fetch(e).into_box())?;
let status = response.status();
if status.is_redirection() {
for _ in 0..5 {
@ -313,7 +330,7 @@ impl HttpClient {
.clone()
.send(req)
.await
.map_err(|e| DownloadErrorKind::Fetch(e.into()).into_box())?;
.map_err(|e| DownloadErrorKind::Fetch(e).into_box())?;
let status = new_response.status();
if status.is_redirection() {
response = new_response;
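The control flow of `get_redirected_response` above is: one initial send, then at most five follow-ups before failing with `TooManyRedirects`. A minimal sketch of that bound, with a closure standing in for the HTTP client:

```rust
// Closure-based stand-in for the HTTP client: each call either yields a
// final body or the next location to follow.
enum Resp {
    Done(String),
    Redirect(String),
}

// Mirrors the shape above: one initial request, then at most five
// follow-ups before giving up with a "Too many redirects." error.
fn get_redirected(
    mut url: String,
    mut send: impl FnMut(&str) -> Resp,
) -> Result<String, &'static str> {
    match send(&url) {
        Resp::Done(body) => return Ok(body),
        Resp::Redirect(next) => url = next,
    }
    for _ in 0..5 {
        match send(&url) {
            Resp::Done(body) => return Ok(body),
            Resp::Redirect(next) => url = next,
        }
    }
    Err("Too many redirects.")
}
```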
@ -332,7 +349,7 @@ impl HttpClient {
pub async fn get_response_body_with_progress(
response: http::Response<deno_fetch::ResBody>,
progress_guard: Option<&UpdateGuard>,
) -> Result<(HeaderMap, Vec<u8>), AnyError> {
) -> Result<(HeaderMap, Vec<u8>), JsErrorBox> {
use http_body::Body as _;
if let Some(progress_guard) = progress_guard {
let mut total_size = response.body().size_hint().exact();
@ -464,7 +481,7 @@ mod test {
let client = HttpClient::new(
create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions {
ca_certs: vec![std::fs::read(
test_util::testdata_path().join("tls/RootCA.pem"),
@ -508,7 +525,7 @@ mod test {
let client = HttpClient::new(
create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions::default(),
)
.unwrap(),
@ -549,7 +566,7 @@ mod test {
let client = HttpClient::new(
create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions {
root_cert_store: Some(root_cert_store),
..Default::default()
@ -570,7 +587,7 @@ mod test {
.unwrap();
let client = HttpClient::new(
create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions {
ca_certs: vec![std::fs::read(
test_util::testdata_path()
@ -603,7 +620,7 @@ mod test {
let url = Url::parse("https://localhost:5545/etag_script.ts").unwrap();
let client = HttpClient::new(
create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions {
ca_certs: vec![std::fs::read(
test_util::testdata_path()
@ -644,7 +661,7 @@ mod test {
.unwrap();
let client = HttpClient::new(
create_http_client(
version::DENO_VERSION_INFO.user_agent,
DENO_VERSION_INFO.user_agent,
CreateHttpClientOptions {
ca_certs: vec![std::fs::read(
test_util::testdata_path()


@ -1,18 +1,5 @@
// Copyright 2018-2025 the Deno authors. MIT license.
pub fn main() {
let mut args = vec!["cargo", "test", "-p", "cli_tests", "--features", "run"];
if !cfg!(debug_assertions) {
args.push("--release");
}
args.push("--");
// If any args were passed to this process, pass them through to the child
let orig_args = std::env::args().skip(1).collect::<Vec<_>>();
let orig_args: Vec<&str> =
orig_args.iter().map(|x| x.as_ref()).collect::<Vec<_>>();
args.extend(orig_args);
test_util::spawn::exec_replace("cargo", &args).unwrap();
// this file exists to cause the executable to be built when running cargo test
}


@ -2,18 +2,7 @@
use log::debug;
#[cfg(not(feature = "hmr"))]
static CLI_SNAPSHOT: &[u8] =
include_bytes!(concat!(env!("OUT_DIR"), "/CLI_SNAPSHOT.bin"));
pub fn deno_isolate_init() -> Option<&'static [u8]> {
debug!("Deno isolate init with snapshots.");
#[cfg(not(feature = "hmr"))]
{
Some(CLI_SNAPSHOT)
}
#[cfg(feature = "hmr")]
{
None
}
deno_snapshots::CLI_SNAPSHOT
}


@ -8,7 +8,7 @@ import {
restorePermissions,
} from "ext:cli/40_test_common.js";
import { Console } from "ext:deno_console/01_console.js";
import { setExitHandler } from "ext:runtime/30_os.js";
import { setExitHandler } from "ext:deno_os/30_os.js";
const {
op_register_bench,
op_bench_get_origin,

File diff suppressed because it is too large

@ -406,8 +406,9 @@ export function splitSelectors(input) {
}
}
if (last < input.length - 1) {
out.push(input.slice(last).trim());
const remaining = input.slice(last).trim();
if (remaining.length > 0) {
out.push(remaining);
}
return out;
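The change above stops `splitSelectors` from pushing an empty trailing segment (e.g. for inputs ending in `,`). A simplified Rust sketch of just that trailing-segment handling — the real function is nesting-aware, so the plain `split(',')` here is only illustrative:

```rust
// Simplified sketch: a trailing remainder that trims to "" is dropped
// instead of being pushed. Plain split(',') stands in for the real
// nesting-aware scan in splitSelectors.
fn split_selectors(input: &str) -> Vec<String> {
    let mut out: Vec<String> =
        input.split(',').map(|s| s.trim().to_string()).collect();
    if out.last().map(|s| s.is_empty()).unwrap_or(false) {
        out.pop();
    }
    out
}
```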
@ -743,8 +744,7 @@ export function compileSelector(selector) {
fn = matchNthChild(node, fn);
break;
case PSEUDO_HAS:
// FIXME
// fn = matchIs(part, fn);
// TODO(@marvinhagemeister)
throw new Error("TODO: :has");
case PSEUDO_NOT:
fn = matchNot(node.selectors, fn);
@ -766,8 +766,7 @@ export function compileSelector(selector) {
*/
function matchFirstChild(next) {
return (ctx, id) => {
const parent = ctx.getParent(id);
const first = ctx.getFirstChild(parent);
const first = ctx.getFirstChild(id);
return first === id && next(ctx, first);
};
}
@ -778,8 +777,7 @@ function matchFirstChild(next) {
*/
function matchLastChild(next) {
return (ctx, id) => {
const parent = ctx.getParent(id);
const last = ctx.getLastChild(parent);
const last = ctx.getLastChild(id);
return last === id && next(ctx, id);
};
}
@ -954,7 +952,9 @@ function matchElem(part, next) {
else if (part.elem === 0) return false;
const type = ctx.getType(id);
if (type > 0 && type === part.elem) return next(ctx, id);
if (type > 0 && type === part.elem) {
return next(ctx, id);
}
return false;
};
@ -967,7 +967,16 @@ function matchElem(part, next) {
*/
function matchAttrExists(attr, next) {
return (ctx, id) => {
return ctx.hasAttrPath(id, attr.prop, 0) ? next(ctx, id) : false;
try {
ctx.getAttrPathValue(id, attr.prop, 0);
return next(ctx, id);
} catch (err) {
if (err === -1) {
return false;
}
throw err;
}
};
}
@ -978,9 +987,15 @@ function matchAttrExists(attr, next) {
*/
function matchAttrBin(attr, next) {
return (ctx, id) => {
if (!ctx.hasAttrPath(id, attr.prop, 0)) return false;
try {
const value = ctx.getAttrPathValue(id, attr.prop, 0);
if (!matchAttrValue(attr, value)) return false;
} catch (err) {
if (err === -1) {
return false;
}
throw err;
}
return next(ctx, id);
};
}
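Both matchers above replace the removed `hasAttrPath` with a call to `getAttrPathValue`, treating the thrown sentinel `-1` as "attribute absent" while re-throwing anything else. A Rust sketch of that pattern, with a `Result` standing in for the JS throw (names are illustrative):

```rust
// `get` is a stand-in for getAttrPathValue: Err(-1) is the "attribute
// missing" sentinel; any other error is a real failure.
fn attr_exists(
    get: impl Fn(&str) -> Result<String, i32>,
    prop: &str,
) -> Result<bool, i32> {
    match get(prop) {
        Ok(_) => Ok(true),        // attribute present
        Err(-1) => Ok(false),     // sentinel: a non-match, not an error
        Err(other) => Err(other), // real errors still propagate
    }
}
```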


@ -12,6 +12,8 @@ export interface AstContext {
strTableOffset: number;
rootOffset: number;
nodes: Map<number, NodeFacade>;
spansOffset: number;
propsOffset: number;
strByType: number[];
strByProp: number[];
typeByStr: Map<string, number>;
@ -19,6 +21,12 @@ export interface AstContext {
matcher: MatchContext;
}
export interface Node {
range: Range;
}
export type Range = [number, number];
// TODO(@marvinhagemeister) Remove once we land "official" types
export interface RuleContext {
id: string;
@ -121,7 +129,6 @@ export interface MatchContext {
getSiblings(id: number): number[];
getParent(id: number): number;
getType(id: number): number;
hasAttrPath(id: number, propIds: number[], idx: number): boolean;
getAttrPathValue(id: number, propIds: number[], idx: number): unknown;
}


@ -26,7 +26,7 @@ const {
TypeError,
} = primordials;
import { setExitHandler } from "ext:runtime/30_os.js";
import { setExitHandler } from "ext:deno_os/30_os.js";
// Capture `Deno` global so that users deleting or mangling it, won't
// have impact on our sanitizers.

cli/lib/Cargo.toml (new file, 46 lines)

@ -0,0 +1,46 @@
# Copyright 2018-2025 the Deno authors. MIT license.
[package]
name = "deno_lib"
version = "0.2.0"
authors.workspace = true
edition.workspace = true
license.workspace = true
readme = "README.md"
repository.workspace = true
description = "Shared code between the Deno CLI and denort"
[lib]
path = "lib.rs"
[dependencies]
capacity_builder.workspace = true
deno_config.workspace = true
deno_error.workspace = true
deno_fs = { workspace = true, features = ["sync_fs"] }
deno_media_type.workspace = true
deno_node = { workspace = true, features = ["sync_fs"] }
deno_npm.workspace = true
deno_path_util.workspace = true
deno_resolver = { workspace = true, features = ["sync"] }
deno_runtime.workspace = true
deno_semver.workspace = true
deno_terminal.workspace = true
env_logger = "=0.10.0"
faster-hex.workspace = true
indexmap.workspace = true
libsui.workspace = true
log.workspace = true
node_resolver = { workspace = true, features = ["sync"] }
parking_lot.workspace = true
ring.workspace = true
serde = { workspace = true, features = ["derive"] }
serde_json.workspace = true
sys_traits = { workspace = true, features = ["getrandom"] }
thiserror.workspace = true
tokio.workspace = true
twox-hash.workspace = true
url.workspace = true
[dev-dependencies]
test_util.workspace = true

cli/lib/README.md (new file, 4 lines)

@ -0,0 +1,4 @@
# deno_lib
This crate contains the shared code between the Deno CLI and denort. It is
highly unstable.

cli/lib/args.rs (new file, 199 lines)

@ -0,0 +1,199 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::io::BufReader;
use std::io::Cursor;
use std::io::Read;
use std::io::Seek;
use std::path::PathBuf;
use std::sync::LazyLock;
use deno_runtime::colors;
use deno_runtime::deno_tls::deno_native_certs::load_native_certs;
use deno_runtime::deno_tls::rustls;
use deno_runtime::deno_tls::rustls::RootCertStore;
use deno_runtime::deno_tls::rustls_pemfile;
use deno_runtime::deno_tls::webpki_roots;
use deno_semver::npm::NpmPackageReqReference;
use serde::Deserialize;
use serde::Serialize;
use thiserror::Error;
pub fn npm_pkg_req_ref_to_binary_command(
req_ref: &NpmPackageReqReference,
) -> String {
req_ref
.sub_path()
.map(|s| s.to_string())
.unwrap_or_else(|| req_ref.req().name.to_string())
}
pub fn has_trace_permissions_enabled() -> bool {
has_flag_env_var("DENO_TRACE_PERMISSIONS")
}
pub fn has_flag_env_var(name: &str) -> bool {
let value = std::env::var(name);
matches!(value.as_ref().map(|s| s.as_str()), Ok("1"))
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum CaData {
/// The string is a file path
File(String),
/// This variant is not exposed as an option in the CLI, it is used internally
/// for standalone binaries.
Bytes(Vec<u8>),
}
#[derive(Error, Debug, Clone, deno_error::JsError)]
#[class(generic)]
pub enum RootCertStoreLoadError {
#[error(
"Unknown certificate store \"{0}\" specified (allowed: \"system,mozilla\")"
)]
UnknownStore(String),
#[error("Unable to add pem file to certificate store: {0}")]
FailedAddPemFile(String),
#[error("Failed opening CA file: {0}")]
CaFileOpenError(String),
}
/// Create and populate a root cert store based on the passed options and
/// environment.
pub fn get_root_cert_store(
maybe_root_path: Option<PathBuf>,
maybe_ca_stores: Option<Vec<String>>,
maybe_ca_data: Option<CaData>,
) -> Result<RootCertStore, RootCertStoreLoadError> {
let mut root_cert_store = RootCertStore::empty();
let ca_stores: Vec<String> = maybe_ca_stores
.or_else(|| {
let env_ca_store = std::env::var("DENO_TLS_CA_STORE").ok()?;
Some(
env_ca_store
.split(',')
.map(|s| s.trim().to_string())
.filter(|s| !s.is_empty())
.collect(),
)
})
.unwrap_or_else(|| vec!["mozilla".to_string()]);
for store in ca_stores.iter() {
match store.as_str() {
"mozilla" => {
root_cert_store.extend(webpki_roots::TLS_SERVER_ROOTS.to_vec());
}
"system" => {
let roots = load_native_certs().expect("could not load platform certs");
for root in roots {
if let Err(err) = root_cert_store
.add(rustls::pki_types::CertificateDer::from(root.0.clone()))
{
log::error!(
"{}",
colors::yellow(&format!(
"Unable to add system certificate to certificate store: {:?}",
err
))
);
let hex_encoded_root = faster_hex::hex_string(&root.0);
log::error!("{}", colors::gray(&hex_encoded_root));
}
}
}
_ => {
return Err(RootCertStoreLoadError::UnknownStore(store.clone()));
}
}
}
let ca_data =
maybe_ca_data.or_else(|| std::env::var("DENO_CERT").ok().map(CaData::File));
if let Some(ca_data) = ca_data {
let result = match ca_data {
CaData::File(ca_file) => {
let ca_file = if let Some(root) = &maybe_root_path {
root.join(&ca_file)
} else {
PathBuf::from(ca_file)
};
let certfile = std::fs::File::open(ca_file).map_err(|err| {
RootCertStoreLoadError::CaFileOpenError(err.to_string())
})?;
let mut reader = BufReader::new(certfile);
rustls_pemfile::certs(&mut reader).collect::<Result<Vec<_>, _>>()
}
CaData::Bytes(data) => {
let mut reader = BufReader::new(Cursor::new(data));
rustls_pemfile::certs(&mut reader).collect::<Result<Vec<_>, _>>()
}
};
match result {
Ok(certs) => {
root_cert_store.add_parsable_certificates(certs);
}
Err(e) => {
return Err(RootCertStoreLoadError::FailedAddPemFile(e.to_string()));
}
}
}
Ok(root_cert_store)
}
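The CA-store selection at the top of `get_root_cert_store` can be factored into a pure function for illustration — the `DENO_TLS_CA_STORE` env lookup is replaced by a parameter so the precedence is easy to exercise:

```rust
// Precedence mirror: an explicit list wins, then a comma-separated
// DENO_TLS_CA_STORE-style value (trimmed, empties dropped), then the
// "mozilla" default.
fn resolve_ca_stores(
    explicit: Option<Vec<String>>,
    env_value: Option<&str>,
) -> Vec<String> {
    explicit
        .or_else(|| {
            let raw = env_value?;
            Some(
                raw.split(',')
                    .map(|s| s.trim().to_string())
                    .filter(|s| !s.is_empty())
                    .collect(),
            )
        })
        .unwrap_or_else(|| vec!["mozilla".to_string()])
}
```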
/// State provided to the process via an environment variable.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct NpmProcessState {
pub kind: NpmProcessStateKind,
pub local_node_modules_path: Option<String>,
}
#[derive(Clone, Debug, Serialize, Deserialize)]
pub enum NpmProcessStateKind {
Snapshot(deno_npm::resolution::SerializedNpmResolutionSnapshot),
Byonm,
}
pub static NPM_PROCESS_STATE: LazyLock<Option<NpmProcessState>> =
LazyLock::new(|| {
use deno_runtime::deno_process::NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME;
let fd = std::env::var(NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME).ok()?;
std::env::remove_var(NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME);
let fd = fd.parse::<usize>().ok()?;
let mut file = {
use deno_runtime::deno_io::FromRawIoHandle;
unsafe { std::fs::File::from_raw_io_handle(fd as _) }
};
let mut buf = Vec::new();
// seek to beginning. after the file is written the position will be inherited by this subprocess,
// and also this file might have been read before
file.seek(std::io::SeekFrom::Start(0)).unwrap();
file
.read_to_end(&mut buf)
.inspect_err(|e| {
log::error!("failed to read npm process state from fd {fd}: {e}");
})
.ok()?;
let state: NpmProcessState = serde_json::from_slice(&buf)
.inspect_err(|e| {
log::error!(
"failed to deserialize npm process state: {e} {}",
String::from_utf8_lossy(&buf)
)
})
.ok()?;
Some(state)
});
#[derive(Clone, Default, Debug, Eq, PartialEq, Serialize, Deserialize)]
pub struct UnstableConfig {
// TODO(bartlomieju): remove in Deno 2.5
pub legacy_flag_enabled: bool, // --unstable
pub bare_node_builtins: bool,
pub detect_cjs: bool,
pub sloppy_imports: bool,
pub npm_lazy_caching: bool,
pub features: Vec<String>, // --unstable-kv --unstable-cron
}

cli/lib/build.rs (new file, 42 lines)

@ -0,0 +1,42 @@
// Copyright 2018-2025 the Deno authors. MIT license.
fn main() {
// todo(dsherret): remove this after Deno 2.2.0 is published and then
// align the version of this crate with Deno then. We need to wait because
// there was previously a deno_lib 2.2.0 published (https://crates.io/crates/deno_lib/versions)
let version_path = std::path::Path::new(".").join("version.txt");
println!("cargo:rerun-if-changed={}", version_path.display());
#[allow(clippy::disallowed_methods)]
let text = std::fs::read_to_string(version_path).unwrap();
println!("cargo:rustc-env=DENO_VERSION={}", text);
let commit_hash = git_commit_hash();
println!("cargo:rustc-env=GIT_COMMIT_HASH={}", commit_hash);
println!("cargo:rerun-if-env-changed=GIT_COMMIT_HASH");
println!(
"cargo:rustc-env=GIT_COMMIT_HASH_SHORT={}",
&commit_hash[..7]
);
}
fn git_commit_hash() -> String {
if let Ok(output) = std::process::Command::new("git")
.arg("rev-list")
.arg("-1")
.arg("HEAD")
.output()
{
if output.status.success() {
std::str::from_utf8(&output.stdout[..40])
.unwrap()
.to_string()
} else {
// When not in git repository
// (e.g. when the user install by `cargo install deno`)
"UNKNOWN".to_string()
}
} else {
// When there is no git command for some reason
"UNKNOWN".to_string()
}
}

cli/lib/clippy.toml
@@ -0,0 +1,48 @@
disallowed-methods = [
{ path = "std::env::current_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::canonicalize", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::is_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::is_file", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::is_symlink", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::metadata", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::read_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::read_link", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::symlink_metadata", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::try_exists", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::exists", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::canonicalize", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::is_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::is_file", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::is_symlink", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::metadata", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::read_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::read_link", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::symlink_metadata", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::PathBuf::try_exists", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::env::set_current_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::env::temp_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::canonicalize", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::copy", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::create_dir_all", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::create_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::DirBuilder::new", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::hard_link", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::metadata", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::OpenOptions::new", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::read_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::read_link", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::read_to_string", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::read", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::remove_dir_all", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::remove_dir", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::remove_file", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::rename", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::set_permissions", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::symlink_metadata", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::fs::write", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::canonicalize", reason = "File system operations should be done using DenoLibSys" },
{ path = "std::path::Path::exists", reason = "File system operations should be done using DenoLibSys" },
{ path = "url::Url::to_file_path", reason = "Use deno_path_util instead" },
{ path = "url::Url::from_file_path", reason = "Use deno_path_util instead" },
{ path = "url::Url::from_directory_path", reason = "Use deno_path_util instead" },
]

cli/lib/lib.rs
@@ -0,0 +1,11 @@
// Copyright 2018-2025 the Deno authors. MIT license.
pub mod args;
pub mod loader;
pub mod npm;
pub mod shared;
pub mod standalone;
pub mod sys;
pub mod util;
pub mod version;
pub mod worker;

cli/lib/loader.rs
@@ -0,0 +1,213 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::path::PathBuf;
use std::sync::Arc;
use deno_media_type::MediaType;
use deno_resolver::cjs::CjsTracker;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_runtime::deno_core::ModuleSourceCode;
use node_resolver::analyze::CjsCodeAnalyzer;
use node_resolver::analyze::NodeCodeTranslator;
use node_resolver::InNpmPackageChecker;
use node_resolver::IsBuiltInNodeModuleChecker;
use node_resolver::NpmPackageFolderResolver;
use thiserror::Error;
use url::Url;
use crate::sys::DenoLibSys;
use crate::util::text_encoding::from_utf8_lossy_cow;
pub struct ModuleCodeStringSource {
pub code: ModuleSourceCode,
pub found_url: Url,
pub media_type: MediaType,
}
#[derive(Debug, Error, deno_error::JsError)]
#[class(type)]
#[error("{media_type} files are not supported in npm packages: {specifier}")]
pub struct NotSupportedKindInNpmError {
pub media_type: MediaType,
pub specifier: Url,
}
#[derive(Debug, Error, deno_error::JsError)]
pub enum NpmModuleLoadError {
#[class(inherit)]
#[error(transparent)]
UrlToFilePath(#[from] deno_path_util::UrlToFilePathError),
#[class(inherit)]
#[error(transparent)]
NotSupportedKindInNpm(#[from] NotSupportedKindInNpmError),
#[class(inherit)]
#[error(transparent)]
ClosestPkgJson(#[from] node_resolver::errors::ClosestPkgJsonError),
#[class(inherit)]
#[error(transparent)]
TranslateCjsToEsm(#[from] node_resolver::analyze::TranslateCjsToEsmError),
#[class(inherit)]
#[error("Unable to load {}{}", file_path.display(), maybe_referrer.as_ref().map(|r| format!(" imported from {}", r)).unwrap_or_default())]
UnableToLoad {
file_path: PathBuf,
maybe_referrer: Option<Url>,
#[source]
#[inherit]
source: std::io::Error,
},
#[class(inherit)]
#[error(
"{}",
format_dir_import_message(file_path, maybe_referrer, suggestion)
)]
DirImport {
file_path: PathBuf,
maybe_referrer: Option<Url>,
suggestion: Option<&'static str>,
#[source]
#[inherit]
source: std::io::Error,
},
}
fn format_dir_import_message(
file_path: &std::path::Path,
maybe_referrer: &Option<Url>,
suggestion: &Option<&'static str>,
) -> String {
// directory imports are not allowed when importing from an
// ES module, so provide the user with a helpful error message
let dir_path = file_path;
let mut msg = "Directory import ".to_string();
msg.push_str(&dir_path.to_string_lossy());
if let Some(referrer) = maybe_referrer {
msg.push_str(" is not supported resolving import from ");
msg.push_str(referrer.as_str());
if let Some(entrypoint_name) = suggestion {
msg.push_str("\nDid you mean to import ");
msg.push_str(entrypoint_name);
msg.push_str(" within the directory?");
}
}
msg
}
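For reference, the message built by `format_dir_import_message` can be exercised with a standalone sketch; the paths and suggestion below are hypothetical examples, not values from the PR:

```rust
// Standalone sketch mirroring format_dir_import_message above;
// the paths and the "index.js" suggestion are hypothetical.
fn dir_import_message(
    dir_path: &str,
    referrer: Option<&str>,
    suggestion: Option<&str>,
) -> String {
    let mut msg = format!("Directory import {}", dir_path);
    if let Some(referrer) = referrer {
        msg.push_str(" is not supported resolving import from ");
        msg.push_str(referrer);
        if let Some(entrypoint_name) = suggestion {
            msg.push_str("\nDid you mean to import ");
            msg.push_str(entrypoint_name);
            msg.push_str(" within the directory?");
        }
    }
    msg
}

fn main() {
    let msg = dir_import_message(
        "/app/node_modules/pkg",
        Some("file:///app/main.ts"),
        Some("index.js"),
    );
    println!("{msg}");
}
```

Note that without a referrer the message stops after the directory path, matching the early return of the `if let` above.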
#[derive(Clone)]
pub struct NpmModuleLoader<
TCjsCodeAnalyzer: CjsCodeAnalyzer,
TInNpmPackageChecker: InNpmPackageChecker,
TIsBuiltInNodeModuleChecker: IsBuiltInNodeModuleChecker,
TNpmPackageFolderResolver: NpmPackageFolderResolver,
TSys: DenoLibSys,
> {
cjs_tracker: Arc<CjsTracker<DenoInNpmPackageChecker, TSys>>,
sys: TSys,
node_code_translator: Arc<
NodeCodeTranslator<
TCjsCodeAnalyzer,
TInNpmPackageChecker,
TIsBuiltInNodeModuleChecker,
TNpmPackageFolderResolver,
TSys,
>,
>,
}
impl<
TCjsCodeAnalyzer: CjsCodeAnalyzer,
TInNpmPackageChecker: InNpmPackageChecker,
TIsBuiltInNodeModuleChecker: IsBuiltInNodeModuleChecker,
TNpmPackageFolderResolver: NpmPackageFolderResolver,
TSys: DenoLibSys,
>
NpmModuleLoader<
TCjsCodeAnalyzer,
TInNpmPackageChecker,
TIsBuiltInNodeModuleChecker,
TNpmPackageFolderResolver,
TSys,
>
{
pub fn new(
cjs_tracker: Arc<CjsTracker<DenoInNpmPackageChecker, TSys>>,
node_code_translator: Arc<
NodeCodeTranslator<
TCjsCodeAnalyzer,
TInNpmPackageChecker,
TIsBuiltInNodeModuleChecker,
TNpmPackageFolderResolver,
TSys,
>,
>,
sys: TSys,
) -> Self {
Self {
cjs_tracker,
node_code_translator,
sys,
}
}
pub async fn load(
&self,
specifier: &Url,
maybe_referrer: Option<&Url>,
) -> Result<ModuleCodeStringSource, NpmModuleLoadError> {
let file_path = deno_path_util::url_to_file_path(specifier)?;
let code = self.sys.fs_read(&file_path).map_err(|source| {
if self.sys.fs_is_dir_no_err(&file_path) {
let suggestion = ["index.mjs", "index.js", "index.cjs"]
.into_iter()
.find(|e| self.sys.fs_is_file_no_err(file_path.join(e)));
NpmModuleLoadError::DirImport {
file_path,
maybe_referrer: maybe_referrer.cloned(),
suggestion,
source,
}
} else {
NpmModuleLoadError::UnableToLoad {
file_path,
maybe_referrer: maybe_referrer.cloned(),
source,
}
}
})?;
let media_type = MediaType::from_specifier(specifier);
if media_type.is_emittable() {
return Err(NpmModuleLoadError::NotSupportedKindInNpm(
NotSupportedKindInNpmError {
media_type,
specifier: specifier.clone(),
},
));
}
let code = if self.cjs_tracker.is_maybe_cjs(specifier, media_type)? {
// translate cjs to esm if it's cjs and inject node globals
let code = from_utf8_lossy_cow(code);
ModuleSourceCode::String(
self
.node_code_translator
.translate_cjs_to_esm(specifier, Some(code))
.await?
.into_owned()
.into(),
)
} else {
// esm and json code is untouched
ModuleSourceCode::Bytes(match code {
Cow::Owned(bytes) => bytes.into_boxed_slice().into(),
Cow::Borrowed(bytes) => bytes.into(),
})
};
Ok(ModuleCodeStringSource {
code,
found_url: specifier.clone(),
media_type: MediaType::from_specifier(specifier),
})
}
}

cli/lib/npm/mod.rs
@@ -0,0 +1,80 @@
// Copyright 2018-2025 the Deno authors. MIT license.
mod permission_checker;
use std::path::Path;
use std::sync::Arc;
use deno_npm::resolution::ValidSerializedNpmResolutionSnapshot;
use deno_resolver::npm::ByonmNpmResolver;
use deno_resolver::npm::ManagedNpmResolverRc;
use deno_resolver::npm::NpmResolver;
use deno_runtime::deno_process::NpmProcessStateProvider;
use deno_runtime::deno_process::NpmProcessStateProviderRc;
pub use permission_checker::NpmRegistryReadPermissionChecker;
pub use permission_checker::NpmRegistryReadPermissionCheckerMode;
use crate::args::NpmProcessState;
use crate::args::NpmProcessStateKind;
use crate::sys::DenoLibSys;
pub fn create_npm_process_state_provider<TSys: DenoLibSys>(
npm_resolver: &NpmResolver<TSys>,
) -> NpmProcessStateProviderRc {
match npm_resolver {
NpmResolver::Byonm(byonm_npm_resolver) => {
Arc::new(ByonmNpmProcessStateProvider(byonm_npm_resolver.clone()))
}
NpmResolver::Managed(managed_npm_resolver) => {
Arc::new(ManagedNpmProcessStateProvider(managed_npm_resolver.clone()))
}
}
}
pub fn npm_process_state(
snapshot: ValidSerializedNpmResolutionSnapshot,
node_modules_path: Option<&Path>,
) -> String {
serde_json::to_string(&NpmProcessState {
kind: NpmProcessStateKind::Snapshot(snapshot.into_serialized()),
local_node_modules_path: node_modules_path
.map(|p| p.to_string_lossy().to_string()),
})
.unwrap()
}
#[derive(Debug)]
pub struct ManagedNpmProcessStateProvider<TSys: DenoLibSys>(
pub ManagedNpmResolverRc<TSys>,
);
impl<TSys: DenoLibSys> NpmProcessStateProvider
for ManagedNpmProcessStateProvider<TSys>
{
fn get_npm_process_state(&self) -> String {
npm_process_state(
self.0.resolution().serialized_valid_snapshot(),
self.0.root_node_modules_path(),
)
}
}
#[derive(Debug)]
pub struct ByonmNpmProcessStateProvider<TSys: DenoLibSys>(
pub Arc<ByonmNpmResolver<TSys>>,
);
impl<TSys: DenoLibSys> NpmProcessStateProvider
for ByonmNpmProcessStateProvider<TSys>
{
fn get_npm_process_state(&self) -> String {
serde_json::to_string(&NpmProcessState {
kind: NpmProcessStateKind::Byonm,
local_node_modules_path: self
.0
.root_node_modules_path()
.map(|p| p.to_string_lossy().to_string()),
})
.unwrap()
}
}

@@ -6,13 +6,11 @@ use std::io::ErrorKind;
use std::path::Path;
use std::path::PathBuf;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::parking_lot::Mutex;
use deno_error::JsErrorBox;
use deno_runtime::deno_node::NodePermissions;
use sys_traits::FsCanonicalize;
use parking_lot::Mutex;
use crate::sys::CliSys;
use crate::sys::DenoLibSys;
#[derive(Debug)]
pub enum NpmRegistryReadPermissionCheckerMode {
@@ -22,14 +20,24 @@ pub enum NpmRegistryReadPermissionCheckerMode {
}
#[derive(Debug)]
pub struct NpmRegistryReadPermissionChecker {
sys: CliSys,
pub struct NpmRegistryReadPermissionChecker<TSys: DenoLibSys> {
sys: TSys,
cache: Mutex<HashMap<PathBuf, PathBuf>>,
mode: NpmRegistryReadPermissionCheckerMode,
}
impl NpmRegistryReadPermissionChecker {
pub fn new(sys: CliSys, mode: NpmRegistryReadPermissionCheckerMode) -> Self {
#[derive(Debug, thiserror::Error, deno_error::JsError)]
#[class(inherit)]
#[error("failed canonicalizing '{path}'")]
struct EnsureRegistryReadPermissionError {
path: PathBuf,
#[source]
#[inherit]
source: std::io::Error,
}
impl<TSys: DenoLibSys> NpmRegistryReadPermissionChecker<TSys> {
pub fn new(sys: TSys, mode: NpmRegistryReadPermissionCheckerMode) -> Self {
Self {
sys,
cache: Default::default(),
@@ -42,7 +50,7 @@ impl NpmRegistryReadPermissionChecker {
&self,
permissions: &mut dyn NodePermissions,
path: &'a Path,
) -> Result<Cow<'a, Path>, AnyError> {
) -> Result<Cow<'a, Path>, JsErrorBox> {
if permissions.query_read_all() {
return Ok(Cow::Borrowed(path)); // skip permissions checks below
}
@@ -52,7 +60,9 @@ impl NpmRegistryReadPermissionChecker {
if path.components().any(|c| c.as_os_str() == "node_modules") {
Ok(Cow::Borrowed(path))
} else {
permissions.check_read_path(path).map_err(Into::into)
permissions
.check_read_path(path)
.map_err(JsErrorBox::from_err)
}
}
NpmRegistryReadPermissionCheckerMode::Global(registry_path)
@@ -66,7 +76,7 @@ impl NpmRegistryReadPermissionChecker {
if is_path_in_node_modules {
let mut cache = self.cache.lock();
let mut canonicalize =
|path: &Path| -> Result<Option<PathBuf>, AnyError> {
|path: &Path| -> Result<Option<PathBuf>, JsErrorBox> {
match cache.get(path) {
Some(canon) => Ok(Some(canon.clone())),
None => match self.sys.fs_canonicalize(path) {
@@ -78,9 +88,12 @@
if e.kind() == ErrorKind::NotFound {
return Ok(None);
}
Err(AnyError::from(e)).with_context(|| {
format!("failed canonicalizing '{}'", path.display())
})
Err(JsErrorBox::from_err(
EnsureRegistryReadPermissionError {
path: path.to_path_buf(),
source: e,
},
))
}
},
}
@@ -98,7 +111,9 @@ impl NpmRegistryReadPermissionChecker {
}
}
permissions.check_read_path(path).map_err(Into::into)
permissions
.check_read_path(path)
.map_err(JsErrorBox::from_err)
}
}
}

@@ -1,8 +1,11 @@
// Copyright 2018-2025 the Deno authors. MIT license.
/// This module is shared between the build script and the binaries. Use it sparingly.
use deno_core::anyhow::bail;
use deno_core::error::AnyError;
use thiserror::Error;
#[derive(Debug, Error)]
#[error("Unrecognized release channel: {0}")]
pub struct UnrecognizedReleaseChannelError(pub String);
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum ReleaseChannel {
@@ -50,13 +53,17 @@ impl ReleaseChannel {
// NOTE(bartlomieju): do not ever change these values, tools like `patchver`
// rely on them.
#[allow(unused)]
pub fn deserialize(str_: &str) -> Result<Self, AnyError> {
pub fn deserialize(
str_: &str,
) -> Result<Self, UnrecognizedReleaseChannelError> {
Ok(match str_ {
"stable" => Self::Stable,
"canary" => Self::Canary,
"rc" => Self::Rc,
"lts" => Self::Lts,
unknown => bail!("Unrecognized release channel: {}", unknown),
unknown => {
return Err(UnrecognizedReleaseChannelError(unknown.to_string()))
}
})
}
}

@@ -0,0 +1,389 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::collections::BTreeMap;
use deno_config::workspace::PackageJsonDepResolution;
use deno_media_type::MediaType;
use deno_runtime::deno_permissions::PermissionsOptions;
use deno_runtime::deno_telemetry::OtelConfig;
use deno_semver::Version;
use indexmap::IndexMap;
use node_resolver::analyze::CjsAnalysisExports;
use serde::Deserialize;
use serde::Serialize;
use url::Url;
use super::virtual_fs::FileSystemCaseSensitivity;
use crate::args::UnstableConfig;
pub const MAGIC_BYTES: &[u8; 8] = b"d3n0l4nd";
pub trait DenoRtDeserializable<'a>: Sized {
fn deserialize(input: &'a [u8]) -> std::io::Result<(&'a [u8], Self)>;
}
impl<'a> DenoRtDeserializable<'a> for Cow<'a, [u8]> {
fn deserialize(input: &'a [u8]) -> std::io::Result<(&'a [u8], Self)> {
let (input, data) = read_bytes_with_u32_len(input)?;
Ok((input, Cow::Borrowed(data)))
}
}
pub trait DenoRtSerializable<'a> {
fn serialize(
&'a self,
builder: &mut capacity_builder::BytesBuilder<'a, Vec<u8>>,
);
}
#[derive(Deserialize, Serialize)]
pub enum NodeModules {
Managed {
/// Relative path for the node_modules directory in the vfs.
node_modules_dir: Option<String>,
},
Byonm {
root_node_modules_dir: Option<String>,
},
}
#[derive(Deserialize, Serialize)]
pub struct SerializedWorkspaceResolverImportMap {
pub specifier: String,
pub json: String,
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct SerializedResolverWorkspaceJsrPackage {
pub relative_base: String,
pub name: String,
pub version: Option<Version>,
pub exports: IndexMap<String, String>,
}
#[derive(Deserialize, Serialize)]
pub struct SerializedWorkspaceResolver {
pub import_map: Option<SerializedWorkspaceResolverImportMap>,
pub jsr_pkgs: Vec<SerializedResolverWorkspaceJsrPackage>,
pub package_jsons: BTreeMap<String, serde_json::Value>,
pub pkg_json_resolution: PackageJsonDepResolution,
}
// Note: Don't use hashmaps/hashsets. Ensure the serialization
// is deterministic.
#[derive(Deserialize, Serialize)]
pub struct Metadata {
pub argv: Vec<String>,
pub seed: Option<u64>,
pub code_cache_key: Option<u64>,
pub permissions: PermissionsOptions,
pub location: Option<Url>,
pub v8_flags: Vec<String>,
pub log_level: Option<log::Level>,
pub ca_stores: Option<Vec<String>>,
pub ca_data: Option<Vec<u8>>,
pub unsafely_ignore_certificate_errors: Option<Vec<String>>,
pub env_vars_from_env_file: IndexMap<String, String>,
pub workspace_resolver: SerializedWorkspaceResolver,
pub entrypoint_key: String,
pub node_modules: Option<NodeModules>,
pub unstable_config: UnstableConfig,
pub otel_config: OtelConfig,
pub vfs_case_sensitivity: FileSystemCaseSensitivity,
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct SpecifierId(u32);
impl SpecifierId {
pub fn new(id: u32) -> Self {
Self(id)
}
}
impl<'a> capacity_builder::BytesAppendable<'a> for SpecifierId {
fn append_to_builder<TBytes: capacity_builder::BytesType>(
self,
builder: &mut capacity_builder::BytesBuilder<'a, TBytes>,
) {
builder.append_le(self.0);
}
}
impl<'a> DenoRtSerializable<'a> for SpecifierId {
fn serialize(
&'a self,
builder: &mut capacity_builder::BytesBuilder<'a, Vec<u8>>,
) {
builder.append_le(self.0);
}
}
impl<'a> DenoRtDeserializable<'a> for SpecifierId {
fn deserialize(input: &'a [u8]) -> std::io::Result<(&'a [u8], Self)> {
let (input, id) = read_u32(input)?;
Ok((input, Self(id)))
}
}
#[derive(Deserialize, Serialize)]
pub enum CjsExportAnalysisEntry {
Esm,
Cjs(CjsAnalysisExports),
}
const HAS_TRANSPILED_FLAG: u8 = 1 << 0;
const HAS_SOURCE_MAP_FLAG: u8 = 1 << 1;
const HAS_CJS_EXPORT_ANALYSIS_FLAG: u8 = 1 << 2;
pub struct RemoteModuleEntry<'a> {
pub media_type: MediaType,
pub data: Cow<'a, [u8]>,
pub maybe_transpiled: Option<Cow<'a, [u8]>>,
pub maybe_source_map: Option<Cow<'a, [u8]>>,
pub maybe_cjs_export_analysis: Option<Cow<'a, [u8]>>,
}
impl<'a> DenoRtSerializable<'a> for RemoteModuleEntry<'a> {
fn serialize(
&'a self,
builder: &mut capacity_builder::BytesBuilder<'a, Vec<u8>>,
) {
fn append_maybe_data<'a>(
builder: &mut capacity_builder::BytesBuilder<'a, Vec<u8>>,
maybe_data: Option<&'a [u8]>,
) {
if let Some(data) = maybe_data {
builder.append_le(data.len() as u32);
builder.append(data);
}
}
let mut has_data_flags = 0;
if self.maybe_transpiled.is_some() {
has_data_flags |= HAS_TRANSPILED_FLAG;
}
if self.maybe_source_map.is_some() {
has_data_flags |= HAS_SOURCE_MAP_FLAG;
}
if self.maybe_cjs_export_analysis.is_some() {
has_data_flags |= HAS_CJS_EXPORT_ANALYSIS_FLAG;
}
builder.append(serialize_media_type(self.media_type));
builder.append_le(self.data.len() as u32);
builder.append(self.data.as_ref());
builder.append(has_data_flags);
append_maybe_data(builder, self.maybe_transpiled.as_deref());
append_maybe_data(builder, self.maybe_source_map.as_deref());
append_maybe_data(builder, self.maybe_cjs_export_analysis.as_deref());
}
}
impl<'a> DenoRtDeserializable<'a> for RemoteModuleEntry<'a> {
fn deserialize(input: &'a [u8]) -> std::io::Result<(&'a [u8], Self)> {
#[allow(clippy::type_complexity)]
fn deserialize_data_if_has_flag(
input: &[u8],
has_data_flags: u8,
flag: u8,
) -> std::io::Result<(&[u8], Option<Cow<[u8]>>)> {
if has_data_flags & flag != 0 {
let (input, bytes) = read_bytes_with_u32_len(input)?;
Ok((input, Some(Cow::Borrowed(bytes))))
} else {
Ok((input, None))
}
}
let (input, media_type) = MediaType::deserialize(input)?;
let (input, data) = read_bytes_with_u32_len(input)?;
let (input, has_data_flags) = read_u8(input)?;
let (input, maybe_transpiled) =
deserialize_data_if_has_flag(input, has_data_flags, HAS_TRANSPILED_FLAG)?;
let (input, maybe_source_map) =
deserialize_data_if_has_flag(input, has_data_flags, HAS_SOURCE_MAP_FLAG)?;
let (input, maybe_cjs_export_analysis) = deserialize_data_if_has_flag(
input,
has_data_flags,
HAS_CJS_EXPORT_ANALYSIS_FLAG,
)?;
Ok((
input,
Self {
media_type,
data: Cow::Borrowed(data),
maybe_transpiled,
maybe_source_map,
maybe_cjs_export_analysis,
},
))
}
  }
}
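The flag-byte scheme above (one bitmask byte announcing which optional sections are present, each present section then appended length-prefixed in a fixed order) can be exercised with a minimal standalone sketch. The two sections and their names here are hypothetical stand-ins; the real entry also carries a media type and the primary source data:

```rust
// Sketch of the bitflag + length-prefixed optional-section layout.
const HAS_A: u8 = 1 << 0;
const HAS_B: u8 = 1 << 1;

fn encode(a: Option<&[u8]>, b: Option<&[u8]>) -> Vec<u8> {
    let mut flags = 0u8;
    if a.is_some() {
        flags |= HAS_A;
    }
    if b.is_some() {
        flags |= HAS_B;
    }
    // Absent sections cost nothing beyond their bit in the flag byte.
    let mut out = vec![flags];
    for section in [a, b].into_iter().flatten() {
        out.extend_from_slice(&(section.len() as u32).to_le_bytes());
        out.extend_from_slice(section);
    }
    out
}

// Panics on malformed input; a real decoder would return io::Error.
fn decode(input: &[u8]) -> (Option<Vec<u8>>, Option<Vec<u8>>) {
    let flags = input[0];
    let mut rest = &input[1..];
    let mut take = |flag: u8| -> Option<Vec<u8>> {
        if flags & flag == 0 {
            return None;
        }
        let len = u32::from_le_bytes(rest[..4].try_into().unwrap()) as usize;
        let data = rest[4..4 + len].to_vec();
        rest = &rest[4 + len..];
        Some(data)
    };
    (take(HAS_A), take(HAS_B))
}

fn main() {
    let encoded = encode(Some(b"transpiled".as_slice()), None);
    assert_eq!(decode(&encoded), (Some(b"transpiled".to_vec()), None));
    println!("roundtrip ok");
}
```

Because the section order is fixed, decoding never needs per-section tags: the bitmask alone determines which length-prefixed blobs follow and in what order.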
fn serialize_media_type(media_type: MediaType) -> u8 {
match media_type {
MediaType::JavaScript => 0,
MediaType::Jsx => 1,
MediaType::Mjs => 2,
MediaType::Cjs => 3,
MediaType::TypeScript => 4,
MediaType::Mts => 5,
MediaType::Cts => 6,
MediaType::Dts => 7,
MediaType::Dmts => 8,
MediaType::Dcts => 9,
MediaType::Tsx => 10,
MediaType::Json => 11,
MediaType::Wasm => 12,
MediaType::Css => 13,
MediaType::SourceMap => 14,
MediaType::Unknown => 15,
}
}
impl<'a> DenoRtDeserializable<'a> for MediaType {
fn deserialize(input: &'a [u8]) -> std::io::Result<(&'a [u8], Self)> {
let (input, value) = read_u8(input)?;
let value = match value {
0 => MediaType::JavaScript,
1 => MediaType::Jsx,
2 => MediaType::Mjs,
3 => MediaType::Cjs,
4 => MediaType::TypeScript,
5 => MediaType::Mts,
6 => MediaType::Cts,
7 => MediaType::Dts,
8 => MediaType::Dmts,
9 => MediaType::Dcts,
10 => MediaType::Tsx,
11 => MediaType::Json,
12 => MediaType::Wasm,
13 => MediaType::Css,
14 => MediaType::SourceMap,
15 => MediaType::Unknown,
value => {
return Err(std::io::Error::new(
std::io::ErrorKind::InvalidData,
format!("Unknown media type value: {value}"),
))
}
};
Ok((input, value))
}
}
/// Data stored keyed by specifier.
pub struct SpecifierDataStore<TData> {
data: IndexMap<SpecifierId, TData>,
}
impl<TData> Default for SpecifierDataStore<TData> {
fn default() -> Self {
Self {
data: IndexMap::new(),
}
}
}
impl<TData> SpecifierDataStore<TData> {
pub fn with_capacity(capacity: usize) -> Self {
Self {
data: IndexMap::with_capacity(capacity),
}
}
pub fn iter(&self) -> impl Iterator<Item = (SpecifierId, &TData)> {
self.data.iter().map(|(k, v)| (*k, v))
}
#[allow(clippy::len_without_is_empty)]
pub fn len(&self) -> usize {
self.data.len()
}
pub fn contains(&self, specifier: SpecifierId) -> bool {
self.data.contains_key(&specifier)
}
pub fn add(&mut self, specifier: SpecifierId, value: TData) {
self.data.insert(specifier, value);
}
pub fn get(&self, specifier: SpecifierId) -> Option<&TData> {
self.data.get(&specifier)
}
}
impl<'a, TData> SpecifierDataStore<TData>
where
TData: DenoRtSerializable<'a> + 'a,
{
pub fn serialize(
&'a self,
builder: &mut capacity_builder::BytesBuilder<'a, Vec<u8>>,
) {
builder.append_le(self.len() as u32);
for (specifier, value) in self.iter() {
builder.append(specifier);
value.serialize(builder);
}
}
}
impl<'a, TData> DenoRtDeserializable<'a> for SpecifierDataStore<TData>
where
TData: DenoRtDeserializable<'a>,
{
fn deserialize(input: &'a [u8]) -> std::io::Result<(&'a [u8], Self)> {
let (input, len) = read_u32_as_usize(input)?;
let mut data = IndexMap::with_capacity(len);
let mut input = input;
for _ in 0..len {
let (new_input, specifier) = SpecifierId::deserialize(input)?;
let (new_input, value) = TData::deserialize(new_input)?;
data.insert(specifier, value);
input = new_input;
}
Ok((input, Self { data }))
}
}
fn read_bytes_with_u32_len(input: &[u8]) -> std::io::Result<(&[u8], &[u8])> {
let (input, len) = read_u32_as_usize(input)?;
let (input, data) = read_bytes(input, len)?;
Ok((input, data))
}
fn read_u32_as_usize(input: &[u8]) -> std::io::Result<(&[u8], usize)> {
read_u32(input).map(|(input, len)| (input, len as usize))
}
fn read_u32(input: &[u8]) -> std::io::Result<(&[u8], u32)> {
let (input, len_bytes) = read_bytes(input, 4)?;
let len = u32::from_le_bytes(len_bytes.try_into().unwrap());
Ok((input, len))
}
fn read_u8(input: &[u8]) -> std::io::Result<(&[u8], u8)> {
check_has_len(input, 1)?;
Ok((&input[1..], input[0]))
}
fn read_bytes(input: &[u8], len: usize) -> std::io::Result<(&[u8], &[u8])> {
check_has_len(input, len)?;
let (len_bytes, input) = input.split_at(len);
Ok((input, len_bytes))
}
#[inline(always)]
fn check_has_len(input: &[u8], len: usize) -> std::io::Result<()> {
if input.len() < len {
Err(std::io::Error::new(
std::io::ErrorKind::InvalidData,
"Unexpected end of data",
))
} else {
Ok(())
}
}
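The little-endian, length-prefixed layout these helpers parse is easy to sanity-check in isolation. The sketch below copies the essential logic of `read_u32` and `read_bytes_with_u32_len` into a self-contained program:

```rust
// Self-contained copy of the parsing helpers above.
fn read_u32(input: &[u8]) -> std::io::Result<(&[u8], u32)> {
    if input.len() < 4 {
        return Err(std::io::Error::new(
            std::io::ErrorKind::InvalidData,
            "Unexpected end of data",
        ));
    }
    let (len_bytes, rest) = input.split_at(4);
    Ok((rest, u32::from_le_bytes(len_bytes.try_into().unwrap())))
}

// Reads a u32 little-endian length prefix, then that many bytes.
fn read_bytes_with_u32_len(input: &[u8]) -> std::io::Result<(&[u8], &[u8])> {
    let (input, len) = read_u32(input)?;
    let len = len as usize;
    if input.len() < len {
        return Err(std::io::Error::new(
            std::io::ErrorKind::InvalidData,
            "Unexpected end of data",
        ));
    }
    let (data, rest) = input.split_at(len);
    Ok((rest, data))
}

fn main() {
    // 3-byte payload "abc" behind a u32 little-endian length prefix.
    let mut buf = Vec::new();
    buf.extend_from_slice(&3u32.to_le_bytes());
    buf.extend_from_slice(b"abc");
    let (rest, data) = read_bytes_with_u32_len(&buf).unwrap();
    assert_eq!(data, b"abc".as_slice());
    assert!(rest.is_empty());
    println!("parsed {} bytes", data.len()); // prints "parsed 3 bytes"
}
```

Each helper returns the unconsumed remainder alongside the parsed value, which is what lets the `SpecifierDataStore` deserializer above thread `input` through a loop of entries.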

@@ -0,0 +1,4 @@
// Copyright 2018-2025 the Deno authors. MIT license.
pub mod binary;
pub mod virtual_fs;

@@ -0,0 +1,999 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::cmp::Ordering;
use std::collections::hash_map::Entry;
use std::collections::HashMap;
use std::collections::VecDeque;
use std::fmt;
use std::path::Path;
use std::path::PathBuf;
use deno_path_util::normalize_path;
use deno_path_util::strip_unc_prefix;
use deno_runtime::colors;
use deno_runtime::deno_core::anyhow::bail;
use deno_runtime::deno_core::anyhow::Context;
use deno_runtime::deno_core::error::AnyError;
use indexmap::IndexSet;
use serde::de;
use serde::de::SeqAccess;
use serde::de::Visitor;
use serde::Deserialize;
use serde::Deserializer;
use serde::Serialize;
use serde::Serializer;
#[derive(Debug, PartialEq, Eq)]
pub enum WindowsSystemRootablePath {
/// The root of the system above any drive letters.
WindowSystemRoot,
Path(PathBuf),
}
impl WindowsSystemRootablePath {
pub fn root_for_current_os() -> Self {
if cfg!(windows) {
WindowsSystemRootablePath::WindowSystemRoot
} else {
WindowsSystemRootablePath::Path(PathBuf::from("/"))
}
}
pub fn join(&self, name_component: &str) -> PathBuf {
// this method doesn't handle multiple components
debug_assert!(
!name_component.contains('\\'),
"Invalid component: {}",
name_component
);
debug_assert!(
!name_component.contains('/'),
"Invalid component: {}",
name_component
);
match self {
WindowsSystemRootablePath::WindowSystemRoot => {
// windows drive letter
PathBuf::from(&format!("{}\\", name_component))
}
WindowsSystemRootablePath::Path(path) => path.join(name_component),
}
}
}
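A minimal model of the `join` logic above, with `None` standing in for `WindowSystemRoot` (the Windows system root, where the joined component is a drive letter and gets a trailing backslash):

```rust
use std::path::PathBuf;

// Sketch of WindowsSystemRootablePath::join: at the system root the
// component is treated as a drive letter ("C:" -> "C:\"); otherwise
// it is a plain path join.
fn join_root(root: Option<&PathBuf>, name_component: &str) -> PathBuf {
    match root {
        None => PathBuf::from(format!("{}\\", name_component)),
        Some(path) => path.join(name_component),
    }
}

fn main() {
    assert_eq!(join_root(None, "C:"), PathBuf::from("C:\\"));
    assert_eq!(
        join_root(Some(&PathBuf::from("/tmp")), "deno"),
        PathBuf::from("/tmp/deno")
    );
    println!("ok");
}
```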
#[derive(Debug, Copy, Clone, Serialize, Deserialize)]
pub enum FileSystemCaseSensitivity {
#[serde(rename = "s")]
Sensitive,
#[serde(rename = "i")]
Insensitive,
}
#[derive(Debug, Default, Serialize, Deserialize)]
pub struct VirtualDirectoryEntries(Vec<VfsEntry>);
impl VirtualDirectoryEntries {
pub fn new(mut entries: Vec<VfsEntry>) -> Self {
// needs to be sorted by name
entries.sort_by(|a, b| a.name().cmp(b.name()));
Self(entries)
}
pub fn iter_mut(&mut self) -> std::slice::IterMut<'_, VfsEntry> {
self.0.iter_mut()
}
pub fn iter(&self) -> std::slice::Iter<'_, VfsEntry> {
self.0.iter()
}
pub fn take_inner(&mut self) -> Vec<VfsEntry> {
std::mem::take(&mut self.0)
}
pub fn is_empty(&self) -> bool {
self.0.is_empty()
}
pub fn len(&self) -> usize {
self.0.len()
}
pub fn get_by_name(
&self,
name: &str,
case_sensitivity: FileSystemCaseSensitivity,
) -> Option<&VfsEntry> {
self
.binary_search(name, case_sensitivity)
.ok()
.map(|index| &self.0[index])
}
pub fn get_mut_by_name(
&mut self,
name: &str,
case_sensitivity: FileSystemCaseSensitivity,
) -> Option<&mut VfsEntry> {
self
.binary_search(name, case_sensitivity)
.ok()
.map(|index| &mut self.0[index])
}
pub fn get_mut_by_index(&mut self, index: usize) -> Option<&mut VfsEntry> {
self.0.get_mut(index)
}
pub fn get_by_index(&self, index: usize) -> Option<&VfsEntry> {
self.0.get(index)
}
pub fn binary_search(
&self,
name: &str,
case_sensitivity: FileSystemCaseSensitivity,
) -> Result<usize, usize> {
match case_sensitivity {
FileSystemCaseSensitivity::Sensitive => {
self.0.binary_search_by(|e| e.name().cmp(name))
}
FileSystemCaseSensitivity::Insensitive => self.0.binary_search_by(|e| {
e.name()
.chars()
.zip(name.chars())
.map(|(a, b)| a.to_ascii_lowercase().cmp(&b.to_ascii_lowercase()))
.find(|&ord| ord != Ordering::Equal)
.unwrap_or_else(|| e.name().len().cmp(&name.len()))
}),
}
}
pub fn insert(
&mut self,
entry: VfsEntry,
case_sensitivity: FileSystemCaseSensitivity,
) -> usize {
match self.binary_search(entry.name(), case_sensitivity) {
Ok(index) => {
self.0[index] = entry;
index
}
Err(insert_index) => {
self.0.insert(insert_index, entry);
insert_index
}
}
}
pub fn insert_or_modify(
&mut self,
name: &str,
case_sensitivity: FileSystemCaseSensitivity,
on_insert: impl FnOnce() -> VfsEntry,
on_modify: impl FnOnce(&mut VfsEntry),
) -> usize {
match self.binary_search(name, case_sensitivity) {
Ok(index) => {
on_modify(&mut self.0[index]);
index
}
Err(insert_index) => {
self.0.insert(insert_index, on_insert());
insert_index
}
}
}
pub fn remove(&mut self, index: usize) -> VfsEntry {
self.0.remove(index)
}
}
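The ASCII case-insensitive ordering used by `binary_search` above can be isolated and checked on its own: it compares lowercased characters pairwise and falls back to a length comparison when one name is a prefix of the other, so a name list sorted under this ordering can be probed case-insensitively:

```rust
use std::cmp::Ordering;

// Same comparison as the Insensitive arm of binary_search above.
fn insensitive_cmp(a: &str, b: &str) -> Ordering {
    a.chars()
        .zip(b.chars())
        .map(|(x, y)| x.to_ascii_lowercase().cmp(&y.to_ascii_lowercase()))
        .find(|&ord| ord != Ordering::Equal)
        .unwrap_or_else(|| a.len().cmp(&b.len()))
}

fn main() {
    // This list is sorted consistently under the insensitive ordering.
    let entries = ["Cargo.toml", "README.md", "src"];
    let found = entries
        .binary_search_by(|e| insensitive_cmp(e, "readme.MD"))
        .is_ok();
    println!("found: {found}"); // prints "found: true"
}
```

The length fallback matters: without it, `"foo"` and `"foobar"` would compare equal after the zipped prefix is exhausted, breaking the total order that binary search requires.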
#[derive(Debug, Serialize, Deserialize)]
pub struct VirtualDirectory {
#[serde(rename = "n")]
pub name: String,
// should be sorted by name
#[serde(rename = "e")]
pub entries: VirtualDirectoryEntries,
}
#[derive(Debug, Clone, Copy)]
pub struct OffsetWithLength {
pub offset: u64,
pub len: u64,
}
// serialize as an array in order to save space
impl Serialize for OffsetWithLength {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let array = [self.offset, self.len];
array.serialize(serializer)
}
}
impl<'de> Deserialize<'de> for OffsetWithLength {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
struct OffsetWithLengthVisitor;
impl<'de> Visitor<'de> for OffsetWithLengthVisitor {
type Value = OffsetWithLength;
fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
formatter.write_str("an array with two elements: [offset, len]")
}
fn visit_seq<A>(self, mut seq: A) -> Result<Self::Value, A::Error>
where
A: SeqAccess<'de>,
{
let offset = seq
.next_element()?
.ok_or_else(|| de::Error::invalid_length(0, &self))?;
let len = seq
.next_element()?
.ok_or_else(|| de::Error::invalid_length(1, &self))?;
Ok(OffsetWithLength { offset, len })
}
}
deserializer.deserialize_seq(OffsetWithLengthVisitor)
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct VirtualFile {
#[serde(rename = "n")]
pub name: String,
#[serde(rename = "o")]
pub offset: OffsetWithLength,
#[serde(rename = "m", skip_serializing_if = "Option::is_none")]
pub transpiled_offset: Option<OffsetWithLength>,
#[serde(rename = "c", skip_serializing_if = "Option::is_none")]
pub cjs_export_analysis_offset: Option<OffsetWithLength>,
#[serde(rename = "s", skip_serializing_if = "Option::is_none")]
pub source_map_offset: Option<OffsetWithLength>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct VirtualSymlinkParts(Vec<String>);
impl VirtualSymlinkParts {
pub fn from_path(path: &Path) -> Self {
Self(
path
.components()
.filter(|c| !matches!(c, std::path::Component::RootDir))
.map(|c| c.as_os_str().to_string_lossy().to_string())
.collect(),
)
}
pub fn take_parts(&mut self) -> Vec<String> {
std::mem::take(&mut self.0)
}
pub fn parts(&self) -> &[String] {
&self.0
}
pub fn set_parts(&mut self, parts: Vec<String>) {
self.0 = parts;
}
pub fn display(&self) -> String {
self.0.join("/")
}
}
#[derive(Debug, Serialize, Deserialize)]
pub struct VirtualSymlink {
#[serde(rename = "n")]
pub name: String,
#[serde(rename = "p")]
pub dest_parts: VirtualSymlinkParts,
}
impl VirtualSymlink {
pub fn resolve_dest_from_root(&self, root: &Path) -> PathBuf {
let mut dest = root.to_path_buf();
for part in &self.dest_parts.0 {
dest.push(part);
}
dest
}
}
#[derive(Debug, Copy, Clone)]
pub enum VfsEntryRef<'a> {
Dir(&'a VirtualDirectory),
File(&'a VirtualFile),
Symlink(&'a VirtualSymlink),
}
impl VfsEntryRef<'_> {
pub fn name(&self) -> &str {
match self {
Self::Dir(dir) => &dir.name,
Self::File(file) => &file.name,
Self::Symlink(symlink) => &symlink.name,
}
}
}
// todo(dsherret): we should store this more efficiently in the binary
#[derive(Debug, Serialize, Deserialize)]
pub enum VfsEntry {
Dir(VirtualDirectory),
File(VirtualFile),
Symlink(VirtualSymlink),
}
impl VfsEntry {
pub fn name(&self) -> &str {
match self {
Self::Dir(dir) => &dir.name,
Self::File(file) => &file.name,
Self::Symlink(symlink) => &symlink.name,
}
}
pub fn as_ref(&self) -> VfsEntryRef {
match self {
VfsEntry::Dir(dir) => VfsEntryRef::Dir(dir),
VfsEntry::File(file) => VfsEntryRef::File(file),
VfsEntry::Symlink(symlink) => VfsEntryRef::Symlink(symlink),
}
}
}
pub static DENO_COMPILE_GLOBAL_NODE_MODULES_DIR_NAME: &str =
".deno_compile_node_modules";
#[derive(Debug)]
pub struct BuiltVfs {
pub root_path: WindowsSystemRootablePath,
pub case_sensitivity: FileSystemCaseSensitivity,
pub entries: VirtualDirectoryEntries,
pub files: Vec<Vec<u8>>,
}
#[derive(Debug, Default)]
struct FilesData {
files: Vec<Vec<u8>>,
current_offset: u64,
file_offsets: HashMap<(String, usize), OffsetWithLength>,
}
impl FilesData {
pub fn file_bytes(&self, offset: OffsetWithLength) -> Option<&[u8]> {
if offset.len == 0 {
return Some(&[]);
}
// the debug assertions in this method should never trigger, because
// failing would indicate an offset that is not in the vfs

let mut count: u64 = 0;
for file in &self.files {
// clippy wanted a match
match count.cmp(&offset.offset) {
Ordering::Equal => {
debug_assert_eq!(offset.len, file.len() as u64);
if offset.len == file.len() as u64 {
return Some(file);
} else {
return None;
}
}
Ordering::Less => {
count += file.len() as u64;
}
Ordering::Greater => {
debug_assert!(false);
return None;
}
}
}
debug_assert!(false);
None
}
pub fn add_data(&mut self, data: Vec<u8>) -> OffsetWithLength {
if data.is_empty() {
return OffsetWithLength { offset: 0, len: 0 };
}
let checksum = crate::util::checksum::gen(&[&data]);
match self.file_offsets.entry((checksum, data.len())) {
Entry::Occupied(occupied_entry) => {
let offset_and_len = *occupied_entry.get();
debug_assert_eq!(data.len() as u64, offset_and_len.len);
offset_and_len
}
Entry::Vacant(vacant_entry) => {
let offset_and_len = OffsetWithLength {
offset: self.current_offset,
len: data.len() as u64,
};
vacant_entry.insert(offset_and_len);
self.current_offset += offset_and_len.len;
self.files.push(data);
offset_and_len
}
}
}
}
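The deduplication in `add_data` above keys stored offsets on `(checksum, length)` so identical file contents are written once. Here is a simplified standalone sketch; the real code uses `crate::util::checksum::gen`, for which `DefaultHasher` stands in as a placeholder checksum.

```rust
// Simplified sketch of the dedup in `FilesData::add_data` above: identical
// contents map to the same (offset, len), and only unique blobs are stored.
use std::collections::hash_map::{DefaultHasher, Entry};
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

#[derive(Default)]
struct FilesData {
  files: Vec<Vec<u8>>,
  current_offset: u64,
  file_offsets: HashMap<(u64, usize), (u64, u64)>, // -> (offset, len)
}

impl FilesData {
  fn add_data(&mut self, data: Vec<u8>) -> (u64, u64) {
    if data.is_empty() {
      return (0, 0);
    }
    let mut hasher = DefaultHasher::new();
    data.hash(&mut hasher);
    match self.file_offsets.entry((hasher.finish(), data.len())) {
      Entry::Occupied(e) => *e.get(), // same content seen before: reuse it
      Entry::Vacant(e) => {
        let offset_and_len = (self.current_offset, data.len() as u64);
        e.insert(offset_and_len);
        self.current_offset += data.len() as u64;
        self.files.push(data);
        offset_and_len
      }
    }
  }
}

fn main() {
  let mut files = FilesData::default();
  let a = files.add_data(b"hello".to_vec());
  let b = files.add_data(b"world".to_vec());
  let c = files.add_data(b"hello".to_vec()); // duplicate content
  assert_eq!(a, (0, 5));
  assert_eq!(b, (5, 5));
  assert_eq!(c, a); // deduplicated: only two blobs stored
  assert_eq!(files.files.len(), 2);
}
```

Keying on length as well as checksum keeps a hash collision between different-sized inputs from aliasing two files.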
pub struct AddFileDataOptions {
pub data: Vec<u8>,
pub maybe_transpiled: Option<Vec<u8>>,
pub maybe_source_map: Option<Vec<u8>>,
pub maybe_cjs_export_analysis: Option<Vec<u8>>,
}
#[derive(Debug)]
pub struct VfsBuilder {
executable_root: VirtualDirectory,
files: FilesData,
/// The minimum root directory that should be included in the VFS.
min_root_dir: Option<WindowsSystemRootablePath>,
case_sensitivity: FileSystemCaseSensitivity,
}
impl Default for VfsBuilder {
fn default() -> Self {
Self::new()
}
}
impl VfsBuilder {
pub fn new() -> Self {
Self {
executable_root: VirtualDirectory {
name: "/".to_string(),
entries: Default::default(),
},
files: Default::default(),
min_root_dir: Default::default(),
// This is not exactly correct because file systems on these OSes
// may be case-sensitive or not based on the directory, but this
// is a good enough approximation and limitation. In the future,
// we may want to store this information per directory instead
// depending on the feedback we get.
case_sensitivity: if cfg!(windows) || cfg!(target_os = "macos") {
FileSystemCaseSensitivity::Insensitive
} else {
FileSystemCaseSensitivity::Sensitive
},
}
}
pub fn case_sensitivity(&self) -> FileSystemCaseSensitivity {
self.case_sensitivity
}
pub fn files_len(&self) -> usize {
self.files.files.len()
}
pub fn file_bytes(&self, offset: OffsetWithLength) -> Option<&[u8]> {
self.files.file_bytes(offset)
}
/// Add a directory that might be the minimum root directory
/// of the VFS.
///
/// For example, say the user has a deno.json and specifies an
/// import map in a parent directory. The import map won't be
/// included in the VFS, but its base directory will be, meaning we
/// need to tell the VFS builder to include the base of the import
/// map by calling this method.
pub fn add_possible_min_root_dir(&mut self, path: &Path) {
self.add_dir_raw(path);
match &self.min_root_dir {
Some(WindowsSystemRootablePath::WindowSystemRoot) => {
// already the root dir
}
Some(WindowsSystemRootablePath::Path(current_path)) => {
let mut common_components = Vec::new();
for (a, b) in current_path.components().zip(path.components()) {
if a != b {
break;
}
common_components.push(a);
}
if common_components.is_empty() {
self.min_root_dir =
Some(WindowsSystemRootablePath::root_for_current_os());
} else {
self.min_root_dir = Some(WindowsSystemRootablePath::Path(
common_components.iter().collect(),
));
}
}
None => {
self.min_root_dir =
Some(WindowsSystemRootablePath::Path(path.to_path_buf()));
}
}
}
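The common-prefix computation inside `add_possible_min_root_dir` above (zip the components of the current minimum root with the new path and keep them while they agree) can be sketched in isolation as a deepest-shared-ancestor helper. The function name here is illustrative, not from the source.

```rust
// Standalone sketch of the component-zipping above: walk two absolute
// paths in lockstep and collect components until they diverge, yielding
// the deepest shared ancestor directory.
use std::path::{Path, PathBuf};

fn common_ancestor(a: &Path, b: &Path) -> PathBuf {
  a.components()
    .zip(b.components())
    .take_while(|(x, y)| x == y)
    .map(|(x, _)| x)
    .collect()
}

fn main() {
  let ancestor = common_ancestor(
    Path::new("/home/user/project/src"),
    Path::new("/home/user/import_map"),
  );
  assert_eq!(ancestor, PathBuf::from("/home/user"));
}
```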
pub fn add_dir_recursive(&mut self, path: &Path) -> Result<(), AnyError> {
let target_path = self.resolve_target_path(path)?;
self.add_dir_recursive_not_symlink(&target_path)
}
fn add_dir_recursive_not_symlink(
&mut self,
path: &Path,
) -> Result<(), AnyError> {
self.add_dir_raw(path);
// ok, building fs implementation
#[allow(clippy::disallowed_methods)]
let read_dir = std::fs::read_dir(path)
.with_context(|| format!("Reading {}", path.display()))?;
let mut dir_entries =
read_dir.into_iter().collect::<Result<Vec<_>, _>>()?;
dir_entries.sort_by_cached_key(|entry| entry.file_name()); // determinism
for entry in dir_entries {
let file_type = entry.file_type()?;
let path = entry.path();
if file_type.is_dir() {
self.add_dir_recursive_not_symlink(&path)?;
} else if file_type.is_file() {
self.add_file_at_path_not_symlink(&path)?;
} else if file_type.is_symlink() {
match self.add_symlink(&path) {
Ok(target) => match target {
SymlinkTarget::File(target) => {
self.add_file_at_path_not_symlink(&target)?
}
SymlinkTarget::Dir(target) => {
self.add_dir_recursive_not_symlink(&target)?;
}
},
Err(err) => {
log::warn!(
"{} Failed resolving symlink. Ignoring.\n Path: {}\n Message: {:#}",
colors::yellow("Warning"),
path.display(),
err
);
}
}
}
}
Ok(())
}
fn add_dir_raw(&mut self, path: &Path) -> &mut VirtualDirectory {
log::debug!("Ensuring directory '{}'", path.display());
debug_assert!(path.is_absolute());
let mut current_dir = &mut self.executable_root;
for component in path.components() {
if matches!(component, std::path::Component::RootDir) {
continue;
}
let name = component.as_os_str().to_string_lossy();
let index = current_dir.entries.insert_or_modify(
&name,
self.case_sensitivity,
|| {
VfsEntry::Dir(VirtualDirectory {
name: name.to_string(),
entries: Default::default(),
})
},
|_| {
// ignore
},
);
match current_dir.entries.get_mut_by_index(index) {
Some(VfsEntry::Dir(dir)) => {
current_dir = dir;
}
_ => unreachable!(),
};
}
current_dir
}
pub fn get_system_root_dir_mut(&mut self) -> &mut VirtualDirectory {
&mut self.executable_root
}
pub fn get_dir_mut(&mut self, path: &Path) -> Option<&mut VirtualDirectory> {
debug_assert!(path.is_absolute());
let mut current_dir = &mut self.executable_root;
for component in path.components() {
if matches!(component, std::path::Component::RootDir) {
continue;
}
let name = component.as_os_str().to_string_lossy();
let entry = current_dir
.entries
.get_mut_by_name(&name, self.case_sensitivity)?;
match entry {
VfsEntry::Dir(dir) => {
current_dir = dir;
}
_ => unreachable!("{}", path.display()),
};
}
Some(current_dir)
}
pub fn add_file_at_path(&mut self, path: &Path) -> Result<(), AnyError> {
// ok, building fs implementation
#[allow(clippy::disallowed_methods)]
let file_bytes = std::fs::read(path)
.with_context(|| format!("Reading {}", path.display()))?;
self.add_file_with_data(
path,
AddFileDataOptions {
data: file_bytes,
maybe_cjs_export_analysis: None,
maybe_transpiled: None,
maybe_source_map: None,
},
)
}
fn add_file_at_path_not_symlink(
&mut self,
path: &Path,
) -> Result<(), AnyError> {
// ok, building fs implementation
#[allow(clippy::disallowed_methods)]
let file_bytes = std::fs::read(path)
.with_context(|| format!("Reading {}", path.display()))?;
self.add_file_with_data_raw(path, file_bytes)
}
pub fn add_file_with_data(
&mut self,
path: &Path,
options: AddFileDataOptions,
) -> Result<(), AnyError> {
// ok, fs implementation
#[allow(clippy::disallowed_methods)]
let metadata = std::fs::symlink_metadata(path).with_context(|| {
format!("Resolving target path for '{}'", path.display())
})?;
if metadata.is_symlink() {
let target = self.add_symlink(path)?.into_path_buf();
self.add_file_with_data_raw_options(&target, options)
} else {
self.add_file_with_data_raw_options(path, options)
}
}
pub fn add_file_with_data_raw(
&mut self,
path: &Path,
data: Vec<u8>,
) -> Result<(), AnyError> {
self.add_file_with_data_raw_options(
path,
AddFileDataOptions {
data,
maybe_transpiled: None,
maybe_cjs_export_analysis: None,
maybe_source_map: None,
},
)
}
fn add_file_with_data_raw_options(
&mut self,
path: &Path,
options: AddFileDataOptions,
) -> Result<(), AnyError> {
log::debug!("Adding file '{}'", path.display());
let case_sensitivity = self.case_sensitivity;
let offset_and_len = self.files.add_data(options.data);
let transpiled_offset = options
.maybe_transpiled
.map(|data| self.files.add_data(data));
let source_map_offset = options
.maybe_source_map
.map(|data| self.files.add_data(data));
let cjs_export_analysis_offset = options
.maybe_cjs_export_analysis
.map(|data| self.files.add_data(data));
let dir = self.add_dir_raw(path.parent().unwrap());
let name = path.file_name().unwrap().to_string_lossy();
dir.entries.insert_or_modify(
&name,
case_sensitivity,
|| {
VfsEntry::File(VirtualFile {
name: name.to_string(),
offset: offset_and_len,
transpiled_offset,
cjs_export_analysis_offset,
source_map_offset,
})
},
|entry| match entry {
VfsEntry::File(virtual_file) => {
virtual_file.offset = offset_and_len;
// doesn't overwrite to None
if transpiled_offset.is_some() {
virtual_file.transpiled_offset = transpiled_offset;
}
if source_map_offset.is_some() {
virtual_file.source_map_offset = source_map_offset;
}
if cjs_export_analysis_offset.is_some() {
virtual_file.cjs_export_analysis_offset =
cjs_export_analysis_offset;
}
}
VfsEntry::Dir(_) | VfsEntry::Symlink(_) => unreachable!(),
},
);
Ok(())
}
fn resolve_target_path(&mut self, path: &Path) -> Result<PathBuf, AnyError> {
// ok, fs implementation
#[allow(clippy::disallowed_methods)]
let metadata = std::fs::symlink_metadata(path).with_context(|| {
format!("Resolving target path for '{}'", path.display())
})?;
if metadata.is_symlink() {
Ok(self.add_symlink(path)?.into_path_buf())
} else {
Ok(path.to_path_buf())
}
}
pub fn add_symlink(
&mut self,
path: &Path,
) -> Result<SymlinkTarget, AnyError> {
self.add_symlink_inner(path, &mut IndexSet::new())
}
fn add_symlink_inner(
&mut self,
path: &Path,
visited: &mut IndexSet<PathBuf>,
) -> Result<SymlinkTarget, AnyError> {
log::debug!("Adding symlink '{}'", path.display());
let target = strip_unc_prefix(
// ok, fs implementation
#[allow(clippy::disallowed_methods)]
std::fs::read_link(path)
.with_context(|| format!("Reading symlink '{}'", path.display()))?,
);
let case_sensitivity = self.case_sensitivity;
let target = normalize_path(path.parent().unwrap().join(&target));
let dir = self.add_dir_raw(path.parent().unwrap());
let name = path.file_name().unwrap().to_string_lossy();
dir.entries.insert_or_modify(
&name,
case_sensitivity,
|| {
VfsEntry::Symlink(VirtualSymlink {
name: name.to_string(),
dest_parts: VirtualSymlinkParts::from_path(&target),
})
},
|_| {
// ignore previously inserted
},
);
// ok, fs implementation
#[allow(clippy::disallowed_methods)]
let target_metadata =
std::fs::symlink_metadata(&target).with_context(|| {
format!("Reading symlink target '{}'", target.display())
})?;
if target_metadata.is_symlink() {
if !visited.insert(target.clone()) {
// todo: probably don't error in this scenario
bail!(
"Circular symlink detected: {} -> {}",
visited
.iter()
.map(|p| p.display().to_string())
.collect::<Vec<_>>()
.join(" -> "),
target.display()
);
}
self.add_symlink_inner(&target, visited)
} else if target_metadata.is_dir() {
Ok(SymlinkTarget::Dir(target))
} else {
Ok(SymlinkTarget::File(target))
}
}
/// Adds the CJS export analysis to the provided file.
///
/// Warning: This will panic if the file wasn't properly
/// setup before calling this.
pub fn add_cjs_export_analysis(&mut self, path: &Path, data: Vec<u8>) {
self.add_data_for_file_or_panic(path, data, |file, offset_with_length| {
file.cjs_export_analysis_offset = Some(offset_with_length);
})
}
fn add_data_for_file_or_panic(
&mut self,
path: &Path,
data: Vec<u8>,
update_file: impl FnOnce(&mut VirtualFile, OffsetWithLength),
) {
let offset_with_length = self.files.add_data(data);
let case_sensitivity = self.case_sensitivity;
let dir = self.get_dir_mut(path.parent().unwrap()).unwrap();
let name = path.file_name().unwrap().to_string_lossy();
let file = dir
.entries
.get_mut_by_name(&name, case_sensitivity)
.unwrap();
match file {
VfsEntry::File(virtual_file) => {
update_file(virtual_file, offset_with_length);
}
VfsEntry::Dir(_) | VfsEntry::Symlink(_) => {
unreachable!()
}
}
}
/// Iterates through all the files in the virtual file system.
pub fn iter_files(
&self,
) -> impl Iterator<Item = (PathBuf, &VirtualFile)> + '_ {
FileIterator {
pending_dirs: VecDeque::from([(
WindowsSystemRootablePath::root_for_current_os(),
&self.executable_root,
)]),
current_dir_index: 0,
}
}
pub fn build(self) -> BuiltVfs {
fn strip_prefix_from_symlinks(
dir: &mut VirtualDirectory,
parts: &[String],
) {
for entry in dir.entries.iter_mut() {
match entry {
VfsEntry::Dir(dir) => {
strip_prefix_from_symlinks(dir, parts);
}
VfsEntry::File(_) => {}
VfsEntry::Symlink(symlink) => {
let parts = symlink
.dest_parts
.take_parts()
.into_iter()
.skip(parts.len())
.collect();
symlink.dest_parts.set_parts(parts);
}
}
}
}
let mut current_dir = self.executable_root;
let mut current_path = WindowsSystemRootablePath::root_for_current_os();
loop {
if current_dir.entries.len() != 1 {
break;
}
if self.min_root_dir.as_ref() == Some(&current_path) {
break;
}
match current_dir.entries.iter().next().unwrap() {
VfsEntry::Dir(dir) => {
if dir.name == DENO_COMPILE_GLOBAL_NODE_MODULES_DIR_NAME {
// special directory we want to maintain
break;
}
match current_dir.entries.remove(0) {
VfsEntry::Dir(dir) => {
current_path =
WindowsSystemRootablePath::Path(current_path.join(&dir.name));
current_dir = dir;
}
_ => unreachable!(),
};
}
VfsEntry::File(_) | VfsEntry::Symlink(_) => break,
}
}
if let WindowsSystemRootablePath::Path(path) = &current_path {
strip_prefix_from_symlinks(
&mut current_dir,
VirtualSymlinkParts::from_path(path).parts(),
);
}
BuiltVfs {
root_path: current_path,
case_sensitivity: self.case_sensitivity,
entries: current_dir.entries,
files: self.files.files,
}
}
}
struct FileIterator<'a> {
pending_dirs: VecDeque<(WindowsSystemRootablePath, &'a VirtualDirectory)>,
current_dir_index: usize,
}
impl<'a> Iterator for FileIterator<'a> {
type Item = (PathBuf, &'a VirtualFile);
fn next(&mut self) -> Option<Self::Item> {
while !self.pending_dirs.is_empty() {
let (dir_path, current_dir) = self.pending_dirs.front()?;
if let Some(entry) =
current_dir.entries.get_by_index(self.current_dir_index)
{
self.current_dir_index += 1;
match entry {
VfsEntry::Dir(virtual_directory) => {
self.pending_dirs.push_back((
WindowsSystemRootablePath::Path(
dir_path.join(&virtual_directory.name),
),
virtual_directory,
));
}
VfsEntry::File(virtual_file) => {
return Some((dir_path.join(&virtual_file.name), virtual_file));
}
VfsEntry::Symlink(_) => {
// ignore
}
}
} else {
self.pending_dirs.pop_front();
self.current_dir_index = 0;
}
}
None
}
}
#[derive(Debug)]
pub enum SymlinkTarget {
File(PathBuf),
Dir(PathBuf),
}
impl SymlinkTarget {
pub fn into_path_buf(self) -> PathBuf {
match self {
Self::File(path) => path,
Self::Dir(path) => path,
}
}
}

cli/lib/sys.rs Normal file
@@ -0,0 +1,37 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use deno_node::ExtNodeSys;
use sys_traits::FsCanonicalize;
use sys_traits::FsCreateDirAll;
use sys_traits::FsMetadata;
use sys_traits::FsOpen;
use sys_traits::FsRead;
use sys_traits::FsReadDir;
use sys_traits::FsRemoveFile;
use sys_traits::FsRename;
use sys_traits::SystemRandom;
use sys_traits::ThreadSleep;
pub trait DenoLibSys:
FsCanonicalize
+ FsCreateDirAll
+ FsReadDir
+ FsMetadata
+ FsOpen
+ FsRemoveFile
+ FsRename
+ FsRead
+ ThreadSleep
+ SystemRandom
+ ExtNodeSys
+ Clone
+ Send
+ Sync
+ std::fmt::Debug
+ 'static
{
}
// ok, implementation
#[allow(clippy::disallowed_types)]
impl DenoLibSys for sys_traits::impls::RealSys {}

@@ -2,44 +2,33 @@
use std::io::Write;
use deno_telemetry::OtelConfig;
use deno_telemetry::OtelConsoleConfig;
use deno_runtime::deno_telemetry;
use deno_runtime::deno_telemetry::OtelConfig;
use deno_runtime::deno_telemetry::OtelConsoleConfig;
use super::draw_thread::DrawThread;
struct CliLogger {
struct CliLogger<FnOnLogStart: Fn(), FnOnLogEnd: Fn()> {
otel_console_config: OtelConsoleConfig,
logger: env_logger::Logger,
on_log_start: FnOnLogStart,
on_log_end: FnOnLogEnd,
}
impl CliLogger {
pub fn new(
logger: env_logger::Logger,
otel_console_config: OtelConsoleConfig,
) -> Self {
Self {
logger,
otel_console_config,
}
}
impl<FnOnLogStart: Fn(), FnOnLogEnd: Fn()> CliLogger<FnOnLogStart, FnOnLogEnd> {
pub fn filter(&self) -> log::LevelFilter {
self.logger.filter()
}
}
impl log::Log for CliLogger {
impl<FnOnLogStart: Fn() + Send + Sync, FnOnLogEnd: Fn() + Send + Sync> log::Log
for CliLogger<FnOnLogStart, FnOnLogEnd>
{
fn enabled(&self, metadata: &log::Metadata) -> bool {
self.logger.enabled(metadata)
}
fn log(&self, record: &log::Record) {
if self.enabled(record.metadata()) {
// it was considered to hold the draw thread's internal lock
// across logging, but if outputting to stderr blocks then that
// could potentially block other threads that access the draw
// thread's state
DrawThread::hide();
(self.on_log_start)();
match self.otel_console_config {
OtelConsoleConfig::Ignore => {
@@ -54,7 +43,7 @@ impl log::Log for CliLogger {
}
}
DrawThread::show();
(self.on_log_end)();
}
}
@@ -63,8 +52,20 @@
}
}
pub fn init(maybe_level: Option<log::Level>, otel_config: Option<OtelConfig>) {
let log_level = maybe_level.unwrap_or(log::Level::Info);
pub struct InitLoggingOptions<FnOnLogStart: Fn(), FnOnLogEnd: Fn()> {
pub on_log_start: FnOnLogStart,
pub on_log_end: FnOnLogEnd,
pub maybe_level: Option<log::Level>,
pub otel_config: Option<OtelConfig>,
}
pub fn init<
FOnLogStart: Fn() + Send + Sync + 'static,
FnOnLogEnd: Fn() + Send + Sync + 'static,
>(
options: InitLoggingOptions<FOnLogStart, FnOnLogEnd>,
) {
let log_level = options.maybe_level.unwrap_or(log::Level::Info);
let logger = env_logger::Builder::from_env(
env_logger::Env::new()
// Use `DENO_LOG` and `DENO_LOG_STYLE` instead of `RUST_` prefix
@@ -117,12 +118,15 @@ pub fn init(maybe_level: Option<log::Level>, otel_config: Option<OtelConfig>) {
})
.build();
let cli_logger = CliLogger::new(
let cli_logger = CliLogger {
on_log_start: options.on_log_start,
on_log_end: options.on_log_end,
logger,
otel_config
otel_console_config: options
.otel_config
.map(|c| c.console)
.unwrap_or(OtelConsoleConfig::Ignore),
);
};
let max_level = cli_logger.filter();
let r = log::set_boxed_logger(Box::new(cli_logger));
if r.is_ok() {

cli/lib/util/mod.rs Normal file
@@ -0,0 +1,8 @@
// Copyright 2018-2025 the Deno authors. MIT license.
pub mod checksum;
pub mod hash;
pub mod logger;
pub mod result;
pub mod text_encoding;
pub mod v8;

cli/lib/util/result.rs Normal file
@@ -0,0 +1,43 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::convert::Infallible;
use std::fmt::Debug;
use std::fmt::Display;
use deno_error::JsErrorBox;
use deno_error::JsErrorClass;
use deno_runtime::deno_core::error::AnyError;
use deno_runtime::deno_core::error::CoreError;
pub trait InfallibleResultExt<T> {
fn unwrap_infallible(self) -> T;
}
impl<T> InfallibleResultExt<T> for Result<T, Infallible> {
fn unwrap_infallible(self) -> T {
match self {
Ok(value) => value,
Err(never) => match never {},
}
}
}
pub fn any_and_jserrorbox_downcast_ref<
E: Display + Debug + Send + Sync + 'static,
>(
err: &AnyError,
) -> Option<&E> {
err
.downcast_ref::<E>()
.or_else(|| {
err
.downcast_ref::<JsErrorBox>()
.and_then(|e| e.as_any().downcast_ref::<E>())
})
.or_else(|| {
err.downcast_ref::<CoreError>().and_then(|e| match e {
CoreError::JsBox(e) => e.as_any().downcast_ref::<E>(),
_ => None,
})
})
}

@@ -0,0 +1,45 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::sync::Arc;
#[inline(always)]
pub fn from_utf8_lossy_owned(bytes: Vec<u8>) -> String {
match String::from_utf8_lossy(&bytes) {
Cow::Owned(code) => code,
// SAFETY: `String::from_utf8_lossy` guarantees that the result is valid
// UTF-8 if `Cow::Borrowed` is returned.
Cow::Borrowed(_) => unsafe { String::from_utf8_unchecked(bytes) },
}
}
#[inline(always)]
pub fn from_utf8_lossy_cow(bytes: Cow<[u8]>) -> Cow<str> {
match bytes {
Cow::Borrowed(bytes) => String::from_utf8_lossy(bytes),
Cow::Owned(bytes) => Cow::Owned(from_utf8_lossy_owned(bytes)),
}
}
/// Converts an `Arc<str>` to an `Arc<[u8]>`.
#[allow(dead_code)]
pub fn arc_str_to_bytes(arc_str: Arc<str>) -> Arc<[u8]> {
let raw = Arc::into_raw(arc_str);
// SAFETY: This is safe because they have the same memory layout.
unsafe { Arc::from_raw(raw as *const [u8]) }
}
/// Converts an `Arc<u8>` to an `Arc<str>` if able.
#[allow(dead_code)]
pub fn arc_u8_to_arc_str(
arc_u8: Arc<[u8]>,
) -> Result<Arc<str>, std::str::Utf8Error> {
// Check that the string is valid UTF-8.
std::str::from_utf8(&arc_u8)?;
// SAFETY: the string is valid UTF-8, and the layout Arc<[u8]> is the same as
// Arc<str>. This is proven by the From<Arc<str>> impl for Arc<[u8]> from the
// standard library.
Ok(unsafe {
std::mem::transmute::<std::sync::Arc<[u8]>, std::sync::Arc<str>>(arc_u8)
})
}
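The `from_utf8_lossy_owned` helper above avoids a re-allocation when the bytes are already valid UTF-8: `String::from_utf8_lossy` only returns `Cow::Borrowed` in that case, so the original `Vec<u8>` can be reused unchecked. Reproduced standalone to show both branches:

```rust
// Behavior sketch of `from_utf8_lossy_owned` above: valid UTF-8 reuses the
// original allocation (the unchecked branch), while invalid bytes take the
// Cow::Owned branch with U+FFFD replacement characters substituted.
use std::borrow::Cow;

fn from_utf8_lossy_owned(bytes: Vec<u8>) -> String {
  match String::from_utf8_lossy(&bytes) {
    Cow::Owned(code) => code,
    // SAFETY: `from_utf8_lossy` returning Borrowed means the bytes are
    // valid UTF-8, so reusing them without a copy is sound.
    Cow::Borrowed(_) => unsafe { String::from_utf8_unchecked(bytes) },
  }
}

fn main() {
  assert_eq!(from_utf8_lossy_owned(b"hello".to_vec()), "hello");
  assert_eq!(from_utf8_lossy_owned(vec![0x68, 0xff]), "h\u{fffd}");
}
```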

cli/lib/util/v8.rs Normal file
@@ -0,0 +1,14 @@
// Copyright 2018-2025 the Deno authors. MIT license.
#[inline(always)]
pub fn construct_v8_flags(
default_v8_flags: &[String],
v8_flags: &[String],
env_v8_flags: Vec<String>,
) -> Vec<String> {
std::iter::once("UNUSED_BUT_NECESSARY_ARG0".to_owned())
.chain(default_v8_flags.iter().cloned())
.chain(env_v8_flags)
.chain(v8_flags.iter().cloned())
.collect::<Vec<_>>()
}
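`construct_v8_flags` above is pure stdlib, so it can be exercised as-is: the dummy argv[0] that V8's flag parser expects comes first, then the defaults, then env-provided flags, then CLI flags, so later sources take precedence.

```rust
// `construct_v8_flags` from above, exercised standalone to show the
// ordering: argv0 sentinel, defaults, env flags, then CLI flags last.
fn construct_v8_flags(
  default_v8_flags: &[String],
  v8_flags: &[String],
  env_v8_flags: Vec<String>,
) -> Vec<String> {
  std::iter::once("UNUSED_BUT_NECESSARY_ARG0".to_owned())
    .chain(default_v8_flags.iter().cloned())
    .chain(env_v8_flags)
    .chain(v8_flags.iter().cloned())
    .collect()
}

fn main() {
  let flags = construct_v8_flags(
    &["--stack-size=1024".to_string()],
    &["--max-old-space-size=4096".to_string()],
    vec!["--expose-gc".to_string()],
  );
  assert_eq!(flags[0], "UNUSED_BUT_NECESSARY_ARG0");
  assert_eq!(flags.len(), 4);
  assert_eq!(flags.last().unwrap(), "--max-old-space-size=4096");
}
```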

cli/lib/version.rs Normal file
@@ -0,0 +1,94 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use deno_runtime::deno_telemetry::OtelRuntimeConfig;
use crate::shared::ReleaseChannel;
pub fn otel_runtime_config() -> OtelRuntimeConfig {
OtelRuntimeConfig {
runtime_name: Cow::Borrowed("deno"),
runtime_version: Cow::Borrowed(crate::version::DENO_VERSION_INFO.deno),
}
}
const GIT_COMMIT_HASH: &str = env!("GIT_COMMIT_HASH");
const TYPESCRIPT: &str = "5.6.2";
const DENO_VERSION: &str = env!("DENO_VERSION");
// TODO(bartlomieju): ideally we could remove this const.
const IS_CANARY: bool = option_env!("DENO_CANARY").is_some();
// TODO(bartlomieju): this is temporary, to allow Homebrew to cut RC releases as well
const IS_RC: bool = option_env!("DENO_RC").is_some();
pub static DENO_VERSION_INFO: std::sync::LazyLock<DenoVersionInfo> =
std::sync::LazyLock::new(|| {
let release_channel = libsui::find_section("denover")
.and_then(|buf| std::str::from_utf8(buf).ok())
.and_then(|str_| ReleaseChannel::deserialize(str_).ok())
.unwrap_or({
if IS_CANARY {
ReleaseChannel::Canary
} else if IS_RC {
ReleaseChannel::Rc
} else {
ReleaseChannel::Stable
}
});
DenoVersionInfo {
deno: if release_channel == ReleaseChannel::Canary {
concat!(env!("DENO_VERSION"), "+", env!("GIT_COMMIT_HASH_SHORT"))
} else {
env!("DENO_VERSION")
},
release_channel,
git_hash: GIT_COMMIT_HASH,
// Keep in sync with `deno` field.
user_agent: if release_channel == ReleaseChannel::Canary {
concat!(
"Deno/",
env!("DENO_VERSION"),
"+",
env!("GIT_COMMIT_HASH_SHORT")
)
} else {
concat!("Deno/", env!("DENO_VERSION"))
},
typescript: TYPESCRIPT,
}
});
pub struct DenoVersionInfo {
/// Human-readable version of the current Deno binary.
///
/// For a stable release, a semver, e.g. `v1.46.2`.
/// For a canary release, a semver + 7-char git hash, e.g. `v1.46.3+asdfqwq`.
pub deno: &'static str,
pub release_channel: ReleaseChannel,
/// A full git hash.
pub git_hash: &'static str,
/// A user-agent header that will be used in HTTP client.
pub user_agent: &'static str,
pub typescript: &'static str,
}
impl DenoVersionInfo {
/// For a stable release, a semver, e.g. `v1.46.2`.
/// For a canary release, a full git hash, e.g. `9bdab6fb6b93eb43b1930f40987fa4997287f9c8`.
pub fn version_or_git_hash(&self) -> &'static str {
if self.release_channel == ReleaseChannel::Canary {
self.git_hash
} else {
DENO_VERSION
}
}
}

cli/lib/version.txt Normal file
@@ -0,0 +1 @@
2.1.6

cli/lib/worker.rs Normal file
@@ -0,0 +1,716 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::path::Path;
use std::path::PathBuf;
use std::rc::Rc;
use std::sync::Arc;
use deno_core::error::JsError;
use deno_node::NodeRequireLoaderRc;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_resolver::npm::NpmResolver;
use deno_runtime::colors;
use deno_runtime::deno_broadcast_channel::InMemoryBroadcastChannel;
use deno_runtime::deno_core;
use deno_runtime::deno_core::error::CoreError;
use deno_runtime::deno_core::v8;
use deno_runtime::deno_core::CompiledWasmModuleStore;
use deno_runtime::deno_core::Extension;
use deno_runtime::deno_core::FeatureChecker;
use deno_runtime::deno_core::JsRuntime;
use deno_runtime::deno_core::LocalInspectorSession;
use deno_runtime::deno_core::ModuleLoader;
use deno_runtime::deno_core::SharedArrayBufferStore;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::NodeExtInitServices;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_node::NodeResolver;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_process::NpmProcessStateProviderRc;
use deno_runtime::deno_telemetry::OtelConfig;
use deno_runtime::deno_tls::RootCertStoreProvider;
use deno_runtime::deno_web::BlobStore;
use deno_runtime::fmt_errors::format_js_error;
use deno_runtime::inspector_server::InspectorServer;
use deno_runtime::ops::worker_host::CreateWebWorkerCb;
use deno_runtime::web_worker::WebWorker;
use deno_runtime::web_worker::WebWorkerOptions;
use deno_runtime::web_worker::WebWorkerServiceOptions;
use deno_runtime::worker::MainWorker;
use deno_runtime::worker::WorkerOptions;
use deno_runtime::worker::WorkerServiceOptions;
use deno_runtime::BootstrapOptions;
use deno_runtime::WorkerExecutionMode;
use deno_runtime::WorkerLogLevel;
use deno_runtime::UNSTABLE_GRANULAR_FLAGS;
use node_resolver::errors::ResolvePkgJsonBinExportError;
use url::Url;
use crate::args::has_trace_permissions_enabled;
use crate::sys::DenoLibSys;
use crate::util::checksum;
pub struct CreateModuleLoaderResult {
pub module_loader: Rc<dyn ModuleLoader>,
pub node_require_loader: Rc<dyn NodeRequireLoader>,
}
pub trait ModuleLoaderFactory: Send + Sync {
fn create_for_main(
&self,
root_permissions: PermissionsContainer,
) -> CreateModuleLoaderResult;
fn create_for_worker(
&self,
parent_permissions: PermissionsContainer,
permissions: PermissionsContainer,
) -> CreateModuleLoaderResult;
}
enum StorageKeyResolverStrategy {
Specified(Option<String>),
UseMainModule,
}
pub struct StorageKeyResolver(StorageKeyResolverStrategy);
impl StorageKeyResolver {
pub fn from_flag(location: &Url) -> Self {
// if a location is set, then the ascii serialization of the location is
// used, unless the origin is opaque, in which case no storage origin is
// set, as we can't expect the origin to be reproducible
let storage_origin = location.origin();
Self(StorageKeyResolverStrategy::Specified(
if storage_origin.is_tuple() {
Some(storage_origin.ascii_serialization())
} else {
None
},
))
}
pub fn from_config_file_url(url: &Url) -> Self {
Self(StorageKeyResolverStrategy::Specified(Some(url.to_string())))
}
pub fn new_use_main_module() -> Self {
Self(StorageKeyResolverStrategy::UseMainModule)
}
/// Creates a storage key resolver that will always resolve to being empty.
pub fn empty() -> Self {
Self(StorageKeyResolverStrategy::Specified(None))
}
/// Resolves the storage key to use based on the current flags, config, or main module.
pub fn resolve_storage_key(&self, main_module: &Url) -> Option<String> {
// use the stored value or fall back to using the path of the main module.
match &self.0 {
StorageKeyResolverStrategy::Specified(value) => value.clone(),
StorageKeyResolverStrategy::UseMainModule => {
Some(main_module.to_string())
}
}
}
}
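The storage key strategy above can be sketched with plain strings in place of `Url`: a specified key (possibly `None`, for opaque origins) always wins, and only the `UseMainModule` strategy falls back to the main module's URL.

```rust
// Sketch of `StorageKeyResolver::resolve_storage_key` above, with &str
// standing in for `Url` so the example stays stdlib-only.
enum StorageKeyResolverStrategy {
  Specified(Option<String>),
  UseMainModule,
}

struct StorageKeyResolver(StorageKeyResolverStrategy);

impl StorageKeyResolver {
  fn resolve_storage_key(&self, main_module: &str) -> Option<String> {
    match &self.0 {
      StorageKeyResolverStrategy::Specified(value) => value.clone(),
      StorageKeyResolverStrategy::UseMainModule => {
        Some(main_module.to_string())
      }
    }
  }
}

fn main() {
  let main = "file:///app/main.ts";
  let from_location = StorageKeyResolver(
    StorageKeyResolverStrategy::Specified(Some("https://example.com".to_string())),
  );
  let empty = StorageKeyResolver(StorageKeyResolverStrategy::Specified(None));
  let fallback = StorageKeyResolver(StorageKeyResolverStrategy::UseMainModule);
  assert_eq!(
    from_location.resolve_storage_key(main),
    Some("https://example.com".to_string())
  );
  assert_eq!(empty.resolve_storage_key(main), None);
  assert_eq!(fallback.resolve_storage_key(main), Some(main.to_string()));
}
```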
pub fn get_cache_storage_dir() -> PathBuf {
// ok because this won't ever be used by the js runtime
#[allow(clippy::disallowed_methods)]
// Note: we currently use temp_dir() to avoid managing storage size.
std::env::temp_dir().join("deno_cache")
}
/// By default V8 uses a 1.4Gb heap limit, which is meant for browser tabs.
/// Instead, probe for the total memory on the system and use that as the
/// default.
pub fn create_isolate_create_params() -> Option<v8::CreateParams> {
let maybe_mem_info = deno_runtime::deno_os::sys_info::mem_info();
maybe_mem_info.map(|mem_info| {
v8::CreateParams::default()
.heap_limits_from_system_memory(mem_info.total, 0)
})
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum ResolveNpmBinaryEntrypointError {
#[class(inherit)]
#[error(transparent)]
ResolvePkgJsonBinExport(ResolvePkgJsonBinExportError),
#[class(generic)]
#[error("{original:#}\n\nFallback failed: {fallback:#}")]
Fallback {
fallback: ResolveNpmBinaryEntrypointFallbackError,
original: ResolvePkgJsonBinExportError,
},
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum ResolveNpmBinaryEntrypointFallbackError {
#[class(inherit)]
#[error(transparent)]
PackageSubpathResolve(node_resolver::errors::PackageSubpathResolveError),
#[class(generic)]
#[error("Cannot find module '{0}'")]
ModuleNotFound(Url),
}
pub struct LibMainWorkerOptions {
pub argv: Vec<String>,
pub log_level: WorkerLogLevel,
pub enable_op_summary_metrics: bool,
pub enable_testing_features: bool,
pub has_node_modules_dir: bool,
pub inspect_brk: bool,
pub inspect_wait: bool,
pub strace_ops: Option<Vec<String>>,
pub is_inspecting: bool,
pub location: Option<Url>,
pub argv0: Option<String>,
pub node_debug: Option<String>,
pub otel_config: OtelConfig,
pub origin_data_folder_path: Option<PathBuf>,
pub seed: Option<u64>,
pub unsafely_ignore_certificate_errors: Option<Vec<String>>,
pub skip_op_registration: bool,
pub node_ipc: Option<i64>,
pub startup_snapshot: Option<&'static [u8]>,
pub serve_port: Option<u16>,
pub serve_host: Option<String>,
}
struct LibWorkerFactorySharedState<TSys: DenoLibSys> {
blob_store: Arc<BlobStore>,
broadcast_channel: InMemoryBroadcastChannel,
code_cache: Option<Arc<dyn deno_runtime::code_cache::CodeCache>>,
compiled_wasm_module_store: CompiledWasmModuleStore,
feature_checker: Arc<FeatureChecker>,
fs: Arc<dyn deno_fs::FileSystem>,
maybe_inspector_server: Option<Arc<InspectorServer>>,
module_loader_factory: Box<dyn ModuleLoaderFactory>,
node_resolver:
Arc<NodeResolver<DenoInNpmPackageChecker, NpmResolver<TSys>, TSys>>,
npm_process_state_provider: NpmProcessStateProviderRc,
pkg_json_resolver: Arc<node_resolver::PackageJsonResolver<TSys>>,
root_cert_store_provider: Arc<dyn RootCertStoreProvider>,
shared_array_buffer_store: SharedArrayBufferStore,
storage_key_resolver: StorageKeyResolver,
sys: TSys,
options: LibMainWorkerOptions,
}
impl<TSys: DenoLibSys> LibWorkerFactorySharedState<TSys> {
fn resolve_unstable_features(
&self,
feature_checker: &FeatureChecker,
) -> Vec<i32> {
let mut unstable_features =
Vec::with_capacity(UNSTABLE_GRANULAR_FLAGS.len());
for granular_flag in UNSTABLE_GRANULAR_FLAGS {
if feature_checker.check(granular_flag.name) {
unstable_features.push(granular_flag.id);
}
}
unstable_features
}
fn create_node_init_services(
&self,
node_require_loader: NodeRequireLoaderRc,
) -> NodeExtInitServices<DenoInNpmPackageChecker, NpmResolver<TSys>, TSys> {
NodeExtInitServices {
node_require_loader,
node_resolver: self.node_resolver.clone(),
pkg_json_resolver: self.pkg_json_resolver.clone(),
sys: self.sys.clone(),
}
}
fn create_web_worker_callback(
self: &Arc<Self>,
stdio: deno_runtime::deno_io::Stdio,
) -> Arc<CreateWebWorkerCb> {
let shared = self.clone();
Arc::new(move |args| {
let maybe_inspector_server = shared.maybe_inspector_server.clone();
let CreateModuleLoaderResult {
module_loader,
node_require_loader,
} = shared.module_loader_factory.create_for_worker(
args.parent_permissions.clone(),
args.permissions.clone(),
);
let create_web_worker_cb =
shared.create_web_worker_callback(stdio.clone());
let maybe_storage_key = shared
.storage_key_resolver
.resolve_storage_key(&args.main_module);
let cache_storage_dir = maybe_storage_key.map(|key| {
// TODO(@satyarohith): storage quota management
get_cache_storage_dir().join(checksum::gen(&[key.as_bytes()]))
});
// TODO(bartlomieju): this is cruft, update FeatureChecker to spit out
// list of enabled features.
let feature_checker = shared.feature_checker.clone();
let unstable_features =
shared.resolve_unstable_features(feature_checker.as_ref());
let services = WebWorkerServiceOptions {
root_cert_store_provider: Some(shared.root_cert_store_provider.clone()),
module_loader,
fs: shared.fs.clone(),
node_services: Some(
shared.create_node_init_services(node_require_loader),
),
blob_store: shared.blob_store.clone(),
broadcast_channel: shared.broadcast_channel.clone(),
shared_array_buffer_store: Some(
shared.shared_array_buffer_store.clone(),
),
compiled_wasm_module_store: Some(
shared.compiled_wasm_module_store.clone(),
),
maybe_inspector_server,
feature_checker,
npm_process_state_provider: Some(
shared.npm_process_state_provider.clone(),
),
permissions: args.permissions,
};
let options = WebWorkerOptions {
name: args.name,
main_module: args.main_module.clone(),
worker_id: args.worker_id,
bootstrap: BootstrapOptions {
deno_version: crate::version::DENO_VERSION_INFO.deno.to_string(),
args: shared.options.argv.clone(),
cpu_count: std::thread::available_parallelism()
.map(|p| p.get())
.unwrap_or(1),
log_level: shared.options.log_level,
enable_op_summary_metrics: shared.options.enable_op_summary_metrics,
enable_testing_features: shared.options.enable_testing_features,
locale: deno_core::v8::icu::get_language_tag(),
location: Some(args.main_module),
no_color: !colors::use_color(),
color_level: colors::get_color_level(),
is_stdout_tty: deno_terminal::is_stdout_tty(),
is_stderr_tty: deno_terminal::is_stderr_tty(),
unstable_features,
user_agent: crate::version::DENO_VERSION_INFO.user_agent.to_string(),
inspect: shared.options.is_inspecting,
has_node_modules_dir: shared.options.has_node_modules_dir,
argv0: shared.options.argv0.clone(),
node_debug: shared.options.node_debug.clone(),
node_ipc_fd: None,
mode: WorkerExecutionMode::Worker,
serve_port: shared.options.serve_port,
serve_host: shared.options.serve_host.clone(),
otel_config: shared.options.otel_config.clone(),
close_on_idle: args.close_on_idle,
},
extensions: vec![],
startup_snapshot: shared.options.startup_snapshot,
create_params: create_isolate_create_params(),
unsafely_ignore_certificate_errors: shared
.options
.unsafely_ignore_certificate_errors
.clone(),
seed: shared.options.seed,
create_web_worker_cb,
format_js_error_fn: Some(Arc::new(format_js_error)),
worker_type: args.worker_type,
stdio: stdio.clone(),
cache_storage_dir,
strace_ops: shared.options.strace_ops.clone(),
close_on_idle: args.close_on_idle,
maybe_worker_metadata: args.maybe_worker_metadata,
enable_stack_trace_arg_in_ops: has_trace_permissions_enabled(),
};
WebWorker::bootstrap_from_options(services, options)
})
}
}
pub struct LibMainWorkerFactory<TSys: DenoLibSys> {
shared: Arc<LibWorkerFactorySharedState<TSys>>,
}
impl<TSys: DenoLibSys> LibMainWorkerFactory<TSys> {
#[allow(clippy::too_many_arguments)]
pub fn new(
blob_store: Arc<BlobStore>,
code_cache: Option<Arc<dyn deno_runtime::code_cache::CodeCache>>,
feature_checker: Arc<FeatureChecker>,
fs: Arc<dyn deno_fs::FileSystem>,
maybe_inspector_server: Option<Arc<InspectorServer>>,
module_loader_factory: Box<dyn ModuleLoaderFactory>,
node_resolver: Arc<
NodeResolver<DenoInNpmPackageChecker, NpmResolver<TSys>, TSys>,
>,
npm_process_state_provider: NpmProcessStateProviderRc,
pkg_json_resolver: Arc<node_resolver::PackageJsonResolver<TSys>>,
root_cert_store_provider: Arc<dyn RootCertStoreProvider>,
storage_key_resolver: StorageKeyResolver,
sys: TSys,
options: LibMainWorkerOptions,
) -> Self {
Self {
shared: Arc::new(LibWorkerFactorySharedState {
blob_store,
broadcast_channel: Default::default(),
code_cache,
compiled_wasm_module_store: Default::default(),
feature_checker,
fs,
maybe_inspector_server,
module_loader_factory,
node_resolver,
npm_process_state_provider,
pkg_json_resolver,
root_cert_store_provider,
shared_array_buffer_store: Default::default(),
storage_key_resolver,
sys,
options,
}),
}
}
pub fn create_main_worker(
&self,
mode: WorkerExecutionMode,
permissions: PermissionsContainer,
main_module: Url,
) -> Result<LibMainWorker, CoreError> {
self.create_custom_worker(
mode,
main_module,
permissions,
vec![],
Default::default(),
)
}
pub fn create_custom_worker(
&self,
mode: WorkerExecutionMode,
main_module: Url,
permissions: PermissionsContainer,
custom_extensions: Vec<Extension>,
stdio: deno_runtime::deno_io::Stdio,
) -> Result<LibMainWorker, CoreError> {
let shared = &self.shared;
let CreateModuleLoaderResult {
module_loader,
node_require_loader,
} = shared
.module_loader_factory
.create_for_main(permissions.clone());
// TODO(bartlomieju): this is cruft, update FeatureChecker to spit out
// list of enabled features.
let feature_checker = shared.feature_checker.clone();
let unstable_features =
shared.resolve_unstable_features(feature_checker.as_ref());
let maybe_storage_key = shared
.storage_key_resolver
.resolve_storage_key(&main_module);
let origin_storage_dir = maybe_storage_key.as_ref().map(|key| {
shared
.options
.origin_data_folder_path
.as_ref()
.unwrap() // must be set if storage key resolver returns a value
.join(checksum::gen(&[key.as_bytes()]))
});
let cache_storage_dir = maybe_storage_key.map(|key| {
// TODO(@satyarohith): storage quota management
get_cache_storage_dir().join(checksum::gen(&[key.as_bytes()]))
});
let services = WorkerServiceOptions {
root_cert_store_provider: Some(shared.root_cert_store_provider.clone()),
module_loader,
fs: shared.fs.clone(),
node_services: Some(
shared.create_node_init_services(node_require_loader),
),
npm_process_state_provider: Some(
shared.npm_process_state_provider.clone(),
),
blob_store: shared.blob_store.clone(),
broadcast_channel: shared.broadcast_channel.clone(),
fetch_dns_resolver: Default::default(),
shared_array_buffer_store: Some(shared.shared_array_buffer_store.clone()),
compiled_wasm_module_store: Some(
shared.compiled_wasm_module_store.clone(),
),
feature_checker,
permissions,
v8_code_cache: shared.code_cache.clone(),
};
let options = WorkerOptions {
bootstrap: BootstrapOptions {
deno_version: crate::version::DENO_VERSION_INFO.deno.to_string(),
args: shared.options.argv.clone(),
cpu_count: std::thread::available_parallelism()
.map(|p| p.get())
.unwrap_or(1),
log_level: shared.options.log_level,
enable_op_summary_metrics: shared.options.enable_op_summary_metrics,
enable_testing_features: shared.options.enable_testing_features,
locale: deno_core::v8::icu::get_language_tag(),
location: shared.options.location.clone(),
no_color: !colors::use_color(),
is_stdout_tty: deno_terminal::is_stdout_tty(),
is_stderr_tty: deno_terminal::is_stderr_tty(),
color_level: colors::get_color_level(),
unstable_features,
user_agent: crate::version::DENO_VERSION_INFO.user_agent.to_string(),
inspect: shared.options.is_inspecting,
has_node_modules_dir: shared.options.has_node_modules_dir,
argv0: shared.options.argv0.clone(),
node_debug: shared.options.node_debug.clone(),
node_ipc_fd: shared.options.node_ipc,
mode,
serve_port: shared.options.serve_port,
serve_host: shared.options.serve_host.clone(),
otel_config: shared.options.otel_config.clone(),
close_on_idle: true,
},
extensions: custom_extensions,
startup_snapshot: shared.options.startup_snapshot,
create_params: create_isolate_create_params(),
unsafely_ignore_certificate_errors: shared
.options
.unsafely_ignore_certificate_errors
.clone(),
seed: shared.options.seed,
format_js_error_fn: Some(Arc::new(format_js_error)),
create_web_worker_cb: shared.create_web_worker_callback(stdio.clone()),
maybe_inspector_server: shared.maybe_inspector_server.clone(),
should_break_on_first_statement: shared.options.inspect_brk,
should_wait_for_inspector_session: shared.options.inspect_wait,
strace_ops: shared.options.strace_ops.clone(),
cache_storage_dir,
origin_storage_dir,
stdio,
skip_op_registration: shared.options.skip_op_registration,
enable_stack_trace_arg_in_ops: has_trace_permissions_enabled(),
};
let worker =
MainWorker::bootstrap_from_options(&main_module, services, options);
Ok(LibMainWorker {
main_module,
worker,
})
}
pub fn resolve_npm_binary_entrypoint(
&self,
package_folder: &Path,
sub_path: Option<&str>,
) -> Result<Url, ResolveNpmBinaryEntrypointError> {
match self
.shared
.node_resolver
.resolve_binary_export(package_folder, sub_path)
{
Ok(specifier) => Ok(specifier),
Err(original_err) => {
// if the binary entrypoint was not found, fall back to regular node resolution
let result =
self.resolve_binary_entrypoint_fallback(package_folder, sub_path);
match result {
Ok(Some(specifier)) => Ok(specifier),
Ok(None) => {
Err(ResolveNpmBinaryEntrypointError::ResolvePkgJsonBinExport(
original_err,
))
}
Err(fallback_err) => Err(ResolveNpmBinaryEntrypointError::Fallback {
original: original_err,
fallback: fallback_err,
}),
}
}
}
}
/// Resolves the binary entrypoint using regular node resolution.
fn resolve_binary_entrypoint_fallback(
&self,
package_folder: &Path,
sub_path: Option<&str>,
) -> Result<Option<Url>, ResolveNpmBinaryEntrypointFallbackError> {
// only fallback if the user specified a sub path
if sub_path.is_none() {
// it's confusing to users if the package doesn't have any binary
// entrypoint and we just execute the main script which will likely
// have blank output, so do not resolve the entrypoint in this case
return Ok(None);
}
let specifier = self
.shared
.node_resolver
.resolve_package_subpath_from_deno_module(
package_folder,
sub_path,
/* referrer */ None,
node_resolver::ResolutionMode::Import,
node_resolver::NodeResolutionKind::Execution,
)
.map_err(
ResolveNpmBinaryEntrypointFallbackError::PackageSubpathResolve,
)?;
if deno_path_util::url_to_file_path(&specifier)
.map(|p| self.shared.sys.fs_exists_no_err(p))
.unwrap_or(false)
{
Ok(Some(specifier))
} else {
Err(ResolveNpmBinaryEntrypointFallbackError::ModuleNotFound(
specifier,
))
}
}
}
pub struct LibMainWorker {
main_module: Url,
worker: MainWorker,
}
impl LibMainWorker {
pub fn into_main_worker(self) -> MainWorker {
self.worker
}
pub fn main_module(&self) -> &Url {
&self.main_module
}
pub fn js_runtime(&mut self) -> &mut JsRuntime {
&mut self.worker.js_runtime
}
#[inline]
pub fn create_inspector_session(&mut self) -> LocalInspectorSession {
self.worker.create_inspector_session()
}
#[inline]
pub fn dispatch_load_event(&mut self) -> Result<(), JsError> {
self.worker.dispatch_load_event()
}
#[inline]
pub fn dispatch_beforeunload_event(&mut self) -> Result<bool, JsError> {
self.worker.dispatch_beforeunload_event()
}
#[inline]
pub fn dispatch_process_beforeexit_event(&mut self) -> Result<bool, JsError> {
self.worker.dispatch_process_beforeexit_event()
}
#[inline]
pub fn dispatch_unload_event(&mut self) -> Result<(), JsError> {
self.worker.dispatch_unload_event()
}
#[inline]
pub fn dispatch_process_exit_event(&mut self) -> Result<(), JsError> {
self.worker.dispatch_process_exit_event()
}
pub async fn execute_main_module(&mut self) -> Result<(), CoreError> {
let id = self.worker.preload_main_module(&self.main_module).await?;
self.worker.evaluate_module(id).await
}
pub async fn execute_side_module(&mut self) -> Result<(), CoreError> {
let id = self.worker.preload_side_module(&self.main_module).await?;
self.worker.evaluate_module(id).await
}
pub async fn run(&mut self) -> Result<i32, CoreError> {
log::debug!("main_module {}", self.main_module);
self.execute_main_module().await?;
self.worker.dispatch_load_event()?;
loop {
self
.worker
.run_event_loop(/* wait for inspector */ false)
.await?;
let web_continue = self.worker.dispatch_beforeunload_event()?;
if !web_continue {
let node_continue = self.worker.dispatch_process_beforeexit_event()?;
if !node_continue {
break;
}
}
}
self.worker.dispatch_unload_event()?;
self.worker.dispatch_process_exit_event()?;
Ok(self.worker.exit_code())
}
#[inline]
pub async fn run_event_loop(
&mut self,
wait_for_inspector: bool,
) -> Result<(), CoreError> {
self.worker.run_event_loop(wait_for_inspector).await
}
#[inline]
pub fn exit_code(&self) -> i32 {
self.worker.exit_code()
}
}
#[cfg(test)]
mod test {
use super::*;
#[test]
fn storage_key_resolver_test() {
let resolver =
StorageKeyResolver(StorageKeyResolverStrategy::UseMainModule);
let specifier = Url::parse("file:///a.ts").unwrap();
assert_eq!(
resolver.resolve_storage_key(&specifier),
Some(specifier.to_string())
);
let resolver =
StorageKeyResolver(StorageKeyResolverStrategy::Specified(None));
assert_eq!(resolver.resolve_storage_key(&specifier), None);
let resolver = StorageKeyResolver(StorageKeyResolverStrategy::Specified(
Some("value".to_string()),
));
assert_eq!(
resolver.resolve_storage_key(&specifier),
Some("value".to_string())
);
// test empty
let resolver = StorageKeyResolver::empty();
assert_eq!(resolver.resolve_storage_key(&specifier), None);
}
}


@@ -10,15 +10,16 @@ use deno_ast::SourceRange;
use deno_ast::SourceRangedForSpanned;
use deno_ast::SourceTextInfo;
use deno_config::workspace::MappedResolution;
use deno_core::error::custom_error;
use deno_core::error::AnyError;
use deno_core::serde::Deserialize;
use deno_core::serde::Serialize;
use deno_core::serde_json;
use deno_core::serde_json::json;
use deno_core::ModuleSpecifier;
use deno_error::JsErrorBox;
use deno_lint::diagnostic::LintDiagnosticRange;
use deno_path_util::url_to_file_path;
use deno_resolver::npm::managed::NpmResolutionCell;
use deno_runtime::deno_node::PathClean;
use deno_semver::jsr::JsrPackageNvReference;
use deno_semver::jsr::JsrPackageReqReference;
@@ -31,6 +32,7 @@ use deno_semver::SmallStackString;
use deno_semver::StackString;
use deno_semver::Version;
use import_map::ImportMap;
use node_resolver::InNpmPackageChecker;
use node_resolver::NodeResolutionKind;
use node_resolver::ResolutionMode;
use once_cell::sync::Lazy;
@@ -365,7 +367,9 @@ impl<'a> TsResponseImportMapper<'a> {
if let Ok(Some(pkg_id)) =
npm_resolver.resolve_pkg_id_from_specifier(specifier)
{
let pkg_reqs = npm_resolver.resolve_pkg_reqs_from_pkg_id(&pkg_id);
let pkg_reqs = npm_resolver
.resolution()
.resolve_pkg_reqs_from_pkg_id(&pkg_id);
// check if any pkg reqs match what is found in an import map
if !pkg_reqs.is_empty() {
let sub_path = npm_resolver
@@ -1070,10 +1074,13 @@ impl CodeActionCollection {
// we wrap tsc, we can't handle the asynchronous response, so it is
// actually easier to return errors if we ever encounter one of these,
// which we really wouldn't expect from the Deno lsp.
return Err(custom_error(
return Err(
JsErrorBox::new(
"UnsupportedFix",
"The action returned from TypeScript is unsupported.",
));
)
.into(),
);
}
let Some(action) =
fix_ts_import_action(specifier, resolution_mode, action, language_server)
@@ -1292,6 +1299,19 @@ impl CodeActionCollection {
range: &lsp::Range,
language_server: &language_server::Inner,
) -> Option<lsp::CodeAction> {
fn top_package_req_for_name(
resolution: &NpmResolutionCell,
name: &str,
) -> Option<PackageReq> {
let package_reqs = resolution.package_reqs();
let mut entries = package_reqs
.into_iter()
.filter(|(_, nv)| nv.name == name)
.collect::<Vec<_>>();
entries.sort_by(|a, b| a.1.version.cmp(&b.1.version));
Some(entries.pop()?.0)
}
let (dep_key, dependency, _) =
document.get_maybe_dependency(&range.end)?;
if dependency.maybe_deno_types_specifier.is_some() {
@@ -1379,9 +1399,10 @@ impl CodeActionCollection {
.and_then(|versions| versions.first().cloned())?;
let types_specifier_text =
if let Some(npm_resolver) = managed_npm_resolver {
let mut specifier_text = if let Some(req) =
npm_resolver.top_package_req_for_name(&types_package_name)
{
let mut specifier_text = if let Some(req) = top_package_req_for_name(
npm_resolver.resolution(),
&types_package_name,
) {
format!("npm:{req}")
} else {
format!("npm:{}@^{}", &types_package_name, types_package_version)


@@ -41,10 +41,13 @@ use deno_core::serde_json::json;
use deno_core::serde_json::Value;
use deno_core::url::Url;
use deno_core::ModuleSpecifier;
use deno_lib::args::has_flag_env_var;
use deno_lib::util::hash::FastInsecureHasher;
use deno_lint::linter::LintConfig as DenoLintConfig;
use deno_npm::npm_rc::ResolvedNpmRc;
use deno_package_json::PackageJsonCache;
use deno_path_util::url_to_file_path;
use deno_resolver::sloppy_imports::SloppyImportsCachedFs;
use deno_runtime::deno_node::PackageJson;
use indexmap::IndexSet;
use lsp_types::ClientCapabilities;
@@ -54,17 +57,14 @@ use super::logging::lsp_log;
use super::lsp_custom;
use super::urls::url_to_uri;
use crate::args::discover_npmrc_from_workspace;
use crate::args::has_flag_env_var;
use crate::args::CliLockfile;
use crate::args::CliLockfileReadFromPathOptions;
use crate::args::ConfigFile;
use crate::args::LintFlags;
use crate::args::LintOptions;
use crate::cache::FastInsecureHasher;
use crate::file_fetcher::CliFileFetcher;
use crate::lsp::logging::lsp_warn;
use crate::resolver::CliSloppyImportsResolver;
use crate::resolver::SloppyImportsCachedFs;
use crate::sys::CliSys;
use crate::tools::lint::CliLinter;
use crate::tools::lint::CliLinterOptions;
@@ -853,7 +853,8 @@ impl Settings {
Some(false)
} else if let Some(enable_paths) = &enable_paths {
for enable_path in enable_paths {
if path.starts_with(enable_path) {
// Also enable if the checked path is a dir containing an enabled path.
if path.starts_with(enable_path) || enable_path.starts_with(&path) {
return Some(true);
}
}
@@ -1245,7 +1246,6 @@ impl ConfigData {
pkg_json_cache: Some(pkg_json_cache),
workspace_cache: Some(workspace_cache),
discover_pkg_json: !has_flag_env_var("DENO_NO_PACKAGE_JSON"),
config_parse_options: Default::default(),
maybe_vendor_override: None,
},
)
@@ -1572,11 +1572,11 @@ impl ConfigData {
let resolver = member_dir
.workspace
.create_resolver(
&CliSys::default(),
CreateResolverOptions {
pkg_json_dep_resolution,
specified_import_map,
},
|path| Ok(std::fs::read_to_string(path)?),
)
.inspect_err(|err| {
lsp_warn!(
@@ -2077,7 +2077,6 @@ impl deno_config::workspace::WorkspaceCache for WorkspaceMemCache {
#[cfg(test)]
mod tests {
use deno_config::deno_json::ConfigParseOptions;
use deno_core::resolve_url;
use deno_core::serde_json;
use deno_core::serde_json::json;
@@ -2351,12 +2350,7 @@ mod tests {
config
.tree
.inject_config_file(
ConfigFile::new(
"{}",
root_uri.join("deno.json").unwrap(),
&ConfigParseOptions::default(),
)
.unwrap(),
ConfigFile::new("{}", root_uri.join("deno.json").unwrap()).unwrap(),
)
.await;
assert!(config.specifier_enabled(&root_uri));
@@ -2412,7 +2406,6 @@ mod tests {
})
.to_string(),
root_uri.join("deno.json").unwrap(),
&ConfigParseOptions::default(),
)
.unwrap(),
)
@@ -2438,7 +2431,6 @@ mod tests {
})
.to_string(),
root_uri.join("deno.json").unwrap(),
&ConfigParseOptions::default(),
)
.unwrap(),
)
@@ -2456,7 +2448,6 @@ mod tests {
})
.to_string(),
root_uri.join("deno.json").unwrap(),
&ConfigParseOptions::default(),
)
.unwrap(),
)


@@ -26,6 +26,7 @@ use deno_graph::Resolution;
use deno_graph::ResolutionError;
use deno_graph::SpecifierError;
use deno_lint::linter::LintConfig as DenoLintConfig;
use deno_resolver::sloppy_imports::SloppyImportsCachedFs;
use deno_resolver::sloppy_imports::SloppyImportsResolution;
use deno_resolver::sloppy_imports::SloppyImportsResolutionKind;
use deno_runtime::deno_node;
@@ -34,7 +35,7 @@ use deno_semver::jsr::JsrPackageReqReference;
use deno_semver::npm::NpmPackageReqReference;
use deno_semver::package::PackageReq;
use import_map::ImportMap;
use import_map::ImportMapError;
use import_map::ImportMapErrorKind;
use log::error;
use tokio::sync::mpsc;
use tokio::sync::Mutex;
@@ -61,7 +62,6 @@ use crate::graph_util;
use crate::graph_util::enhanced_resolution_error_message;
use crate::lsp::lsp_custom::DiagnosticBatchNotificationParams;
use crate::resolver::CliSloppyImportsResolver;
use crate::resolver::SloppyImportsCachedFs;
use crate::sys::CliSys;
use crate::tools::lint::CliLinter;
use crate::tools::lint::CliLinterOptions;
@@ -265,7 +265,7 @@ impl TsDiagnosticsStore {
}
pub fn should_send_diagnostic_batch_index_notifications() -> bool {
crate::args::has_flag_env_var(
deno_lib::args::has_flag_env_var(
"DENO_DONT_USE_INTERNAL_LSP_DIAGNOSTIC_SYNC_FLAG",
)
}
@@ -1297,8 +1297,8 @@ impl DenoDiagnostic {
let mut message;
message = enhanced_resolution_error_message(err);
if let deno_graph::ResolutionError::ResolverError {error, ..} = err{
if let ResolveError::Other(resolve_error, ..) = (*error).as_ref() {
if let Some(ImportMapError::UnmappedBareSpecifier(specifier, _)) = resolve_error.downcast_ref::<ImportMapError>() {
if let ResolveError::ImportMap(importmap) = (*error).as_ref() {
if let ImportMapErrorKind::UnmappedBareSpecifier(specifier, _) = &**importmap {
if specifier.chars().next().unwrap_or('\0') == '@'{
let hint = format!("\nHint: Use [deno add {}] to add the dependency.", specifier);
message.push_str(hint.as_str());
@@ -1695,12 +1695,7 @@ mod tests {
let mut config = Config::new_with_roots([root_uri.clone()]);
if let Some((relative_path, json_string)) = maybe_import_map {
let base_url = root_uri.join(relative_path).unwrap();
let config_file = ConfigFile::new(
json_string,
base_url,
&deno_config::deno_json::ConfigParseOptions::default(),
)
.unwrap();
let config_file = ConfigFile::new(json_string, base_url).unwrap();
config.tree.inject_config_file(config_file).await;
}
let resolver =


@@ -18,13 +18,13 @@ use deno_ast::swc::visit::VisitWith;
use deno_ast::MediaType;
use deno_ast::ParsedSource;
use deno_ast::SourceTextInfo;
use deno_core::error::custom_error;
use deno_core::error::AnyError;
use deno_core::futures::future;
use deno_core::futures::future::Shared;
use deno_core::futures::FutureExt;
use deno_core::parking_lot::Mutex;
use deno_core::ModuleSpecifier;
use deno_error::JsErrorBox;
use deno_graph::Resolution;
use deno_path_util::url_to_file_path;
use deno_runtime::deno_node;
@@ -480,7 +480,7 @@ impl Document {
let is_cjs_resolver =
resolver.as_is_cjs_resolver(self.file_referrer.as_ref());
let npm_resolver =
resolver.create_graph_npm_resolver(self.file_referrer.as_ref());
resolver.as_graph_npm_resolver(self.file_referrer.as_ref());
let config_data = resolver.as_config_data(self.file_referrer.as_ref());
let jsx_import_source_config =
config_data.and_then(|d| d.maybe_jsx_import_source_config());
@@ -503,7 +503,7 @@ impl Document {
s,
&CliJsrUrlProvider,
Some(&resolver),
Some(&npm_resolver),
Some(npm_resolver.as_ref()),
),
)
})
@@ -513,7 +513,7 @@ impl Document {
Arc::new(d.with_new_resolver(
&CliJsrUrlProvider,
Some(&resolver),
Some(&npm_resolver),
Some(npm_resolver.as_ref()),
))
});
is_script = self.is_script;
@@ -1081,7 +1081,7 @@ impl Documents {
.or_else(|| self.file_system_docs.remove_document(specifier))
.map(Ok)
.unwrap_or_else(|| {
Err(custom_error(
Err(JsErrorBox::new(
"NotFound",
format!("The specifier \"{specifier}\" was not found."),
))
@@ -1702,7 +1702,7 @@ fn analyze_module(
) -> (ModuleResult, ResolutionMode) {
match parsed_source_result {
Ok(parsed_source) => {
let npm_resolver = resolver.create_graph_npm_resolver(file_referrer);
let npm_resolver = resolver.as_graph_npm_resolver(file_referrer);
let cli_resolver = resolver.as_cli_resolver(file_referrer);
let is_cjs_resolver = resolver.as_is_cjs_resolver(file_referrer);
let config_data = resolver.as_config_data(file_referrer);
@@ -1731,7 +1731,7 @@ fn analyze_module(
file_system: &deno_graph::source::NullFileSystem,
jsr_url_provider: &CliJsrUrlProvider,
maybe_resolver: Some(&resolver),
maybe_npm_resolver: Some(&npm_resolver),
maybe_npm_resolver: Some(npm_resolver.as_ref()),
},
)),
module_resolution_mode,
@@ -1767,7 +1767,6 @@ fn bytes_to_content(
#[cfg(test)]
mod tests {
use deno_config::deno_json::ConfigFile;
use deno_config::deno_json::ConfigParseOptions;
use deno_core::serde_json;
use deno_core::serde_json::json;
use pretty_assertions::assert_eq;
@@ -1924,7 +1923,6 @@ console.log(b, "hello deno");
})
.to_string(),
config.root_uri().unwrap().join("deno.json").unwrap(),
&ConfigParseOptions::default(),
)
.unwrap(),
)
@@ -1968,7 +1966,6 @@ console.log(b, "hello deno");
})
.to_string(),
config.root_uri().unwrap().join("deno.json").unwrap(),
&ConfigParseOptions::default(),
)
.unwrap(),
)


@@ -27,6 +27,10 @@ use deno_core::url::Url;
use deno_core::ModuleSpecifier;
use deno_graph::GraphKind;
use deno_graph::Resolution;
use deno_lib::args::get_root_cert_store;
use deno_lib::args::has_flag_env_var;
use deno_lib::args::CaData;
use deno_lib::version::DENO_VERSION_INFO;
use deno_path_util::url_to_file_path;
use deno_runtime::deno_tls::rustls::RootCertStore;
use deno_runtime::deno_tls::RootCertStoreProvider;
@@ -94,9 +98,6 @@ use super::urls;
use super::urls::uri_to_url;
use super::urls::url_to_uri;
use crate::args::create_default_npmrc;
use crate::args::get_root_cert_store;
use crate::args::has_flag_env_var;
use crate::args::CaData;
use crate::args::CliOptions;
use crate::args::Flags;
use crate::args::InternalFlags;
@@ -122,7 +123,7 @@ use crate::util::sync::AsyncFlag;
struct LspRootCertStoreProvider(RootCertStore);
impl RootCertStoreProvider for LspRootCertStoreProvider {
fn get_or_try_init(&self) -> Result<&RootCertStore, AnyError> {
fn get_or_try_init(&self) -> Result<&RootCertStore, deno_error::JsErrorBox> {
Ok(&self.0)
}
}
@@ -703,7 +704,7 @@ impl Inner {
let version = format!(
"{} ({}, {})",
crate::version::DENO_VERSION_INFO.deno,
DENO_VERSION_INFO.deno,
env!("PROFILE"),
env!("TARGET")
);
@@ -1419,18 +1420,16 @@ impl Inner {
// the file path is only used to determine what formatter should
// be used to format the file, so give the filepath an extension
// that matches what the user selected as the language
let file_path = document
let ext = document
.maybe_language_id()
.and_then(|id| id.as_extension())
.map(|ext| file_path.with_extension(ext))
.unwrap_or(file_path);
.and_then(|id| id.as_extension().map(|s| s.to_string()));
// it's not a js/ts file, so attempt to format its contents
format_file(
&file_path,
document.content(),
&fmt_options,
&unstable_options,
None,
ext,
)
}
};
@@ -3623,8 +3622,6 @@ impl Inner {
deno_json_cache: None,
pkg_json_cache: None,
workspace_cache: None,
config_parse_options:
deno_config::deno_json::ConfigParseOptions::default(),
additional_config_file_names: &[],
discover_pkg_json: !has_flag_env_var("DENO_NO_PACKAGE_JSON"),
maybe_vendor_override: if force_global_cache {
@@ -3669,7 +3666,6 @@ impl Inner {
workspace,
force_global_cache,
None,
None,
)?;
let open_docs = self.documents.documents(DocumentsFilter::OpenDiagnosable);
@@ -4006,12 +4002,14 @@ mod tests {
temp_dir.write("root1/target/main.ts", ""); // no, because there is a Cargo.toml in the root directory
temp_dir.create_dir_all("root2/folder");
temp_dir.create_dir_all("root2/folder2/inner_folder");
temp_dir.create_dir_all("root2/sub_folder");
temp_dir.create_dir_all("root2/root2.1");
temp_dir.write("root2/file1.ts", ""); // yes, enabled
temp_dir.write("root2/file2.ts", ""); // no, not enabled
temp_dir.write("root2/folder/main.ts", ""); // yes, enabled
temp_dir.write("root2/folder/other.ts", ""); // no, disabled
temp_dir.write("root2/folder2/inner_folder/main.ts", ""); // yes, enabled (regression test for https://github.com/denoland/vscode_deno/issues/1239)
temp_dir.write("root2/sub_folder/a.js", ""); // no, not enabled
temp_dir.write("root2/sub_folder/b.ts", ""); // no, not enabled
temp_dir.write("root2/sub_folder/c.js", ""); // no, not enabled
@@ -4052,6 +4050,7 @@ mod tests {
enable_paths: Some(vec![
"file1.ts".to_string(),
"folder".to_string(),
"folder2/inner_folder".to_string(),
]),
disable_paths: vec!["folder/other.ts".to_string()],
..Default::default()
@@ -4102,6 +4101,10 @@ mod tests {
temp_dir.url().join("root1/folder/mod.ts").unwrap(),
temp_dir.url().join("root2/folder/main.ts").unwrap(),
temp_dir.url().join("root2/root2.1/main.ts").unwrap(),
temp_dir
.url()
.join("root2/folder2/inner_folder/main.ts")
.unwrap(),
])
);
}


@@ -9,7 +9,6 @@ use std::sync::Arc;
use dashmap::DashMap;
use deno_ast::MediaType;
use deno_cache_dir::file_fetcher::CacheSetting;
use deno_cache_dir::npm::NpmCacheDir;
use deno_cache_dir::HttpCache;
use deno_config::deno_json::JsxImportSourceConfig;
@@ -21,8 +20,13 @@ use deno_graph::GraphImport;
use deno_graph::ModuleSpecifier;
use deno_graph::Range;
use deno_npm::NpmSystemInfo;
use deno_npm_cache::TarballCache;
use deno_path_util::url_to_file_path;
use deno_resolver::cjs::IsCjsResolutionMode;
use deno_resolver::npm::managed::ManagedInNpmPkgCheckerCreateOptions;
use deno_resolver::npm::managed::NpmResolutionCell;
use deno_resolver::npm::CreateInNpmPkgCheckerOptions;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_resolver::npm::NpmReqResolverOptions;
use deno_resolver::DenoResolverOptions;
use deno_resolver::NodeAndNpmReqResolver;
@@ -32,7 +36,6 @@ use deno_semver::npm::NpmPackageReqReference;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use indexmap::IndexMap;
use node_resolver::InNpmPackageChecker;
use node_resolver::NodeResolutionKind;
use node_resolver::ResolutionMode;
@@ -40,6 +43,8 @@ use super::cache::LspCache;
use super::jsr::JsrCacheResolver;
use crate::args::create_default_npmrc;
use crate::args::CliLockfile;
use crate::args::LifecycleScriptsConfig;
use crate::args::NpmCachingStrategy;
use crate::args::NpmInstallDepsProvider;
use crate::factory::Deferred;
use crate::graph_util::to_node_resolution_kind;
@@ -51,21 +56,24 @@ use crate::lsp::config::ConfigData;
use crate::lsp::logging::lsp_warn;
use crate::node::CliNodeResolver;
use crate::node::CliPackageJsonResolver;
use crate::npm::create_cli_npm_resolver_for_lsp;
use crate::npm::installer::NpmInstaller;
use crate::npm::installer::NpmResolutionInstaller;
use crate::npm::CliByonmNpmResolverCreateOptions;
use crate::npm::CliManagedInNpmPkgCheckerCreateOptions;
use crate::npm::CliManagedNpmResolver;
use crate::npm::CliManagedNpmResolverCreateOptions;
use crate::npm::CliNpmCache;
use crate::npm::CliNpmCacheHttpClient;
use crate::npm::CliNpmRegistryInfoProvider;
use crate::npm::CliNpmResolver;
use crate::npm::CliNpmResolverCreateOptions;
use crate::npm::CliNpmResolverManagedSnapshotOption;
use crate::npm::CreateInNpmPkgCheckerOptions;
use crate::npm::ManagedCliNpmResolver;
use crate::npm::NpmResolutionInitializer;
use crate::resolver::CliDenoResolver;
use crate::resolver::CliIsCjsResolver;
use crate::resolver::CliNpmGraphResolver;
use crate::resolver::CliNpmReqResolver;
use crate::resolver::CliResolver;
use crate::resolver::CliResolverOptions;
use crate::resolver::IsCjsResolver;
use crate::resolver::WorkerCliNpmGraphResolver;
use crate::resolver::FoundPackageJsonDepFlag;
use crate::sys::CliSys;
use crate::tsc::into_specifier_and_media_type;
use crate::util::progress_bar::ProgressBar;
@@ -74,10 +82,13 @@ use crate::util::progress_bar::ProgressBarStyle;
#[derive(Debug, Clone)]
struct LspScopeResolver {
resolver: Arc<CliResolver>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
is_cjs_resolver: Arc<IsCjsResolver>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
is_cjs_resolver: Arc<CliIsCjsResolver>,
jsr_resolver: Option<Arc<JsrCacheResolver>>,
npm_resolver: Option<Arc<dyn CliNpmResolver>>,
npm_graph_resolver: Arc<CliNpmGraphResolver>,
npm_installer: Option<Arc<NpmInstaller>>,
npm_resolution: Arc<NpmResolutionCell>,
npm_resolver: Option<CliNpmResolver>,
node_resolver: Option<Arc<CliNodeResolver>>,
npm_pkg_req_resolver: Option<Arc<CliNpmReqResolver>>,
pkg_json_resolver: Arc<CliPackageJsonResolver>,
@@ -96,8 +107,11 @@ impl Default for LspScopeResolver {
in_npm_pkg_checker: factory.in_npm_pkg_checker().clone(),
is_cjs_resolver: factory.is_cjs_resolver().clone(),
jsr_resolver: None,
npm_graph_resolver: factory.npm_graph_resolver().clone(),
npm_installer: None,
npm_resolver: None,
node_resolver: None,
npm_resolution: factory.services.npm_resolution.clone(),
npm_pkg_req_resolver: None,
pkg_json_resolver: factory.pkg_json_resolver().clone(),
redirect_resolver: None,
@@ -121,6 +135,7 @@ impl LspScopeResolver {
}
let in_npm_pkg_checker = factory.in_npm_pkg_checker().clone();
let npm_resolver = factory.npm_resolver().cloned();
let npm_installer = factory.npm_installer().cloned();
let node_resolver = factory.node_resolver().cloned();
let npm_pkg_req_resolver = factory.npm_pkg_req_resolver().cloned();
let cli_resolver = factory.cli_resolver().clone();
@@ -133,8 +148,7 @@ impl LspScopeResolver {
cache.for_specifier(config_data.map(|d| d.scope.as_ref())),
config_data.and_then(|d| d.lockfile.clone()),
)));
let npm_graph_resolver = cli_resolver
.create_graph_npm_resolver(crate::graph_util::NpmCachingStrategy::Eager);
let npm_graph_resolver = factory.npm_graph_resolver();
let maybe_jsx_import_source_config =
config_data.and_then(|d| d.maybe_jsx_import_source_config());
let graph_imports = config_data
@@ -156,7 +170,7 @@ impl LspScopeResolver {
imports,
&CliJsrUrlProvider,
Some(&resolver),
Some(&npm_graph_resolver),
Some(npm_graph_resolver.as_ref()),
);
(referrer, graph_import)
})
@@ -207,8 +221,11 @@ impl LspScopeResolver {
in_npm_pkg_checker,
is_cjs_resolver: factory.is_cjs_resolver().clone(),
jsr_resolver,
npm_graph_resolver: factory.npm_graph_resolver().clone(),
npm_pkg_req_resolver,
npm_resolver,
npm_installer,
npm_resolution: factory.services.npm_resolution.clone(),
node_resolver,
pkg_json_resolver,
redirect_resolver,
@@ -220,18 +237,68 @@ impl LspScopeResolver {
}
fn snapshot(&self) -> Arc<Self> {
// create a copy of the resolution and then re-initialize the npm resolver from that
// todo(dsherret): this is pretty terrible... we should improve this. It should
// be possible to just change the npm_resolution on the new factory then access
// another method to create a new npm resolver
let mut factory = ResolverFactory::new(self.config_data.as_ref());
let npm_resolver =
self.npm_resolver.as_ref().map(|r| r.clone_snapshotted());
factory
.services
.npm_resolution
.set_snapshot(self.npm_resolution.snapshot());
let npm_resolver = self.npm_resolver.as_ref();
if let Some(npm_resolver) = &npm_resolver {
factory.set_npm_resolver(npm_resolver.clone());
factory.set_npm_resolver(CliNpmResolver::new::<CliSys>(
match npm_resolver {
CliNpmResolver::Byonm(byonm_npm_resolver) => {
CliNpmResolverCreateOptions::Byonm(
CliByonmNpmResolverCreateOptions {
root_node_modules_dir: byonm_npm_resolver
.root_node_modules_path()
.map(|p| p.to_path_buf()),
sys: CliSys::default(),
pkg_json_resolver: self.pkg_json_resolver.clone(),
},
)
}
CliNpmResolver::Managed(managed_npm_resolver) => {
CliNpmResolverCreateOptions::Managed({
let npmrc = self
.config_data
.as_ref()
.and_then(|d| d.npmrc.clone())
.unwrap_or_else(create_default_npmrc);
let npm_cache_dir = Arc::new(NpmCacheDir::new(
&CliSys::default(),
managed_npm_resolver.global_cache_root_path().to_path_buf(),
npmrc.get_all_known_registries_urls(),
));
CliManagedNpmResolverCreateOptions {
sys: CliSys::default(),
npm_cache_dir,
maybe_node_modules_path: managed_npm_resolver
.root_node_modules_path()
.map(|p| p.to_path_buf()),
npmrc,
npm_resolution: factory.services.npm_resolution.clone(),
npm_system_info: NpmSystemInfo::default(),
}
})
}
},
));
}
Arc::new(Self {
resolver: factory.cli_resolver().clone(),
in_npm_pkg_checker: factory.in_npm_pkg_checker().clone(),
is_cjs_resolver: factory.is_cjs_resolver().clone(),
jsr_resolver: self.jsr_resolver.clone(),
npm_graph_resolver: factory.npm_graph_resolver().clone(),
// npm installer isn't necessary for a snapshot
npm_installer: None,
npm_pkg_req_resolver: factory.npm_pkg_req_resolver().cloned(),
npm_resolution: factory.services.npm_resolution.clone(),
npm_resolver: factory.npm_resolver().cloned(),
node_resolver: factory.node_resolver().cloned(),
redirect_resolver: self.redirect_resolver.clone(),
@@ -318,18 +385,16 @@ impl LspResolver {
if let Some(dep_info) = dep_info {
*resolver.dep_info.lock() = dep_info.clone();
}
if let Some(npm_resolver) = resolver.npm_resolver.as_ref() {
if let Some(npm_resolver) = npm_resolver.as_managed() {
if let Some(npm_installer) = resolver.npm_installer.as_ref() {
let reqs = dep_info
.map(|i| i.npm_reqs.iter().cloned().collect::<Vec<_>>())
.unwrap_or_default();
if let Err(err) = npm_resolver.set_package_reqs(&reqs).await {
if let Err(err) = npm_installer.set_package_reqs(&reqs).await {
lsp_warn!("Could not set npm package requirements: {:#}", err);
}
}
}
}
}
pub fn as_cli_resolver(
&self,
@@ -339,20 +404,18 @@ impl LspResolver {
resolver.resolver.as_ref()
}
pub fn create_graph_npm_resolver(
pub fn as_graph_npm_resolver(
&self,
file_referrer: Option<&ModuleSpecifier>,
) -> WorkerCliNpmGraphResolver {
) -> &Arc<CliNpmGraphResolver> {
let resolver = self.get_scope_resolver(file_referrer);
resolver
.resolver
.create_graph_npm_resolver(crate::graph_util::NpmCachingStrategy::Eager)
&resolver.npm_graph_resolver
}
pub fn as_is_cjs_resolver(
&self,
file_referrer: Option<&ModuleSpecifier>,
) -> &IsCjsResolver {
) -> &CliIsCjsResolver {
let resolver = self.get_scope_resolver(file_referrer);
resolver.is_cjs_resolver.as_ref()
}
@@ -368,7 +431,7 @@ impl LspResolver {
pub fn in_npm_pkg_checker(
&self,
file_referrer: Option<&ModuleSpecifier>,
) -> &Arc<dyn InNpmPackageChecker> {
) -> &DenoInNpmPackageChecker {
let resolver = self.get_scope_resolver(file_referrer);
&resolver.in_npm_pkg_checker
}
@@ -376,7 +439,7 @@ impl LspResolver {
pub fn maybe_managed_npm_resolver(
&self,
file_referrer: Option<&ModuleSpecifier>,
) -> Option<&ManagedCliNpmResolver> {
) -> Option<&CliManagedNpmResolver> {
let resolver = self.get_scope_resolver(file_referrer);
resolver.npm_resolver.as_ref().and_then(|r| r.as_managed())
}
@@ -590,11 +653,15 @@ pub struct ScopeDepInfo {
#[derive(Default)]
struct ResolverFactoryServices {
cli_resolver: Deferred<Arc<CliResolver>>,
in_npm_pkg_checker: Deferred<Arc<dyn InNpmPackageChecker>>,
is_cjs_resolver: Deferred<Arc<IsCjsResolver>>,
found_pkg_json_dep_flag: Arc<FoundPackageJsonDepFlag>,
in_npm_pkg_checker: Deferred<DenoInNpmPackageChecker>,
is_cjs_resolver: Deferred<Arc<CliIsCjsResolver>>,
node_resolver: Deferred<Option<Arc<CliNodeResolver>>>,
npm_graph_resolver: Deferred<Arc<CliNpmGraphResolver>>,
npm_installer: Option<Arc<NpmInstaller>>,
npm_pkg_req_resolver: Deferred<Option<Arc<CliNpmReqResolver>>>,
npm_resolver: Option<Arc<dyn CliNpmResolver>>,
npm_resolver: Option<CliNpmResolver>,
npm_resolution: Arc<NpmResolutionCell>,
}
struct ResolverFactory<'a> {
@@ -616,6 +683,10 @@ impl<'a> ResolverFactory<'a> {
}
}
// todo(dsherret): probably this method could be removed in the future
// and instead just `npm_resolution_initializer.ensure_initialized()` could
// be called. The reason this exists is because creating the npm resolvers
// used to be async.
async fn init_npm_resolver(
&mut self,
http_client_provider: &Arc<HttpClientProvider>,
@@ -645,11 +716,30 @@ impl<'a> ResolverFactory<'a> {
cache.deno_dir().npm_folder_path(),
npmrc.get_all_known_registries_urls(),
));
CliNpmResolverCreateOptions::Managed(CliManagedNpmResolverCreateOptions {
http_client_provider: http_client_provider.clone(),
// only used for top level install, so we can ignore this
npm_install_deps_provider: Arc::new(NpmInstallDepsProvider::empty()),
snapshot: match self.config_data.and_then(|d| d.lockfile.as_ref()) {
let npm_cache = Arc::new(CliNpmCache::new(
npm_cache_dir.clone(),
sys.clone(),
// Use an "only" cache setting in order to make the
// user do an explicit "cache" command and prevent
// the cache from being filled with lots of packages while
// the user is typing.
deno_npm_cache::NpmCacheSetting::Only,
npmrc.clone(),
));
let pb = ProgressBar::new(ProgressBarStyle::TextOnly);
let npm_client = Arc::new(CliNpmCacheHttpClient::new(
http_client_provider.clone(),
pb.clone(),
));
let registry_info_provider = Arc::new(CliNpmRegistryInfoProvider::new(
npm_cache.clone(),
npm_client.clone(),
npmrc.clone(),
));
let npm_resolution_initializer = Arc::new(NpmResolutionInitializer::new(
registry_info_provider.clone(),
self.services.npm_resolution.clone(),
match self.config_data.and_then(|d| d.lockfile.as_ref()) {
Some(lockfile) => {
CliNpmResolverManagedSnapshotOption::ResolveFromLockfile(
lockfile.clone(),
@@ -657,33 +747,69 @@ impl<'a> ResolverFactory<'a> {
}
None => CliNpmResolverManagedSnapshotOption::Specified(None),
},
sys: CliSys::default(),
npm_cache_dir,
// Use an "only" cache setting in order to make the
// user do an explicit "cache" command and prevent
// the cache from being filled with lots of packages while
// the user is typing.
cache_setting: CacheSetting::Only,
text_only_progress_bar: ProgressBar::new(ProgressBarStyle::TextOnly),
));
// Don't provide the lockfile. We don't want these resolvers
// updating it. Only the cache request should update the lockfile.
maybe_lockfile: None,
maybe_node_modules_path: self
.config_data
.and_then(|d| d.node_modules_dir.clone()),
let maybe_lockfile: Option<Arc<CliLockfile>> = None;
let maybe_node_modules_path =
self.config_data.and_then(|d| d.node_modules_dir.clone());
let tarball_cache = Arc::new(TarballCache::new(
npm_cache.clone(),
npm_client.clone(),
sys.clone(),
npmrc.clone(),
));
let npm_resolution_installer = Arc::new(NpmResolutionInstaller::new(
registry_info_provider,
self.services.npm_resolution.clone(),
maybe_lockfile.clone(),
));
let npm_installer = Arc::new(NpmInstaller::new(
npm_cache.clone(),
Arc::new(NpmInstallDepsProvider::empty()),
self.services.npm_resolution.clone(),
npm_resolution_initializer.clone(),
npm_resolution_installer,
&pb,
sys.clone(),
tarball_cache.clone(),
maybe_lockfile,
maybe_node_modules_path.clone(),
LifecycleScriptsConfig::default(),
NpmSystemInfo::default(),
));
self.set_npm_installer(npm_installer);
// spawn due to the lsp's `Send` requirement
deno_core::unsync::spawn(async move {
if let Err(err) = npm_resolution_initializer.ensure_initialized().await
{
log::warn!("failed to initialize npm resolution: {}", err);
}
})
.await
.unwrap();
CliNpmResolverCreateOptions::Managed(CliManagedNpmResolverCreateOptions {
sys: CliSys::default(),
npm_cache_dir,
maybe_node_modules_path,
npmrc,
npm_resolution: self.services.npm_resolution.clone(),
npm_system_info: NpmSystemInfo::default(),
lifecycle_scripts: Default::default(),
})
};
self.set_npm_resolver(create_cli_npm_resolver_for_lsp(options).await);
self.set_npm_resolver(CliNpmResolver::new(options));
}
pub fn set_npm_resolver(&mut self, npm_resolver: Arc<dyn CliNpmResolver>) {
pub fn set_npm_installer(&mut self, npm_installer: Arc<NpmInstaller>) {
self.services.npm_installer = Some(npm_installer);
}
pub fn set_npm_resolver(&mut self, npm_resolver: CliNpmResolver) {
self.services.npm_resolver = Some(npm_resolver);
}
pub fn npm_resolver(&self) -> Option<&Arc<dyn CliNpmResolver>> {
pub fn npm_resolver(&self) -> Option<&CliNpmResolver> {
self.services.npm_resolver.as_ref()
}
@@ -720,13 +846,27 @@ impl<'a> ResolverFactory<'a> {
is_byonm: self.config_data.map(|d| d.byonm).unwrap_or(false),
maybe_vendor_dir: self.config_data.and_then(|d| d.vendor_dir.as_ref()),
}));
Arc::new(CliResolver::new(CliResolverOptions {
Arc::new(CliResolver::new(
deno_resolver,
npm_resolver: self.npm_resolver().cloned(),
bare_node_builtins_enabled: self
self.services.found_pkg_json_dep_flag.clone(),
))
})
}
pub fn npm_installer(&self) -> Option<&Arc<NpmInstaller>> {
self.services.npm_installer.as_ref()
}
pub fn npm_graph_resolver(&self) -> &Arc<CliNpmGraphResolver> {
self.services.npm_graph_resolver.get_or_init(|| {
Arc::new(CliNpmGraphResolver::new(
None,
self.services.found_pkg_json_dep_flag.clone(),
self
.config_data
.is_some_and(|d| d.unstable.contains("bare-node-builtins")),
}))
NpmCachingStrategy::Eager,
))
})
}
@@ -734,29 +874,27 @@ impl<'a> ResolverFactory<'a> {
&self.pkg_json_resolver
}
pub fn in_npm_pkg_checker(&self) -> &Arc<dyn InNpmPackageChecker> {
pub fn in_npm_pkg_checker(&self) -> &DenoInNpmPackageChecker {
self.services.in_npm_pkg_checker.get_or_init(|| {
crate::npm::create_in_npm_pkg_checker(
match self.services.npm_resolver.as_ref().map(|r| r.as_inner()) {
Some(crate::npm::InnerCliNpmResolverRef::Byonm(_)) | None => {
DenoInNpmPackageChecker::new(match &self.services.npm_resolver {
Some(CliNpmResolver::Byonm(_)) | None => {
CreateInNpmPkgCheckerOptions::Byonm
}
Some(crate::npm::InnerCliNpmResolverRef::Managed(m)) => {
Some(CliNpmResolver::Managed(m)) => {
CreateInNpmPkgCheckerOptions::Managed(
CliManagedInNpmPkgCheckerCreateOptions {
ManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: m.global_cache_root_url(),
maybe_node_modules_path: m.maybe_node_modules_path(),
maybe_node_modules_path: m.root_node_modules_path(),
},
)
}
},
)
})
})
}
pub fn is_cjs_resolver(&self) -> &Arc<IsCjsResolver> {
pub fn is_cjs_resolver(&self) -> &Arc<CliIsCjsResolver> {
self.services.is_cjs_resolver.get_or_init(|| {
Arc::new(IsCjsResolver::new(
Arc::new(CliIsCjsResolver::new(
self.in_npm_pkg_checker().clone(),
self.pkg_json_resolver().clone(),
if self
@@ -780,9 +918,10 @@ impl<'a> ResolverFactory<'a> {
Some(Arc::new(CliNodeResolver::new(
self.in_npm_pkg_checker().clone(),
RealIsBuiltInNodeModuleChecker,
npm_resolver.clone().into_npm_pkg_folder_resolver(),
npm_resolver.clone(),
self.pkg_json_resolver.clone(),
self.sys.clone(),
node_resolver::ConditionsFromResolutionMode::default(),
)))
})
.as_ref()
@@ -796,10 +935,9 @@ impl<'a> ResolverFactory<'a> {
let node_resolver = self.node_resolver()?;
let npm_resolver = self.npm_resolver()?;
Some(Arc::new(CliNpmReqResolver::new(NpmReqResolverOptions {
byonm_resolver: (npm_resolver.clone()).into_maybe_byonm(),
in_npm_pkg_checker: self.in_npm_pkg_checker().clone(),
node_resolver: node_resolver.clone(),
npm_req_resolver: npm_resolver.clone().into_npm_req_resolver(),
npm_resolver: npm_resolver.clone(),
sys: self.sys.clone(),
})))
})

View file

@@ -5,6 +5,7 @@ use std::collections::HashSet;
use deno_core::error::AnyError;
use deno_core::ModuleSpecifier;
use deno_lib::util::checksum;
use lsp::Range;
use tower_lsp::lsp_types as lsp;
@@ -15,7 +16,6 @@ use crate::lsp::logging::lsp_warn;
use crate::lsp::urls::url_to_uri;
use crate::tools::test::TestDescription;
use crate::tools::test::TestStepDescription;
use crate::util::checksum;
#[derive(Debug, Clone, PartialEq)]
pub struct TestDefinition {

View file

@@ -2,8 +2,8 @@
use std::collections::HashMap;
use deno_core::error::custom_error;
use deno_core::error::AnyError;
use deno_error::JsErrorBox;
use dissimilar::diff;
use dissimilar::Chunk;
use text_size::TextRange;
@@ -137,7 +137,7 @@ impl LineIndex {
if let Some(line_offset) = self.utf8_offsets.get(position.line as usize) {
Ok(line_offset + col)
} else {
Err(custom_error("OutOfRange", "The position is out of range."))
Err(JsErrorBox::new("OutOfRange", "The position is out of range.").into())
}
}
@@ -157,7 +157,7 @@ impl LineIndex {
if let Some(line_offset) = self.utf16_offsets.get(position.line as usize) {
Ok(line_offset + TextSize::from(position.character))
} else {
Err(custom_error("OutOfRange", "The position is out of range."))
Err(JsErrorBox::new("OutOfRange", "The position is out of range.").into())
}
}
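The hunks above swap deno_core's stringly-typed `custom_error` helper for `deno_error::JsErrorBox::new(class, message)` followed by `.into()`. A std-only sketch of the same shape, using a hypothetical `ErrorBox` type in place of the real `JsErrorBox` API:

```rust
use std::error::Error;
use std::fmt;

// Hypothetical stand-in for deno_error::JsErrorBox: an error that
// carries a JS-style class name alongside its message.
#[derive(Debug)]
struct ErrorBox {
    class: &'static str,
    message: String,
}

impl ErrorBox {
    fn new(class: &'static str, message: impl Into<String>) -> Self {
        ErrorBox { class, message: message.into() }
    }
}

impl fmt::Display for ErrorBox {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}: {}", self.class, self.message)
    }
}

impl Error for ErrorBox {}

// Mirrors the shape of LineIndex::offset: map a line number to a byte
// offset, failing with an "OutOfRange" class when the line is missing.
fn offset(utf8_offsets: &[usize], line: usize) -> Result<usize, Box<dyn Error>> {
    if let Some(line_offset) = utf8_offsets.get(line) {
        Ok(*line_offset)
    } else {
        Err(ErrorBox::new("OutOfRange", "The position is out of range.").into())
    }
}

fn main() {
    let offsets = [0, 10, 25];
    assert_eq!(offset(&offsets, 1).unwrap(), 10);
    let err = offset(&offsets, 9).unwrap_err();
    assert_eq!(err.to_string(), "OutOfRange: The position is out of range.");
    println!("ok");
}
```

The `.into()` at the error site is what lets the typed class name cross a `Box<dyn Error>`-style boundary without collapsing into a plain string.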

View file

@@ -20,7 +20,6 @@ use deno_core::anyhow::Context as _;
use deno_core::convert::Smi;
use deno_core::convert::ToV8;
use deno_core::error::AnyError;
use deno_core::error::StdAnyError;
use deno_core::futures::stream::FuturesOrdered;
use deno_core::futures::FutureExt;
use deno_core::futures::StreamExt;
@@ -40,6 +39,8 @@ use deno_core::ModuleSpecifier;
use deno_core::OpState;
use deno_core::PollEventLoopOptions;
use deno_core::RuntimeOptions;
use deno_lib::util::result::InfallibleResultExt;
use deno_lib::worker::create_isolate_create_params;
use deno_path_util::url_to_file_path;
use deno_runtime::deno_node::SUPPORTED_BUILTIN_NODE_MODULES;
use deno_runtime::inspector_server::InspectorServer;
@@ -73,6 +74,7 @@ use super::documents::Document;
use super::documents::DocumentsFilter;
use super::language_server;
use super::language_server::StateSnapshot;
use super::logging::lsp_log;
use super::performance::Performance;
use super::performance::PerformanceMark;
use super::refactor::RefactorCodeActionData;
@@ -95,9 +97,7 @@ use crate::tsc::ResolveArgs;
use crate::tsc::MISSING_DEPENDENCY_SPECIFIER;
use crate::util::path::relative_specifier;
use crate::util::path::to_percent_decoded_str;
use crate::util::result::InfallibleResultExt;
use crate::util::v8::convert;
use crate::worker::create_isolate_create_params;
static BRACKET_ACCESSOR_RE: Lazy<Regex> =
lazy_regex!(r#"^\[['"](.+)[\['"]\]$"#);
@@ -3973,6 +3973,11 @@ impl CompletionEntry {
if let Some(mut new_specifier) = import_mapper
.check_specifier(&import_data.normalized, specifier)
.or_else(|| relative_specifier(specifier, &import_data.normalized))
.or_else(|| {
ModuleSpecifier::parse(&import_data.raw.module_specifier)
.is_ok()
.then(|| import_data.normalized.to_string())
})
{
if new_specifier.contains("/node_modules/") {
return None;
@@ -4331,15 +4336,17 @@ impl TscSpecifierMap {
pub fn normalize<S: AsRef<str>>(
&self,
specifier: S,
) -> Result<ModuleSpecifier, AnyError> {
) -> Result<ModuleSpecifier, deno_core::url::ParseError> {
let original = specifier.as_ref();
if let Some(specifier) = self.normalized_specifiers.get(original) {
return Ok(specifier.clone());
}
let specifier_str = original.replace(".d.ts.d.ts", ".d.ts");
let specifier_str = original
.replace(".d.ts.d.ts", ".d.ts")
.replace("$node_modules", "node_modules");
let specifier = match ModuleSpecifier::parse(&specifier_str) {
Ok(s) => s,
Err(err) => return Err(err.into()),
Err(err) => return Err(err),
};
if specifier.as_str() != original {
self
@@ -4437,6 +4444,16 @@ fn op_is_node_file(state: &mut OpState, #[string] path: String) -> bool {
r
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
enum LoadError {
#[error("{0}")]
#[class(inherit)]
UrlParse(#[from] deno_core::url::ParseError),
#[error("{0}")]
#[class(inherit)]
SerdeV8(#[from] serde_v8::Error),
}
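The new `LoadError` enum above uses `thiserror` with `#[from]` to fold two upstream error types into one op-level error. A std-only sketch that hand-writes the `From` impls the derive would generate, with hypothetical `ParseIntError`/`Utf8Error` variants standing in for the real `UrlParse`/`SerdeV8` ones:

```rust
use std::fmt;
use std::num::ParseIntError;
use std::str::Utf8Error;

// Hand-rolled equivalent of a thiserror enum with #[from] variants:
// one enum, one variant per upstream error type.
#[derive(Debug)]
enum LoadError {
    Parse(ParseIntError),
    Utf8(Utf8Error),
}

impl fmt::Display for LoadError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            LoadError::Parse(e) => write!(f, "{e}"),
            LoadError::Utf8(e) => write!(f, "{e}"),
        }
    }
}

impl From<ParseIntError> for LoadError {
    fn from(e: ParseIntError) -> Self { LoadError::Parse(e) }
}

impl From<Utf8Error> for LoadError {
    fn from(e: Utf8Error) -> Self { LoadError::Utf8(e) }
}

// With the From impls in place, `?` converts either upstream error
// into LoadError automatically, as in the op_load signature above.
fn load(bytes: &[u8]) -> Result<u32, LoadError> {
    let text = std::str::from_utf8(bytes)?;
    let n: u32 = text.trim().parse()?;
    Ok(n)
}

fn main() {
    assert_eq!(load(b"42").unwrap(), 42);
    assert!(load(b"not a number").is_err());
    assert!(load(b"\xff").is_err());
    println!("ok");
}
```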
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
struct LoadResponse {
@@ -4451,7 +4468,7 @@ fn op_load<'s>(
scope: &'s mut v8::HandleScope,
state: &mut OpState,
#[string] specifier: &str,
) -> Result<v8::Local<'s, v8::Value>, AnyError> {
) -> Result<v8::Local<'s, v8::Value>, LoadError> {
let state = state.borrow_mut::<State>();
let mark = state
.performance
@@ -4482,7 +4499,7 @@ fn op_load<'s>(
fn op_release(
state: &mut OpState,
#[string] specifier: &str,
) -> Result<(), AnyError> {
) -> Result<(), deno_core::url::ParseError> {
let state = state.borrow_mut::<State>();
let mark = state
.performance
@@ -4495,11 +4512,12 @@ fn op_release(
#[op2]
#[serde]
#[allow(clippy::type_complexity)]
fn op_resolve(
state: &mut OpState,
#[string] base: String,
#[serde] specifiers: Vec<(bool, String)>,
) -> Result<Vec<Option<(String, String)>>, AnyError> {
) -> Result<Vec<Option<(String, Option<String>)>>, deno_core::url::ParseError> {
op_resolve_inner(state, ResolveArgs { base, specifiers })
}
@@ -4511,7 +4529,7 @@ struct TscRequestArray {
}
impl<'a> ToV8<'a> for TscRequestArray {
type Error = StdAnyError;
type Error = serde_v8::Error;
fn to_v8(
self,
@@ -4526,9 +4544,7 @@ impl<'a> ToV8<'a> for TscRequestArray {
.unwrap()
.into();
let args = args.unwrap_or_else(|| v8::Array::new(scope, 0).into());
let scope_url = serde_v8::to_v8(scope, self.scope)
.map_err(AnyError::from)
.map_err(StdAnyError::from)?;
let scope_url = serde_v8::to_v8(scope, self.scope)?;
let change = self.change.to_v8(scope).unwrap_infallible();
@@ -4583,10 +4599,11 @@ async fn op_poll_requests(
}
#[inline]
#[allow(clippy::type_complexity)]
fn op_resolve_inner(
state: &mut OpState,
args: ResolveArgs,
) -> Result<Vec<Option<(String, String)>>, AnyError> {
) -> Result<Vec<Option<(String, Option<String>)>>, deno_core::url::ParseError> {
let state = state.borrow_mut::<State>();
let mark = state.performance.mark_with_args("tsc.op.op_resolve", &args);
let referrer = state.specifier_map.normalize(&args.base)?;
@@ -4599,7 +4616,11 @@ fn op_resolve_inner(
o.map(|(s, mt)| {
(
state.specifier_map.denormalize(&s),
mt.as_ts_extension().to_string(),
if matches!(mt, MediaType::Unknown) {
None
} else {
Some(mt.as_ts_extension().to_string())
},
)
})
})
@@ -4677,10 +4698,27 @@ fn op_script_names(state: &mut OpState) -> ScriptNames {
.graph_imports_by_referrer(scope)
{
for specifier in specifiers {
if let Ok(req_ref) =
deno_semver::npm::NpmPackageReqReference::from_specifier(specifier)
{
let Some((resolved, _)) =
state.state_snapshot.resolver.npm_to_file_url(
&req_ref,
scope,
ResolutionMode::Import,
Some(scope),
)
else {
lsp_log!("failed to resolve {req_ref} to file URL");
continue;
};
script_names.insert(resolved.to_string());
} else {
script_names.insert(specifier.to_string());
}
}
}
}
// finally include the documents
let docs = state
@@ -4743,7 +4781,7 @@ fn op_script_names(state: &mut OpState) -> ScriptNames {
fn op_script_version(
state: &mut OpState,
#[string] specifier: &str,
) -> Result<Option<String>, AnyError> {
) -> Result<Option<String>, deno_core::url::ParseError> {
let state = state.borrow_mut::<State>();
let mark = state.performance.mark("tsc.op.op_script_version");
let specifier = state.specifier_map.normalize(specifier)?;
@@ -5398,7 +5436,8 @@ impl TscRequest {
fn to_server_request<'s>(
&self,
scope: &mut v8::HandleScope<'s>,
) -> Result<(&'static str, Option<v8::Local<'s, v8::Value>>), AnyError> {
) -> Result<(&'static str, Option<v8::Local<'s, v8::Value>>), serde_v8::Error>
{
let args = match self {
TscRequest::GetDiagnostics(args) => {
("$getDiagnostics", Some(serde_v8::to_v8(scope, args)?))
@@ -5570,7 +5609,6 @@ mod tests {
})
.to_string(),
temp_dir.url().join("deno.json").unwrap(),
&Default::default(),
)
.unwrap(),
)
@@ -6227,7 +6265,40 @@ mod tests {
"kind": "keyword"
}
],
"documentation": []
"documentation": [
{
"text": "Outputs a message to the console",
"kind": "text",
},
],
"tags": [
{
"name": "param",
"text": [
{
"text": "data",
"kind": "parameterName",
},
{
"text": " ",
"kind": "space",
},
{
"text": "Values to be printed to the console",
"kind": "text",
},
],
},
{
"name": "example",
"text": [
{
"text": "```ts\nconsole.log('Hello', 'World', 123);\n```",
"kind": "text",
},
],
},
]
})
);
}
@@ -6448,7 +6519,7 @@ mod tests {
resolved,
vec![Some((
temp_dir.url().join("b.ts").unwrap().to_string(),
MediaType::TypeScript.as_ts_extension().to_string()
Some(MediaType::TypeScript.as_ts_extension().to_string())
))]
);
}
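The `op_resolve` hunks above change the resolved extension from `String` to `Option<String>`, so `MediaType::Unknown` surfaces as `None` instead of a meaningless extension string. A self-contained sketch of that mapping, with a cut-down hypothetical `MediaType` rather than the real deno_ast enum:

```rust
// Cut-down stand-in for deno_ast::MediaType.
#[derive(Debug, PartialEq)]
enum MediaType {
    TypeScript,
    JavaScript,
    Unknown,
}

impl MediaType {
    fn as_ts_extension(&self) -> &'static str {
        match self {
            MediaType::TypeScript => ".ts",
            MediaType::JavaScript => ".js",
            MediaType::Unknown => "",
        }
    }
}

// Mirrors the new resolution result: only a known media type yields
// an extension; Unknown becomes None for the TypeScript host.
fn resolved_extension(mt: MediaType) -> Option<String> {
    if matches!(mt, MediaType::Unknown) {
        None
    } else {
        Some(mt.as_ts_extension().to_string())
    }
}

fn main() {
    assert_eq!(resolved_extension(MediaType::TypeScript), Some(".ts".to_string()));
    assert_eq!(resolved_extension(MediaType::Unknown), None);
    println!("ok");
}
```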

View file

@@ -81,7 +81,7 @@ fn hash_data_specifier(specifier: &ModuleSpecifier) -> String {
file_name_str.push('?');
file_name_str.push_str(query);
}
crate::util::checksum::gen(&[file_name_str.as_bytes()])
deno_lib::util::checksum::gen(&[file_name_str.as_bytes()])
}
fn to_deno_uri(specifier: &Url) -> String {
@@ -282,24 +282,26 @@ impl LspUrlMap {
}
}
/// Convert a e.g. `deno-notebook-cell:` specifier to a `file:` specifier.
/// Convert a e.g. `vscode-notebook-cell:` specifier to a `file:` specifier.
/// ```rust
/// assert_eq!(
/// file_like_to_file_specifier(
/// &Url::parse("deno-notebook-cell:/path/to/file.ipynb#abc").unwrap(),
/// &Url::parse("vscode-notebook-cell:/path/to/file.ipynb#abc").unwrap(),
/// ),
/// Some(Url::parse("file:///path/to/file.ipynb.ts?scheme=deno-notebook-cell#abc").unwrap()),
/// Some(Url::parse("file:///path/to/file.ipynb?scheme=vscode-notebook-cell#abc").unwrap()),
/// );
fn file_like_to_file_specifier(specifier: &Url) -> Option<Url> {
if matches!(specifier.scheme(), "untitled" | "deno-notebook-cell") {
if matches!(
specifier.scheme(),
"untitled" | "vscode-notebook-cell" | "deno-notebook-cell"
) {
if let Ok(mut s) = ModuleSpecifier::parse(&format!(
"file://{}",
"file:///{}",
&specifier.as_str()[deno_core::url::quirks::internal_components(specifier)
.host_end as usize..],
.host_end as usize..].trim_start_matches('/'),
)) {
s.query_pairs_mut()
.append_pair("scheme", specifier.scheme());
s.set_path(&format!("{}.ts", s.path()));
return Some(s);
}
}
@@ -432,11 +434,11 @@ mod tests {
fn test_file_like_to_file_specifier() {
assert_eq!(
file_like_to_file_specifier(
&Url::parse("deno-notebook-cell:/path/to/file.ipynb#abc").unwrap(),
&Url::parse("vscode-notebook-cell:/path/to/file.ipynb#abc").unwrap(),
),
Some(
Url::parse(
"file:///path/to/file.ipynb.ts?scheme=deno-notebook-cell#abc"
"file:///path/to/file.ipynb?scheme=vscode-notebook-cell#abc"
)
.unwrap()
),
@@ -446,8 +448,7 @@ mod tests {
&Url::parse("untitled:/path/to/file.ipynb#123").unwrap(),
),
Some(
Url::parse("file:///path/to/file.ipynb.ts?scheme=untitled#123")
.unwrap()
Url::parse("file:///path/to/file.ipynb?scheme=untitled#123").unwrap()
),
);
}
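The `file_like_to_file_specifier` hunks above rewrite notebook-cell style specifiers into `file:` URLs, carrying the original scheme in a `?scheme=` query pair and preserving the fragment. A string-level, std-only sketch of that mapping (the real code operates on parsed `Url` values via the `url` crate):

```rust
// String-level sketch: rewrite e.g.
//   "vscode-notebook-cell:/path/to/f.ipynb#abc"
// into
//   "file:///path/to/f.ipynb?scheme=vscode-notebook-cell#abc".
fn file_like_to_file_specifier(specifier: &str) -> Option<String> {
    let (scheme, rest) = specifier.split_once(':')?;
    if !matches!(scheme, "untitled" | "vscode-notebook-cell" | "deno-notebook-cell") {
        return None;
    }
    // Split off the fragment so the query pair lands before it.
    let (path, fragment) = match rest.split_once('#') {
        Some((p, f)) => (p, Some(f)),
        None => (rest, None),
    };
    let mut out = format!(
        "file:///{}?scheme={}",
        path.trim_start_matches('/'),
        scheme
    );
    if let Some(f) = fragment {
        out.push('#');
        out.push_str(f);
    }
    Some(out)
}

fn main() {
    assert_eq!(
        file_like_to_file_specifier("vscode-notebook-cell:/path/to/file.ipynb#abc"),
        Some("file:///path/to/file.ipynb?scheme=vscode-notebook-cell#abc".to_string())
    );
    assert_eq!(file_like_to_file_specifier("https://example.com/"), None);
    println!("ok");
}
```

The `trim_start_matches('/')` mirrors the hunk above that normalizes however many leading slashes the source scheme used before rebuilding the `file:///` authority.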

View file

@@ -4,7 +4,6 @@ mod args;
mod cache;
mod cdp;
mod emit;
mod errors;
mod factory;
mod file_fetcher;
mod graph_container;
@@ -18,16 +17,18 @@ mod node;
mod npm;
mod ops;
mod resolver;
mod shared;
mod standalone;
mod sys;
mod task_runner;
mod tools;
mod tsc;
mod util;
mod version;
mod worker;
pub mod sys {
#[allow(clippy::disallowed_types)] // ok, definition
pub type CliSys = sys_traits::impls::RealSys;
}
use std::env;
use std::future::Future;
use std::io::IsTerminal;
@@ -38,21 +39,25 @@ use std::sync::Arc;
use args::TaskFlags;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::error::JsError;
use deno_core::error::CoreError;
use deno_core::futures::FutureExt;
use deno_core::unsync::JoinHandle;
use deno_npm::resolution::SnapshotFromLockfileError;
use deno_lib::util::result::any_and_jserrorbox_downcast_ref;
use deno_resolver::npm::ByonmResolvePkgFolderFromDenoReqError;
use deno_resolver::npm::ResolvePkgFolderFromDenoReqError;
use deno_runtime::fmt_errors::format_js_error;
use deno_runtime::tokio_util::create_and_run_current_thread_with_maybe_metrics;
use deno_runtime::WorkerExecutionMode;
pub use deno_runtime::UNSTABLE_GRANULAR_FLAGS;
use deno_telemetry::OtelConfig;
use deno_terminal::colors;
use factory::CliFactory;
use standalone::MODULE_NOT_FOUND;
use standalone::UNSUPPORTED_SCHEME;
const MODULE_NOT_FOUND: &str = "Module not found";
const UNSUPPORTED_SCHEME: &str = "Unsupported scheme";
use self::npm::ResolveSnapshotError;
use self::util::draw_thread::DrawThread;
use crate::args::flags_from_vec;
use crate::args::DenoSubcommand;
use crate::args::Flags;
@@ -202,7 +207,7 @@ async fn run_subcommand(flags: Arc<Flags>) -> Result<i32, AnyError> {
match result {
Ok(v) => Ok(v),
Err(script_err) => {
if let Some(ResolvePkgFolderFromDenoReqError::Byonm(ByonmResolvePkgFolderFromDenoReqError::UnmatchedReq(_))) = script_err.downcast_ref::<ResolvePkgFolderFromDenoReqError>() {
if let Some(worker::CreateCustomWorkerError::ResolvePkgFolderFromDenoReq(ResolvePkgFolderFromDenoReqError::Byonm(ByonmResolvePkgFolderFromDenoReqError::UnmatchedReq(_)))) = any_and_jserrorbox_downcast_ref::<worker::CreateCustomWorkerError>(&script_err) {
if flags.node_modules_dir.is_none() {
let mut flags = flags.deref().clone();
let watch = match &flags.subcommand {
@@ -352,7 +357,7 @@ fn setup_panic_hook() {
eprintln!("var set and include the backtrace in your report.");
eprintln!();
eprintln!("Platform: {} {}", env::consts::OS, env::consts::ARCH);
eprintln!("Version: {}", version::DENO_VERSION_INFO.deno);
eprintln!("Version: {}", deno_lib::version::DENO_VERSION_INFO.deno);
eprintln!("Args: {:?}", env::args().collect::<Vec<_>>());
eprintln!();
orig_hook(panic_info);
@@ -373,14 +378,18 @@ fn exit_for_error(error: AnyError) -> ! {
let mut error_string = format!("{error:?}");
let mut error_code = 1;
if let Some(e) = error.downcast_ref::<JsError>() {
error_string = format_js_error(e);
} else if let Some(SnapshotFromLockfileError::IntegrityCheckFailed(e)) =
error.downcast_ref::<SnapshotFromLockfileError>()
if let Some(CoreError::Js(e)) =
any_and_jserrorbox_downcast_ref::<CoreError>(&error)
{
error_string = format_js_error(e);
} else if let Some(e @ ResolveSnapshotError { .. }) =
any_and_jserrorbox_downcast_ref::<ResolveSnapshotError>(&error)
{
if let Some(e) = e.maybe_integrity_check_error() {
error_string = e.to_string();
error_code = 10;
}
}
exit_with_message(&error_string, error_code);
}
@@ -437,19 +446,19 @@ fn resolve_flags_and_init(
if err.kind() == clap::error::ErrorKind::DisplayVersion =>
{
// Ignore results to avoid BrokenPipe errors.
util::logger::init(None, None);
init_logging(None, None);
let _ = err.print();
deno_runtime::exit(0);
}
Err(err) => {
util::logger::init(None, None);
init_logging(None, None);
exit_for_error(AnyError::from(err))
}
};
let otel_config = flags.otel_config();
deno_telemetry::init(crate::args::otel_runtime_config(), &otel_config)?;
util::logger::init(flags.log_level, Some(otel_config));
deno_telemetry::init(deno_lib::version::otel_runtime_config(), &otel_config)?;
init_logging(flags.log_level, Some(otel_config));
// TODO(bartlomieju): remove in Deno v2.5 and hard error then.
if flags.unstable_config.legacy_flag_enabled {
@@ -482,3 +491,19 @@
Ok(flags)
}
fn init_logging(
maybe_level: Option<log::Level>,
otel_config: Option<OtelConfig>,
) {
deno_lib::util::logger::init(deno_lib::util::logger::InitLoggingOptions {
maybe_level,
otel_config,
// it was considered to hold the draw thread's internal lock
// across logging, but if outputting to stderr blocks then that
// could potentially block other threads that access the draw
// thread's state
on_log_start: DrawThread::hide,
on_log_end: DrawThread::show,
})
}
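The `exit_for_error` hunks above keep exit code 1 as the default and switch to code 10 only when the error downcasts to an integrity-check failure. A std-only sketch of that downcast-to-exit-code pattern, using a hypothetical `IntegrityCheckFailed` marker type instead of the real `ResolveSnapshotError`:

```rust
use std::error::Error;
use std::fmt;

// Hypothetical marker error standing in for a lockfile
// integrity-check failure.
#[derive(Debug)]
struct IntegrityCheckFailed;

impl fmt::Display for IntegrityCheckFailed {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "integrity check failed")
    }
}

impl Error for IntegrityCheckFailed {}

// Mirrors exit_for_error: default to exit code 1, but pick a
// dedicated code when the error downcasts to a known type.
fn error_code_for(error: &(dyn Error + 'static)) -> i32 {
    if error.downcast_ref::<IntegrityCheckFailed>().is_some() {
        10
    } else {
        1
    }
}

fn main() {
    assert_eq!(error_code_for(&std::fmt::Error), 1);
    assert_eq!(error_code_for(&IntegrityCheckFailed), 10);
    println!("ok");
}
```

Keeping the decision behind `downcast_ref` means callers still hand over one boxed error type, and only the exit path needs to know which failures deserve a special code.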

View file

@@ -13,12 +13,8 @@ use std::sync::Arc;
use deno_ast::MediaType;
use deno_ast::ModuleKind;
use deno_core::anyhow::anyhow;
use deno_core::anyhow::bail;
use deno_core::anyhow::Context;
use deno_core::error::custom_error;
use deno_core::error::generic_error;
use deno_core::error::AnyError;
use deno_core::error::ModuleLoaderError;
use deno_core::futures::future::FutureExt;
use deno_core::futures::Future;
use deno_core::parking_lot::Mutex;
@@ -31,6 +27,8 @@ use deno_core::ModuleSpecifier;
use deno_core::ModuleType;
use deno_core::RequestedModuleType;
use deno_core::SourceCodeCacheInfo;
use deno_error::JsErrorBox;
use deno_error::JsErrorClass;
use deno_graph::GraphKind;
use deno_graph::JsModule;
use deno_graph::JsonModule;
@@ -39,9 +37,19 @@ use deno_graph::ModuleGraph;
use deno_graph::ModuleGraphError;
use deno_graph::Resolution;
use deno_graph::WasmModule;
use deno_lib::loader::ModuleCodeStringSource;
use deno_lib::loader::NotSupportedKindInNpmError;
use deno_lib::loader::NpmModuleLoadError;
use deno_lib::npm::NpmRegistryReadPermissionChecker;
use deno_lib::util::hash::FastInsecureHasher;
use deno_lib::worker::CreateModuleLoaderResult;
use deno_lib::worker::ModuleLoaderFactory;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_runtime::code_cache;
use deno_runtime::deno_node::create_host_defined_options;
use deno_runtime::deno_node::ops::require::UnableToGetCwdError;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_node::RealIsBuiltInNodeModuleChecker;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_semver::npm::NpmPackageReqReference;
use node_resolver::errors::ClosestPkgJsonError;
@@ -56,10 +64,8 @@ use crate::args::CliOptions;
use crate::args::DenoSubcommand;
use crate::args::TsTypeLib;
use crate::cache::CodeCache;
use crate::cache::FastInsecureHasher;
use crate::cache::ParsedSourceCache;
use crate::emit::Emitter;
use crate::errors::get_module_error_class;
use crate::graph_container::MainModuleGraphContainer;
use crate::graph_container::ModuleGraphContainer;
use crate::graph_container::ModuleGraphUpdatePermit;
@@ -67,25 +73,48 @@ use crate::graph_util::enhance_graph_error;
use crate::graph_util::CreateGraphOptions;
use crate::graph_util::EnhanceGraphErrorMode;
use crate::graph_util::ModuleGraphBuilder;
use crate::node::CliCjsCodeAnalyzer;
use crate::node::CliNodeCodeTranslator;
use crate::node::CliNodeResolver;
use crate::npm::CliNpmResolver;
use crate::npm::NpmRegistryReadPermissionChecker;
use crate::resolver::CjsTracker;
use crate::resolver::CliCjsTracker;
use crate::resolver::CliNpmReqResolver;
use crate::resolver::CliResolver;
use crate::resolver::ModuleCodeStringSource;
use crate::resolver::NotSupportedKindInNpmError;
use crate::resolver::NpmModuleLoader;
use crate::sys::CliSys;
use crate::tools::check;
use crate::tools::check::MaybeDiagnostics;
use crate::tools::check::CheckError;
use crate::tools::check::TypeChecker;
use crate::util::progress_bar::ProgressBar;
use crate::util::text_encoding::code_without_source_map;
use crate::util::text_encoding::source_map_from_code;
use crate::worker::CreateModuleLoaderResult;
use crate::worker::ModuleLoaderFactory;
pub type CliNpmModuleLoader = deno_lib::loader::NpmModuleLoader<
CliCjsCodeAnalyzer,
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
CliNpmResolver,
CliSys,
>;
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum PrepareModuleLoadError {
#[class(inherit)]
#[error(transparent)]
BuildGraphWithNpmResolution(
#[from] crate::graph_util::BuildGraphWithNpmResolutionError,
),
#[class(inherit)]
#[error(transparent)]
Check(#[from] CheckError),
#[class(inherit)]
#[error(transparent)]
AtomicWriteFileWithRetries(
#[from] crate::args::AtomicWriteFileWithRetriesError,
),
#[class(inherit)]
#[error(transparent)]
Other(#[from] JsErrorBox),
}
pub struct ModuleLoadPreparer {
options: Arc<CliOptions>,
@@ -126,7 +155,7 @@ impl ModuleLoadPreparer {
lib: TsTypeLib,
permissions: PermissionsContainer,
ext_overwrite: Option<&String>,
) -> Result<(), MaybeDiagnostics> {
) -> Result<(), PrepareModuleLoadError> {
log::debug!("Preparing module load.");
let _pb_clear_guard = self.progress_bar.clear_guard();
@@ -207,7 +236,7 @@ impl ModuleLoadPreparer {
&self,
graph: &ModuleGraph,
roots: &[ModuleSpecifier],
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
self.module_graph_builder.graph_roots_valid(graph, roots)
}
}
@@ -219,18 +248,19 @@ struct SharedCliModuleLoaderState {
initial_cwd: PathBuf,
is_inspecting: bool,
is_repl: bool,
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
code_cache: Option<Arc<CodeCache>>,
emitter: Arc<Emitter>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
main_module_graph_container: Arc<MainModuleGraphContainer>,
module_load_preparer: Arc<ModuleLoadPreparer>,
node_code_translator: Arc<CliNodeCodeTranslator>,
node_resolver: Arc<CliNodeResolver>,
npm_module_loader: NpmModuleLoader,
npm_registry_permission_checker: Arc<NpmRegistryReadPermissionChecker>,
npm_module_loader: CliNpmModuleLoader,
npm_registry_permission_checker:
Arc<NpmRegistryReadPermissionChecker<CliSys>>,
npm_req_resolver: Arc<CliNpmReqResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_resolver: CliNpmResolver,
parsed_source_cache: Arc<ParsedSourceCache>,
resolver: Arc<CliResolver>,
sys: CliSys,
@@ -280,18 +310,20 @@ impl CliModuleLoaderFactory {
#[allow(clippy::too_many_arguments)]
pub fn new(
options: &CliOptions,
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
code_cache: Option<Arc<CodeCache>>,
emitter: Arc<Emitter>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
main_module_graph_container: Arc<MainModuleGraphContainer>,
module_load_preparer: Arc<ModuleLoadPreparer>,
node_code_translator: Arc<CliNodeCodeTranslator>,
node_resolver: Arc<CliNodeResolver>,
npm_module_loader: NpmModuleLoader,
npm_registry_permission_checker: Arc<NpmRegistryReadPermissionChecker>,
npm_module_loader: CliNpmModuleLoader,
npm_registry_permission_checker: Arc<
NpmRegistryReadPermissionChecker<CliSys>,
>,
npm_req_resolver: Arc<CliNpmReqResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_resolver: CliNpmResolver,
parsed_source_cache: Arc<ParsedSourceCache>,
resolver: Arc<CliResolver>,
sys: CliSys,
@@ -401,6 +433,55 @@ impl ModuleLoaderFactory for CliModuleLoaderFactory {
}
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum LoadCodeSourceError {
#[class(inherit)]
#[error(transparent)]
NpmModuleLoad(NpmModuleLoadError),
#[class(inherit)]
#[error(transparent)]
LoadPreparedModule(#[from] LoadPreparedModuleError),
#[class(generic)]
#[error("Loading unprepared module: {}{}", .specifier, .maybe_referrer.as_ref().map(|r| format!(", imported from: {}", r)).unwrap_or_default())]
LoadUnpreparedModule {
specifier: ModuleSpecifier,
maybe_referrer: Option<ModuleSpecifier>,
},
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum LoadPreparedModuleError {
#[class(inherit)]
#[error(transparent)]
NpmModuleLoad(#[from] crate::emit::EmitParsedSourceHelperError),
#[class(inherit)]
#[error(transparent)]
LoadMaybeCjs(#[from] LoadMaybeCjsError),
#[class(inherit)]
#[error(transparent)]
Other(#[from] JsErrorBox),
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum LoadMaybeCjsError {
#[class(inherit)]
#[error(transparent)]
NpmModuleLoad(#[from] crate::emit::EmitParsedSourceHelperError),
#[class(inherit)]
#[error(transparent)]
TranslateCjsToEsm(#[from] node_resolver::analyze::TranslateCjsToEsmError),
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
#[class(inherit)]
#[error("Could not resolve '{reference}'")]
pub struct CouldNotResolveError {
reference: deno_semver::npm::NpmPackageNvReference,
#[source]
#[inherit]
source: node_resolver::errors::PackageSubpathResolveError,
}
struct CliModuleLoaderInner<TGraphContainer: ModuleGraphContainer> {
lib: TsTypeLib,
is_worker: bool,
@@ -424,8 +505,11 @@ impl<TGraphContainer: ModuleGraphContainer>
specifier: &ModuleSpecifier,
maybe_referrer: Option<&ModuleSpecifier>,
requested_module_type: RequestedModuleType,
) -> Result<ModuleSource, AnyError> {
let code_source = self.load_code_source(specifier, maybe_referrer).await?;
) -> Result<ModuleSource, ModuleLoaderError> {
let code_source = self
.load_code_source(specifier, maybe_referrer)
.await
.map_err(JsErrorBox::from_err)?;
let code = if self.shared.is_inspecting
|| code_source.media_type == MediaType::Wasm
{
@@ -447,7 +531,7 @@ impl<TGraphContainer: ModuleGraphContainer>
if module_type == ModuleType::Json
&& requested_module_type != RequestedModuleType::Json
{
return Err(generic_error("Attempted to load JSON module without specifying \"type\": \"json\" attribute in the import statement."));
return Err(JsErrorBox::generic("Attempted to load JSON module without specifying \"type\": \"json\" attribute in the import statement.").into());
}
let code_cache = if module_type == ModuleType::JavaScript {
@@ -486,7 +570,7 @@ impl<TGraphContainer: ModuleGraphContainer>
&self,
specifier: &ModuleSpecifier,
maybe_referrer: Option<&ModuleSpecifier>,
) -> Result<ModuleCodeStringSource, AnyError> {
) -> Result<ModuleCodeStringSource, LoadCodeSourceError> {
if let Some(code_source) = self.load_prepared_module(specifier).await? {
return Ok(code_source);
}
@@ -495,20 +579,20 @@ impl<TGraphContainer: ModuleGraphContainer>
.shared
.npm_module_loader
.load(specifier, maybe_referrer)
.await;
.await
.map_err(LoadCodeSourceError::NpmModuleLoad);
}
let mut msg = format!("Loading unprepared module: {specifier}");
if let Some(referrer) = maybe_referrer {
msg = format!("{}, imported from: {}", msg, referrer.as_str());
}
Err(anyhow!(msg))
Err(LoadCodeSourceError::LoadUnpreparedModule {
specifier: specifier.clone(),
maybe_referrer: maybe_referrer.cloned(),
})
}
fn resolve_referrer(
&self,
referrer: &str,
) -> Result<ModuleSpecifier, AnyError> {
) -> Result<ModuleSpecifier, ModuleLoaderError> {
let referrer = if referrer.is_empty() && self.shared.is_repl {
// FIXME(bartlomieju): this is a hacky way to provide compatibility with REPL
// and `Deno.core.evalContext` API. Ideally we should always have a referrer filled
@@ -525,7 +609,8 @@ impl<TGraphContainer: ModuleGraphContainer>
.map_err(|e| e.into())
} else {
// this cwd check is slow, so try to avoid it
let cwd = std::env::current_dir().context("Unable to get CWD")?;
let cwd = std::env::current_dir()
.map_err(|e| JsErrorBox::from_err(UnableToGetCwdError(e)))?;
deno_core::resolve_path(referrer, &cwd).map_err(|e| e.into())
}
}
@@ -534,7 +619,7 @@ impl<TGraphContainer: ModuleGraphContainer>
&self,
raw_specifier: &str,
referrer: &ModuleSpecifier,
) -> Result<ModuleSpecifier, AnyError> {
) -> Result<ModuleSpecifier, ModuleLoaderError> {
let graph = self.graph_container.graph();
let resolution = match graph.get(referrer) {
Some(Module::Js(module)) => module
@@ -548,19 +633,25 @@ impl<TGraphContainer: ModuleGraphContainer>
let specifier = match resolution {
Resolution::Ok(resolved) => Cow::Borrowed(&resolved.specifier),
Resolution::Err(err) => {
return Err(custom_error(
"TypeError",
format!("{}\n", err.to_string_with_range()),
));
return Err(
JsErrorBox::type_error(format!("{}\n", err.to_string_with_range()))
.into(),
);
}
Resolution::None => Cow::Owned(self.shared.resolver.resolve(
Resolution::None => Cow::Owned(
self
.shared
.resolver
.resolve(
raw_specifier,
referrer,
deno_graph::Position::zeroed(),
// if we're here, that means it's resolving a dynamic import
ResolutionMode::Import,
NodeResolutionKind::Execution,
)?),
)
.map_err(JsErrorBox::from_err)?,
),
};
if self.shared.is_repl {
@@ -575,7 +666,7 @@ impl<TGraphContainer: ModuleGraphContainer>
ResolutionMode::Import,
NodeResolutionKind::Execution,
)
.map_err(AnyError::from);
.map_err(|e| JsErrorBox::from_err(e).into());
}
}
@@ -586,7 +677,8 @@ impl<TGraphContainer: ModuleGraphContainer>
.npm_resolver
.as_managed()
.unwrap() // byonm won't create a Module::Npm
.resolve_pkg_folder_from_deno_module(module.nv_reference.nv())?;
.resolve_pkg_folder_from_deno_module(module.nv_reference.nv())
.map_err(JsErrorBox::from_err)?;
self
.shared
.node_resolver
@@ -597,8 +689,11 @@ impl<TGraphContainer: ModuleGraphContainer>
ResolutionMode::Import,
NodeResolutionKind::Execution,
)
.with_context(|| {
format!("Could not resolve '{}'.", module.nv_reference)
.map_err(|source| {
JsErrorBox::from_err(CouldNotResolveError {
reference: module.nv_reference.clone(),
source,
})
})?
}
Some(Module::Node(module)) => module.specifier.clone(),
@@ -619,7 +714,7 @@ impl<TGraphContainer: ModuleGraphContainer>
async fn load_prepared_module(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<ModuleCodeStringSource>, AnyError> {
) -> Result<Option<ModuleCodeStringSource>, LoadPreparedModuleError> {
// Note: keep this in sync with the sync version below
let graph = self.graph_container.graph();
match self.load_prepared_module_or_defer_emit(&graph, specifier)? {
@@ -651,7 +746,8 @@ impl<TGraphContainer: ModuleGraphContainer>
}) => self
.load_maybe_cjs(specifier, media_type, source)
.await
.map(Some),
.map(Some)
.map_err(LoadPreparedModuleError::LoadMaybeCjs),
None => Ok(None),
}
}
@@ -702,7 +798,7 @@ impl<TGraphContainer: ModuleGraphContainer>
&self,
graph: &'graph ModuleGraph,
specifier: &ModuleSpecifier,
) -> Result<Option<CodeOrDeferredEmit<'graph>>, AnyError> {
) -> Result<Option<CodeOrDeferredEmit<'graph>>, JsErrorBox> {
if specifier.scheme() == "node" {
// Node built-in modules should be handled internally.
unreachable!("Deno bug. {} was misconfigured internally.", specifier);
@@ -711,8 +807,8 @@ impl<TGraphContainer: ModuleGraphContainer>
let maybe_module = match graph.try_get(specifier) {
Ok(module) => module,
Err(err) => {
return Err(custom_error(
get_module_error_class(err),
return Err(JsErrorBox::new(
err.get_class(),
enhance_graph_error(
&self.shared.sys,
&ModuleGraphError::ModuleError(err.clone()),
@@ -740,11 +836,12 @@ impl<TGraphContainer: ModuleGraphContainer>
is_script,
..
})) => {
if self.shared.cjs_tracker.is_cjs_with_known_is_script(
specifier,
*media_type,
*is_script,
)? {
if self
.shared
.cjs_tracker
.is_cjs_with_known_is_script(specifier, *media_type, *is_script)
.map_err(JsErrorBox::from_err)?
{
return Ok(Some(CodeOrDeferredEmit::Cjs {
specifier,
media_type: *media_type,
@@ -811,7 +908,7 @@ impl<TGraphContainer: ModuleGraphContainer>
specifier: &ModuleSpecifier,
media_type: MediaType,
original_source: &Arc<str>,
) -> Result<ModuleCodeStringSource, AnyError> {
) -> Result<ModuleCodeStringSource, LoadMaybeCjsError> {
let js_source = if media_type.is_emittable() {
Cow::Owned(
self
@@ -876,16 +973,16 @@ impl<TGraphContainer: ModuleGraphContainer> ModuleLoader
specifier: &str,
referrer: &str,
_kind: deno_core::ResolutionKind,
) -> Result<ModuleSpecifier, AnyError> {
) -> Result<ModuleSpecifier, ModuleLoaderError> {
fn ensure_not_jsr_non_jsr_remote_import(
specifier: &ModuleSpecifier,
referrer: &ModuleSpecifier,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
if referrer.as_str().starts_with(jsr_url().as_str())
&& !specifier.as_str().starts_with(jsr_url().as_str())
&& matches!(specifier.scheme(), "http" | "https")
{
bail!("Importing {} blocked. JSR packages cannot import non-JSR remote modules for security reasons.", specifier);
return Err(JsErrorBox::generic(format!("Importing {} blocked. JSR packages cannot import non-JSR remote modules for security reasons.", specifier)));
}
Ok(())
}
@@ -938,7 +1035,7 @@ impl<TGraphContainer: ModuleGraphContainer> ModuleLoader
specifier: &ModuleSpecifier,
_maybe_referrer: Option<String>,
is_dynamic: bool,
) -> Pin<Box<dyn Future<Output = Result<(), AnyError>>>> {
) -> Pin<Box<dyn Future<Output = Result<(), ModuleLoaderError>>>> {
self.0.shared.in_flight_loads_tracker.increase();
if self.0.shared.in_npm_pkg_checker.in_npm_package(specifier) {
return Box::pin(deno_core::futures::future::ready(Ok(())));
@@ -987,7 +1084,8 @@ impl<TGraphContainer: ModuleGraphContainer> ModuleLoader
permissions,
None,
)
.await?;
.await
.map_err(JsErrorBox::from_err)?;
update_permit.commit();
Ok(())
}
@@ -1116,12 +1214,13 @@ impl ModuleGraphUpdatePermit for WorkerModuleGraphUpdatePermit {
#[derive(Debug)]
struct CliNodeRequireLoader<TGraphContainer: ModuleGraphContainer> {
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
emitter: Arc<Emitter>,
sys: CliSys,
graph_container: TGraphContainer,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
npm_registry_permission_checker: Arc<NpmRegistryReadPermissionChecker>,
in_npm_pkg_checker: DenoInNpmPackageChecker,
npm_registry_permission_checker:
Arc<NpmRegistryReadPermissionChecker<CliSys>>,
}
impl<TGraphContainer: ModuleGraphContainer> NodeRequireLoader
@@ -1131,35 +1230,37 @@ impl<TGraphContainer: ModuleGraphContainer> NodeRequireLoader
&self,
permissions: &mut dyn deno_runtime::deno_node::NodePermissions,
path: &'a Path,
) -> Result<std::borrow::Cow<'a, Path>, AnyError> {
) -> Result<Cow<'a, Path>, JsErrorBox> {
if let Ok(url) = deno_path_util::url_from_file_path(path) {
// allow reading if it's in the module graph
if self.graph_container.graph().get(&url).is_some() {
return Ok(std::borrow::Cow::Borrowed(path));
return Ok(Cow::Borrowed(path));
}
}
self
.npm_registry_permission_checker
.ensure_read_permission(permissions, path)
.map_err(JsErrorBox::from_err)
}
fn load_text_file_lossy(
&self,
path: &Path,
) -> Result<Cow<'static, str>, AnyError> {
) -> Result<Cow<'static, str>, JsErrorBox> {
// todo(dsherret): use the preloaded module from the graph if available?
let media_type = MediaType::from_path(path);
let text = self.sys.fs_read_to_string_lossy(path)?;
let text = self
.sys
.fs_read_to_string_lossy(path)
.map_err(JsErrorBox::from_err)?;
if media_type.is_emittable() {
let specifier = deno_path_util::url_from_file_path(path)?;
let specifier = deno_path_util::url_from_file_path(path)
.map_err(JsErrorBox::from_err)?;
if self.in_npm_pkg_checker.in_npm_package(&specifier) {
return Err(
NotSupportedKindInNpmError {
return Err(JsErrorBox::from_err(NotSupportedKindInNpmError {
media_type,
specifier,
}
.into(),
);
}));
}
self
.emitter
@@ -1173,6 +1274,7 @@ impl<TGraphContainer: ModuleGraphContainer> NodeRequireLoader
&text.into(),
)
.map(Cow::Owned)
.map_err(JsErrorBox::from_err)
} else {
Ok(text)
}

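A recurring change in the module-loader diff above is replacing catch-all `AnyError` returns with dedicated error enums whose variants either carry their own message or forward transparently to a wrapped source error. A std-only sketch of roughly what the `thiserror` derives (`#[error(transparent)]`, `#[from]`) expand to, using a hypothetical `LoadError` rather than the actual Deno types:

```rust
use std::fmt;

#[derive(Debug)]
enum LoadError {
    // "transparent" variant: Display and source defer to the inner error.
    Io(std::io::Error),
    // Structured variant with its own message, like LoadUnpreparedModule.
    UnpreparedModule { specifier: String },
}

impl fmt::Display for LoadError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            LoadError::Io(e) => write!(f, "{e}"),
            LoadError::UnpreparedModule { specifier } => {
                write!(f, "Loading unprepared module: {specifier}")
            }
        }
    }
}

impl std::error::Error for LoadError {
    fn source(&self) -> Option<&(dyn std::error::Error + 'static)> {
        match self {
            LoadError::Io(e) => Some(e),
            _ => None,
        }
    }
}

// What `#[from]` generates, so `?` converts io::Error automatically.
impl From<std::io::Error> for LoadError {
    fn from(e: std::io::Error) -> Self {
        LoadError::Io(e)
    }
}
```

The payoff over `AnyError` is that callers can match on concrete variants while still getting the full source chain for diagnostics.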

@@ -5,8 +5,9 @@ use std::sync::Arc;
use deno_ast::MediaType;
use deno_ast::ModuleSpecifier;
use deno_core::error::AnyError;
use deno_error::JsErrorBox;
use deno_graph::ParsedSourceStore;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::RealIsBuiltInNodeModuleChecker;
use node_resolver::analyze::CjsAnalysis as ExtNodeCjsAnalysis;
@@ -19,15 +20,22 @@ use serde::Serialize;
use crate::cache::CacheDBHash;
use crate::cache::NodeAnalysisCache;
use crate::cache::ParsedSourceCache;
use crate::resolver::CjsTracker;
use crate::npm::CliNpmResolver;
use crate::resolver::CliCjsTracker;
use crate::sys::CliSys;
pub type CliNodeCodeTranslator = NodeCodeTranslator<
CliCjsCodeAnalyzer,
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
CliNpmResolver,
CliSys,
>;
pub type CliNodeResolver = deno_runtime::deno_node::NodeResolver<
DenoInNpmPackageChecker,
CliNpmResolver,
CliSys,
>;
pub type CliNodeResolver = deno_runtime::deno_node::NodeResolver<CliSys>;
pub type CliPackageJsonResolver = node_resolver::PackageJsonResolver<CliSys>;
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
@@ -43,7 +51,7 @@ pub enum CliCjsAnalysis {
pub struct CliCjsCodeAnalyzer {
cache: NodeAnalysisCache,
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
fs: deno_fs::FileSystemRc,
parsed_source_cache: Option<Arc<ParsedSourceCache>>,
}
@@ -51,7 +59,7 @@ pub struct CliCjsCodeAnalyzer {
impl CliCjsCodeAnalyzer {
pub fn new(
cache: NodeAnalysisCache,
cjs_tracker: Arc<CjsTracker>,
cjs_tracker: Arc<CliCjsTracker>,
fs: deno_fs::FileSystemRc,
parsed_source_cache: Option<Arc<ParsedSourceCache>>,
) -> Self {
@@ -67,8 +75,8 @@ impl CliCjsCodeAnalyzer {
&self,
specifier: &ModuleSpecifier,
source: &str,
) -> Result<CliCjsAnalysis, AnyError> {
let source_hash = CacheDBHash::from_source(source);
) -> Result<CliCjsAnalysis, JsErrorBox> {
let source_hash = CacheDBHash::from_hashable(source);
if let Some(analysis) =
self.cache.get_cjs_analysis(specifier.as_str(), source_hash)
{
@@ -84,7 +92,9 @@ impl CliCjsCodeAnalyzer {
}
let cjs_tracker = self.cjs_tracker.clone();
let is_maybe_cjs = cjs_tracker.is_maybe_cjs(specifier, media_type)?;
let is_maybe_cjs = cjs_tracker
.is_maybe_cjs(specifier, media_type)
.map_err(JsErrorBox::from_err)?;
let analysis = if is_maybe_cjs {
let maybe_parsed_source = self
.parsed_source_cache
@@ -94,9 +104,10 @@ impl CliCjsCodeAnalyzer {
deno_core::unsync::spawn_blocking({
let specifier = specifier.clone();
let source: Arc<str> = source.into();
move || -> Result<_, AnyError> {
let parsed_source =
maybe_parsed_source.map(Ok).unwrap_or_else(|| {
move || -> Result<_, JsErrorBox> {
let parsed_source = maybe_parsed_source
.map(Ok)
.unwrap_or_else(|| {
deno_ast::parse_program(deno_ast::ParseParams {
specifier,
text: source,
@@ -105,13 +116,16 @@ impl CliCjsCodeAnalyzer {
scope_analysis: false,
maybe_syntax: None,
})
})?;
})
.map_err(JsErrorBox::from_err)?;
let is_script = parsed_source.compute_is_script();
let is_cjs = cjs_tracker.is_cjs_with_known_is_script(
let is_cjs = cjs_tracker
.is_cjs_with_known_is_script(
parsed_source.specifier(),
media_type,
is_script,
)?;
)
.map_err(JsErrorBox::from_err)?;
if is_cjs {
let analysis = parsed_source.analyze_cjs();
Ok(CliCjsAnalysis::Cjs {
@@ -143,7 +157,7 @@ impl CjsCodeAnalyzer for CliCjsCodeAnalyzer {
&self,
specifier: &ModuleSpecifier,
source: Option<Cow<'a, str>>,
) -> Result<ExtNodeCjsAnalysis<'a>, AnyError> {
) -> Result<ExtNodeCjsAnalysis<'a>, JsErrorBox> {
let source = match source {
Some(source) => source,
None => {


@@ -1,78 +0,0 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::path::Path;
use std::sync::Arc;
use deno_core::serde_json;
use deno_resolver::npm::ByonmNpmResolver;
use deno_resolver::npm::ByonmNpmResolverCreateOptions;
use deno_resolver::npm::CliNpmReqResolver;
use deno_runtime::ops::process::NpmProcessStateProvider;
use node_resolver::NpmPackageFolderResolver;
use super::CliNpmResolver;
use super::InnerCliNpmResolverRef;
use crate::args::NpmProcessState;
use crate::args::NpmProcessStateKind;
use crate::sys::CliSys;
pub type CliByonmNpmResolverCreateOptions =
ByonmNpmResolverCreateOptions<CliSys>;
pub type CliByonmNpmResolver = ByonmNpmResolver<CliSys>;
// todo(dsherret): the services hanging off `CliNpmResolver` doesn't seem ideal. We should probably decouple.
#[derive(Debug)]
struct CliByonmWrapper(Arc<CliByonmNpmResolver>);
impl NpmProcessStateProvider for CliByonmWrapper {
fn get_npm_process_state(&self) -> String {
serde_json::to_string(&NpmProcessState {
kind: NpmProcessStateKind::Byonm,
local_node_modules_path: self
.0
.root_node_modules_dir()
.map(|p| p.to_string_lossy().to_string()),
})
.unwrap()
}
}
impl CliNpmResolver for CliByonmNpmResolver {
fn into_npm_pkg_folder_resolver(
self: Arc<Self>,
) -> Arc<dyn NpmPackageFolderResolver> {
self
}
fn into_npm_req_resolver(self: Arc<Self>) -> Arc<dyn CliNpmReqResolver> {
self
}
fn into_process_state_provider(
self: Arc<Self>,
) -> Arc<dyn NpmProcessStateProvider> {
Arc::new(CliByonmWrapper(self))
}
fn into_maybe_byonm(self: Arc<Self>) -> Option<Arc<CliByonmNpmResolver>> {
Some(self)
}
fn clone_snapshotted(&self) -> Arc<dyn CliNpmResolver> {
Arc::new(self.clone())
}
fn as_inner(&self) -> InnerCliNpmResolverRef {
InnerCliNpmResolverRef::Byonm(self)
}
fn root_node_modules_path(&self) -> Option<&Path> {
self.root_node_modules_dir()
}
fn check_state_hash(&self) -> Option<u64> {
// it is very difficult to determine the check state hash for byonm
// so we just return None to signify check caching is not supported
None
}
}


@@ -6,12 +6,9 @@ use std::collections::VecDeque;
use std::path::Path;
use std::path::PathBuf;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_npm::NpmPackageId;
use crate::npm::managed::NpmResolutionPackage;
use deno_npm::NpmResolutionPackage;
#[derive(Default)]
pub struct BinEntries<'a> {
@@ -50,6 +47,48 @@ pub fn warn_missing_entrypoint(
);
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum BinEntriesError {
#[class(inherit)]
#[error("Creating '{path}'")]
Creating {
path: PathBuf,
#[source]
#[inherit]
source: std::io::Error,
},
#[cfg(unix)]
#[class(inherit)]
#[error("Setting permissions on '{path}'")]
Permissions {
path: PathBuf,
#[source]
#[inherit]
source: std::io::Error,
},
#[class(inherit)]
#[error("Can't set up '{name}' bin at {path}")]
SetUpBin {
name: String,
path: PathBuf,
#[source]
#[inherit]
source: Box<Self>,
},
#[cfg(unix)]
#[class(inherit)]
#[error("Setting permissions on '{path}'")]
RemoveBinSymlink {
path: PathBuf,
#[source]
#[inherit]
source: std::io::Error,
},
#[class(inherit)]
#[error(transparent)]
Io(#[from] std::io::Error),
}
impl<'a> BinEntries<'a> {
pub fn new() -> Self {
Self::default()
@@ -92,15 +131,15 @@ impl<'a> BinEntries<'a> {
mut already_seen: impl FnMut(
&Path,
&str, // bin script
) -> Result<(), AnyError>,
) -> Result<(), BinEntriesError>,
mut new: impl FnMut(
&NpmResolutionPackage,
&Path,
&str, // bin name
&str, // bin script
) -> Result<(), AnyError>,
) -> Result<(), BinEntriesError>,
mut filter: impl FnMut(&NpmResolutionPackage) -> bool,
) -> Result<(), AnyError> {
) -> Result<(), BinEntriesError> {
if !self.collisions.is_empty() && !self.sorted {
// walking the dependency tree to find out the depth of each package
// is sort of expensive, so we only do it if there's a collision
@@ -168,11 +207,14 @@ impl<'a> BinEntries<'a> {
bin_node_modules_dir_path: &Path,
filter: impl FnMut(&NpmResolutionPackage) -> bool,
mut handler: impl FnMut(&EntrySetupOutcome<'_>),
) -> Result<(), AnyError> {
) -> Result<(), BinEntriesError> {
if !self.entries.is_empty() && !bin_node_modules_dir_path.exists() {
std::fs::create_dir_all(bin_node_modules_dir_path).with_context(
|| format!("Creating '{}'", bin_node_modules_dir_path.display()),
)?;
std::fs::create_dir_all(bin_node_modules_dir_path).map_err(|source| {
BinEntriesError::Creating {
path: bin_node_modules_dir_path.to_path_buf(),
source,
}
})?;
}
self.for_each_entry(
@@ -209,7 +251,7 @@ impl<'a> BinEntries<'a> {
snapshot: &NpmResolutionSnapshot,
bin_node_modules_dir_path: &Path,
handler: impl FnMut(&EntrySetupOutcome<'_>),
) -> Result<(), AnyError> {
) -> Result<(), BinEntriesError> {
self.set_up_entries_filtered(
snapshot,
bin_node_modules_dir_path,
@@ -226,7 +268,7 @@ impl<'a> BinEntries<'a> {
bin_node_modules_dir_path: &Path,
handler: impl FnMut(&EntrySetupOutcome<'_>),
only: &HashSet<&NpmPackageId>,
) -> Result<(), AnyError> {
) -> Result<(), BinEntriesError> {
self.set_up_entries_filtered(
snapshot,
bin_node_modules_dir_path,
@@ -301,7 +343,7 @@ pub fn set_up_bin_entry<'a>(
#[allow(unused_variables)] bin_script: &str,
#[allow(unused_variables)] package_path: &'a Path,
bin_node_modules_dir_path: &Path,
) -> Result<EntrySetupOutcome<'a>, AnyError> {
) -> Result<EntrySetupOutcome<'a>, BinEntriesError> {
#[cfg(windows)]
{
set_up_bin_shim(package, bin_name, bin_node_modules_dir_path)?;
@@ -324,14 +366,16 @@ fn set_up_bin_shim(
package: &NpmResolutionPackage,
bin_name: &str,
bin_node_modules_dir_path: &Path,
) -> Result<(), AnyError> {
) -> Result<(), BinEntriesError> {
use std::fs;
let mut cmd_shim = bin_node_modules_dir_path.join(bin_name);
cmd_shim.set_extension("cmd");
let shim = format!("@deno run -A npm:{}/{bin_name} %*", package.id.nv);
fs::write(&cmd_shim, shim).with_context(|| {
format!("Can't set up '{}' bin at {}", bin_name, cmd_shim.display())
fs::write(&cmd_shim, shim).map_err(|err| BinEntriesError::SetUpBin {
name: bin_name.to_string(),
path: cmd_shim.clone(),
source: Box::new(err.into()),
})?;
Ok(())
@@ -340,7 +384,7 @@ fn set_up_bin_shim(
#[cfg(unix)]
/// Make the file at `path` executable if it exists.
/// Returns `true` if the file exists, `false` otherwise.
fn make_executable_if_exists(path: &Path) -> Result<bool, AnyError> {
fn make_executable_if_exists(path: &Path) -> Result<bool, BinEntriesError> {
use std::io;
use std::os::unix::fs::PermissionsExt;
let mut perms = match std::fs::metadata(path) {
@@ -355,8 +399,11 @@ fn make_executable_if_exists(path: &Path) -> Result<bool, AnyError> {
if perms.mode() & 0o111 == 0 {
// if the original file is not executable, make it executable
perms.set_mode(perms.mode() | 0o111);
std::fs::set_permissions(path, perms).with_context(|| {
format!("Setting permissions on '{}'", path.display())
std::fs::set_permissions(path, perms).map_err(|source| {
BinEntriesError::Permissions {
path: path.to_path_buf(),
source,
}
})?;
}
@@ -395,14 +442,18 @@ fn symlink_bin_entry<'a>(
bin_script: &str,
package_path: &'a Path,
bin_node_modules_dir_path: &Path,
) -> Result<EntrySetupOutcome<'a>, AnyError> {
) -> Result<EntrySetupOutcome<'a>, BinEntriesError> {
use std::io;
use std::os::unix::fs::symlink;
let link = bin_node_modules_dir_path.join(bin_name);
let original = package_path.join(bin_script);
let found = make_executable_if_exists(&original).with_context(|| {
format!("Can't set up '{}' bin at {}", bin_name, original.display())
let found = make_executable_if_exists(&original).map_err(|source| {
BinEntriesError::SetUpBin {
name: bin_name.to_string(),
path: original.to_path_buf(),
source: Box::new(source),
}
})?;
if !found {
return Ok(EntrySetupOutcome::MissingEntrypoint {
@@ -420,27 +471,25 @@ fn symlink_bin_entry<'a>(
if let Err(err) = symlink(&original_relative, &link) {
if err.kind() == io::ErrorKind::AlreadyExists {
// remove and retry
std::fs::remove_file(&link).with_context(|| {
format!(
"Failed to remove existing bin symlink at {}",
link.display()
)
std::fs::remove_file(&link).map_err(|source| {
BinEntriesError::RemoveBinSymlink {
path: link.clone(),
source,
}
})?;
symlink(&original_relative, &link).with_context(|| {
format!(
"Can't set up '{}' bin at {}",
bin_name,
original_relative.display()
)
symlink(&original_relative, &link).map_err(|source| {
BinEntriesError::SetUpBin {
name: bin_name.to_string(),
path: original_relative.to_path_buf(),
source: Box::new(source.into()),
}
})?;
return Ok(EntrySetupOutcome::Success);
}
return Err(err).with_context(|| {
format!(
"Can't set up '{}' bin at {}",
bin_name,
original_relative.display()
)
return Err(BinEntriesError::SetUpBin {
name: bin_name.to_string(),
path: original_relative.to_path_buf(),
source: Box::new(err.into()),
});
}
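`symlink_bin_entry` above recovers from `ErrorKind::AlreadyExists` by deleting the stale entry and retrying exactly once. The same remove-and-retry shape, sketched std-only with a regular file standing in for the Unix symlink (helper name hypothetical):

```rust
use std::io::{self, Write};
use std::path::Path;

// Create `path` with `contents`; if a stale entry is already there,
// remove it and retry once, mirroring the bin-symlink setup logic.
fn create_or_replace(path: &Path, contents: &str) -> io::Result<()> {
    let create = || {
        std::fs::OpenOptions::new()
            .write(true)
            .create_new(true) // fails with AlreadyExists instead of clobbering
            .open(path)
    };
    let mut file = match create() {
        Ok(f) => f,
        Err(e) if e.kind() == io::ErrorKind::AlreadyExists => {
            std::fs::remove_file(path)?; // remove stale entry and retry
            create()?
        }
        Err(e) => return Err(e),
    };
    file.write_all(contents.as_bytes())
}
```

As in the diff, any error from the retry itself is surfaced to the caller rather than looping again.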


@@ -6,7 +6,6 @@ use std::path::Path;
use std::path::PathBuf;
use std::rc::Rc;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_npm::NpmResolutionPackage;
@@ -29,7 +28,7 @@ pub trait LifecycleScriptsStrategy {
fn warn_on_scripts_not_run(
&self,
packages: &[(&NpmResolutionPackage, PathBuf)],
) -> Result<(), AnyError>;
) -> Result<(), std::io::Error>;
fn has_warned(&self, package: &NpmResolutionPackage) -> bool;
@@ -38,7 +37,7 @@ pub trait LifecycleScriptsStrategy {
fn did_run_scripts(
&self,
package: &NpmResolutionPackage,
) -> Result<(), AnyError>;
) -> Result<(), std::io::Error>;
}
pub struct LifecycleScripts<'a> {
@@ -84,6 +83,27 @@ fn is_broken_default_install_script(script: &str, package_path: &Path) -> bool {
script == "node-gyp rebuild" && !package_path.join("binding.gyp").exists()
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum LifecycleScriptsError {
#[class(inherit)]
#[error(transparent)]
Io(#[from] std::io::Error),
#[class(inherit)]
#[error(transparent)]
BinEntries(#[from] super::bin_entries::BinEntriesError),
#[class(inherit)]
#[error(
"failed to create npm process state tempfile for running lifecycle scripts"
)]
CreateNpmProcessState(#[source] std::io::Error),
#[class(generic)]
#[error(transparent)]
Task(AnyError),
#[class(generic)]
#[error("failed to run scripts for packages: {}", .0.join(", "))]
RunScripts(Vec<String>),
}
impl<'a> LifecycleScripts<'a> {
pub fn can_run_scripts(&self, package_nv: &PackageNv) -> bool {
if !self.strategy.can_run_scripts() {
@@ -141,7 +161,7 @@ impl<'a> LifecycleScripts<'a> {
}
}
pub fn warn_not_run_scripts(&self) -> Result<(), AnyError> {
pub fn warn_not_run_scripts(&self) -> Result<(), std::io::Error> {
if !self.packages_with_scripts_not_run.is_empty() {
self
.strategy
@@ -156,7 +176,7 @@ impl<'a> LifecycleScripts<'a> {
packages: &[NpmResolutionPackage],
root_node_modules_dir_path: &Path,
progress_bar: &ProgressBar,
) -> Result<(), AnyError> {
) -> Result<(), LifecycleScriptsError> {
let kill_signal = KillSignal::default();
let _drop_signal = kill_signal.clone().drop_guard();
// we don't run with signals forwarded because once signals
@@ -179,7 +199,7 @@ impl<'a> LifecycleScripts<'a> {
root_node_modules_dir_path: &Path,
progress_bar: &ProgressBar,
kill_signal: KillSignal,
) -> Result<(), AnyError> {
) -> Result<(), LifecycleScriptsError> {
self.warn_not_run_scripts()?;
let get_package_path =
|p: &NpmResolutionPackage| self.strategy.package_path(p);
@@ -198,9 +218,9 @@ impl<'a> LifecycleScripts<'a> {
snapshot,
packages,
get_package_path,
)?;
);
let init_cwd = &self.config.initial_cwd;
let process_state = crate::npm::managed::npm_process_state(
let process_state = deno_lib::npm::npm_process_state(
snapshot.as_valid_serialized(),
Some(root_node_modules_dir_path),
);
@@ -220,14 +240,15 @@ impl<'a> LifecycleScripts<'a> {
// However, if we concurrently run scripts in the future we will
// have to have multiple temp files.
let temp_file_fd =
deno_runtime::ops::process::npm_process_state_tempfile(
deno_runtime::deno_process::npm_process_state_tempfile(
process_state.as_bytes(),
).context("failed to create npm process state tempfile for running lifecycle scripts")?;
)
.map_err(LifecycleScriptsError::CreateNpmProcessState)?;
// SAFETY: fd/handle is valid
let _temp_file =
unsafe { std::fs::File::from_raw_io_handle(temp_file_fd) }; // make sure the file gets closed
env_vars.insert(
deno_runtime::ops::process::NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME
deno_runtime::deno_process::NPM_RESOLUTION_STATE_FD_ENV_VAR_NAME
.to_string(),
(temp_file_fd as usize).to_string(),
);
@@ -240,7 +261,7 @@ impl<'a> LifecycleScripts<'a> {
package,
snapshot,
get_package_path,
)?;
);
for script_name in ["preinstall", "install", "postinstall"] {
if let Some(script) = package.scripts.get(script_name) {
if script_name == "install"
@@ -273,7 +294,8 @@ impl<'a> LifecycleScripts<'a> {
kill_signal: kill_signal.clone(),
},
)
.await?;
.await
.map_err(LifecycleScriptsError::Task)?;
let stdout = stdout.unwrap();
let stderr = stderr.unwrap();
if exit_code != 0 {
@@ -322,14 +344,12 @@ impl<'a> LifecycleScripts<'a> {
if failed_packages.is_empty() {
Ok(())
} else {
Err(AnyError::msg(format!(
"failed to run scripts for packages: {}",
Err(LifecycleScriptsError::RunScripts(
failed_packages
.iter()
.map(|p| p.to_string())
.collect::<Vec<_>>()
.join(", ")
)))
.collect::<Vec<_>>(),
))
}
}
}
@@ -349,7 +369,7 @@ fn resolve_baseline_custom_commands<'a>(
snapshot: &'a NpmResolutionSnapshot,
packages: &'a [NpmResolutionPackage],
get_package_path: impl Fn(&NpmResolutionPackage) -> PathBuf,
) -> Result<crate::task_runner::TaskCustomCommands, AnyError> {
) -> crate::task_runner::TaskCustomCommands {
let mut custom_commands = crate::task_runner::TaskCustomCommands::new();
custom_commands
.insert("npx".to_string(), Rc::new(crate::task_runner::NpxCommand));
@@ -390,7 +410,7 @@ fn resolve_custom_commands_from_packages<
snapshot: &'a NpmResolutionSnapshot,
packages: P,
get_package_path: impl Fn(&'a NpmResolutionPackage) -> PathBuf,
) -> Result<crate::task_runner::TaskCustomCommands, AnyError> {
) -> crate::task_runner::TaskCustomCommands {
for package in packages {
let package_path = get_package_path(package);
@@ -409,7 +429,7 @@ fn resolve_custom_commands_from_packages<
);
}
Ok(commands)
commands
}
// resolves the custom commands from the dependencies of a package
@@ -420,7 +440,7 @@ fn resolve_custom_commands_from_deps(
package: &NpmResolutionPackage,
snapshot: &NpmResolutionSnapshot,
get_package_path: impl Fn(&NpmResolutionPackage) -> PathBuf,
) -> Result<crate::task_runner::TaskCustomCommands, AnyError> {
) -> crate::task_runner::TaskCustomCommands {
let mut bin_entries = BinEntries::new();
resolve_custom_commands_from_packages(
&mut bin_entries,


@@ -0,0 +1,18 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use async_trait::async_trait;
use deno_error::JsErrorBox;
use super::PackageCaching;
pub mod bin_entries;
pub mod lifecycle_scripts;
/// Part of the resolution that interacts with the file system.
#[async_trait(?Send)]
pub trait NpmPackageFsInstaller: std::fmt::Debug + Send + Sync {
async fn cache_packages<'a>(
&self,
caching: PackageCaching<'a>,
) -> Result<(), JsErrorBox>;
}


@@ -1,151 +1,61 @@
// Copyright 2018-2025 the Deno authors. MIT license.
//! Code for global npm cache resolution.
use std::borrow::Cow;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use async_trait::async_trait;
use deno_ast::ModuleSpecifier;
use deno_core::error::AnyError;
use deno_core::futures::stream::FuturesUnordered;
use deno_core::futures::StreamExt;
use deno_npm::NpmPackageCacheFolderId;
use deno_npm::NpmPackageId;
use deno_error::JsErrorBox;
use deno_lib::util::hash::FastInsecureHasher;
use deno_npm::NpmResolutionPackage;
use deno_npm::NpmSystemInfo;
use node_resolver::errors::PackageFolderResolveError;
use node_resolver::errors::PackageNotFoundError;
use node_resolver::errors::ReferrerNotFoundError;
use deno_resolver::npm::managed::NpmResolutionCell;
use super::super::resolution::NpmResolution;
use super::common::lifecycle_scripts::LifecycleScriptsStrategy;
use super::common::NpmPackageFsResolver;
use super::common::NpmPackageFsInstaller;
use super::PackageCaching;
use crate::args::LifecycleScriptsConfig;
use crate::cache::FastInsecureHasher;
use crate::colors;
use crate::npm::managed::PackageCaching;
use crate::npm::CliNpmCache;
use crate::npm::CliNpmTarballCache;
/// Resolves packages from the global npm cache.
#[derive(Debug)]
pub struct GlobalNpmPackageResolver {
pub struct GlobalNpmPackageInstaller {
cache: Arc<CliNpmCache>,
tarball_cache: Arc<CliNpmTarballCache>,
resolution: Arc<NpmResolution>,
system_info: NpmSystemInfo,
resolution: Arc<NpmResolutionCell>,
lifecycle_scripts: LifecycleScriptsConfig,
system_info: NpmSystemInfo,
}
impl GlobalNpmPackageResolver {
impl GlobalNpmPackageInstaller {
pub fn new(
cache: Arc<CliNpmCache>,
tarball_cache: Arc<CliNpmTarballCache>,
resolution: Arc<NpmResolution>,
system_info: NpmSystemInfo,
resolution: Arc<NpmResolutionCell>,
lifecycle_scripts: LifecycleScriptsConfig,
system_info: NpmSystemInfo,
) -> Self {
Self {
cache,
tarball_cache,
resolution,
system_info,
lifecycle_scripts,
system_info,
}
}
}
#[async_trait(?Send)]
impl NpmPackageFsResolver for GlobalNpmPackageResolver {
fn node_modules_path(&self) -> Option<&Path> {
None
}
fn maybe_package_folder(&self, id: &NpmPackageId) -> Option<PathBuf> {
let folder_id = self
.resolution
.resolve_pkg_cache_folder_id_from_pkg_id(id)?;
Some(self.cache.package_folder_for_id(&folder_id))
}
fn resolve_package_folder_from_package(
&self,
name: &str,
referrer: &ModuleSpecifier,
) -> Result<PathBuf, PackageFolderResolveError> {
use deno_npm::resolution::PackageNotFoundFromReferrerError;
let Some(referrer_cache_folder_id) = self
.cache
.resolve_package_folder_id_from_specifier(referrer)
else {
return Err(
ReferrerNotFoundError {
referrer: referrer.clone(),
referrer_extra: None,
}
.into(),
);
};
let resolve_result = self
.resolution
.resolve_package_from_package(name, &referrer_cache_folder_id);
match resolve_result {
Ok(pkg) => match self.maybe_package_folder(&pkg.id) {
Some(folder) => Ok(folder),
None => Err(
PackageNotFoundError {
package_name: name.to_string(),
referrer: referrer.clone(),
referrer_extra: Some(format!(
"{} -> {}",
referrer_cache_folder_id,
pkg.id.as_serialized()
)),
}
.into(),
),
},
Err(err) => match *err {
PackageNotFoundFromReferrerError::Referrer(cache_folder_id) => Err(
ReferrerNotFoundError {
referrer: referrer.clone(),
referrer_extra: Some(cache_folder_id.to_string()),
}
.into(),
),
PackageNotFoundFromReferrerError::Package {
name,
referrer: cache_folder_id_referrer,
} => Err(
PackageNotFoundError {
package_name: name,
referrer: referrer.clone(),
referrer_extra: Some(cache_folder_id_referrer.to_string()),
}
.into(),
),
},
}
}
fn resolve_package_cache_folder_id_from_specifier(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<NpmPackageCacheFolderId>, AnyError> {
Ok(
self
.cache
.resolve_package_folder_id_from_specifier(specifier),
)
}
impl NpmPackageFsInstaller for GlobalNpmPackageInstaller {
async fn cache_packages<'a>(
&self,
caching: PackageCaching<'a>,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
let package_partitions = match caching {
PackageCaching::All => self
.resolution
@@ -155,13 +65,16 @@ impl NpmPackageFsResolver for GlobalNpmPackageResolver {
.subset(&reqs)
.all_system_packages_partitioned(&self.system_info),
};
cache_packages(&package_partitions.packages, &self.tarball_cache).await?;
cache_packages(&package_partitions.packages, &self.tarball_cache)
.await
.map_err(JsErrorBox::from_err)?;
// create the copy package folders
for copy in package_partitions.copy_packages {
self
.cache
.ensure_copy_package(&copy.get_package_cache_folder_id())?;
.ensure_copy_package(&copy.get_package_cache_folder_id())
.map_err(JsErrorBox::from_err)?;
}
let mut lifecycle_scripts =
@@ -174,7 +87,9 @@ impl NpmPackageFsResolver for GlobalNpmPackageResolver {
lifecycle_scripts.add(package, Cow::Borrowed(&package_folder));
}
lifecycle_scripts.warn_not_run_scripts()?;
lifecycle_scripts
.warn_not_run_scripts()
.map_err(JsErrorBox::from_err)?;
Ok(())
}
@@ -183,7 +98,7 @@ impl NpmPackageFsResolver for GlobalNpmPackageResolver {
async fn cache_packages(
packages: &[NpmResolutionPackage],
tarball_cache: &Arc<CliNpmTarballCache>,
) -> Result<(), AnyError> {
) -> Result<(), deno_npm_cache::EnsurePackageError> {
let mut futures_unordered = FuturesUnordered::new();
for package in packages {
futures_unordered.push(async move {
@@ -200,17 +115,17 @@ async fn cache_packages(
}
struct GlobalLifecycleScripts<'a> {
resolver: &'a GlobalNpmPackageResolver,
installer: &'a GlobalNpmPackageInstaller,
path_hash: u64,
}
impl<'a> GlobalLifecycleScripts<'a> {
fn new(resolver: &'a GlobalNpmPackageResolver, root_dir: &Path) -> Self {
fn new(installer: &'a GlobalNpmPackageInstaller, root_dir: &Path) -> Self {
let mut hasher = FastInsecureHasher::new_without_deno_version();
hasher.write(root_dir.to_string_lossy().as_bytes());
let path_hash = hasher.finish();
Self {
resolver,
installer,
path_hash,
}
}
@@ -229,13 +144,13 @@ impl<'a> super::common::lifecycle_scripts::LifecycleScriptsStrategy
false
}
fn package_path(&self, package: &NpmResolutionPackage) -> PathBuf {
self.resolver.cache.package_folder_for_nv(&package.id.nv)
self.installer.cache.package_folder_for_nv(&package.id.nv)
}
fn warn_on_scripts_not_run(
&self,
packages: &[(&NpmResolutionPackage, PathBuf)],
) -> std::result::Result<(), deno_core::anyhow::Error> {
) -> std::result::Result<(), std::io::Error> {
log::warn!("{} The following packages contained npm lifecycle scripts ({}) that were not executed:", colors::yellow("Warning"), colors::gray("preinstall/install/postinstall"));
for (package, _) in packages {
log::warn!("┠─ {}", colors::gray(format!("npm:{}", package.id.nv)));
@@ -261,7 +176,7 @@ impl<'a> super::common::lifecycle_scripts::LifecycleScriptsStrategy
fn did_run_scripts(
&self,
_package: &NpmResolutionPackage,
) -> std::result::Result<(), deno_core::anyhow::Error> {
) -> Result<(), std::io::Error> {
Ok(())
}


@@ -2,7 +2,6 @@
//! Code for local node_modules resolution.
use std::borrow::Cow;
use std::cell::RefCell;
use std::cmp::Ordering;
use std::collections::hash_map::Entry;
@@ -17,40 +16,28 @@ use std::rc::Rc;
use std::sync::Arc;
use async_trait::async_trait;
use deno_ast::ModuleSpecifier;
use deno_cache_dir::npm::mixed_case_package_name_decode;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::futures::stream::FuturesUnordered;
use deno_core::futures::StreamExt;
use deno_core::parking_lot::Mutex;
use deno_core::url::Url;
use deno_error::JsErrorBox;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_npm::NpmPackageCacheFolderId;
use deno_npm::NpmPackageId;
use deno_npm::NpmResolutionPackage;
use deno_npm::NpmSystemInfo;
use deno_path_util::fs::atomic_write_file_with_retries;
use deno_path_util::fs::canonicalize_path_maybe_not_exists;
use deno_resolver::npm::normalize_pkg_name_for_node_modules_deno_folder;
use deno_resolver::npm::get_package_folder_id_folder_name;
use deno_resolver::npm::managed::NpmResolutionCell;
use deno_semver::package::PackageNv;
use deno_semver::StackString;
use node_resolver::errors::PackageFolderResolveError;
use node_resolver::errors::PackageFolderResolveIoError;
use node_resolver::errors::PackageNotFoundError;
use node_resolver::errors::ReferrerNotFoundError;
use serde::Deserialize;
use serde::Serialize;
use sys_traits::FsMetadata;
use super::super::resolution::NpmResolution;
use super::common::bin_entries;
use super::common::NpmPackageFsResolver;
use super::common::NpmPackageFsInstaller;
use super::PackageCaching;
use crate::args::LifecycleScriptsConfig;
use crate::args::NpmInstallDepsProvider;
use crate::cache::CACHE_PERM;
use crate::colors;
use crate::npm::managed::PackageCaching;
use crate::npm::CliNpmCache;
use crate::npm::CliNpmTarballCache;
use crate::sys::CliSys;
@@ -63,31 +50,30 @@ use crate::util::progress_bar::ProgressMessagePrompt;
/// Resolver that creates a local node_modules directory
/// and resolves packages from it.
#[derive(Debug)]
pub struct LocalNpmPackageResolver {
pub struct LocalNpmPackageInstaller {
cache: Arc<CliNpmCache>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
progress_bar: ProgressBar,
resolution: Arc<NpmResolution>,
resolution: Arc<NpmResolutionCell>,
sys: CliSys,
tarball_cache: Arc<CliNpmTarballCache>,
root_node_modules_path: PathBuf,
root_node_modules_url: Url,
system_info: NpmSystemInfo,
lifecycle_scripts: LifecycleScriptsConfig,
root_node_modules_path: PathBuf,
system_info: NpmSystemInfo,
}
impl LocalNpmPackageResolver {
impl LocalNpmPackageInstaller {
#[allow(clippy::too_many_arguments)]
pub fn new(
cache: Arc<CliNpmCache>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
progress_bar: ProgressBar,
resolution: Arc<NpmResolution>,
resolution: Arc<NpmResolutionCell>,
sys: CliSys,
tarball_cache: Arc<CliNpmTarballCache>,
node_modules_folder: PathBuf,
system_info: NpmSystemInfo,
lifecycle_scripts: LifecycleScriptsConfig,
system_info: NpmSystemInfo,
) -> Self {
Self {
cache,
@@ -96,162 +82,19 @@ impl LocalNpmPackageResolver {
resolution,
tarball_cache,
sys,
root_node_modules_url: Url::from_directory_path(&node_modules_folder)
.unwrap(),
lifecycle_scripts,
root_node_modules_path: node_modules_folder,
system_info,
lifecycle_scripts,
}
}
fn resolve_package_root(&self, path: &Path) -> PathBuf {
let mut last_found = path;
loop {
let parent = last_found.parent().unwrap();
if parent.file_name().unwrap() == "node_modules" {
return last_found.to_path_buf();
} else {
last_found = parent;
}
}
}
fn resolve_folder_for_specifier(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<PathBuf>, std::io::Error> {
let Some(relative_url) =
self.root_node_modules_url.make_relative(specifier)
else {
return Ok(None);
};
if relative_url.starts_with("../") {
return Ok(None);
}
// it's within the directory, so use it
let Some(path) = specifier.to_file_path().ok() else {
return Ok(None);
};
// Canonicalize the path so it's not pointing to the symlinked directory
// in `node_modules` directory of the referrer.
canonicalize_path_maybe_not_exists(&self.sys, &path).map(Some)
}
fn resolve_package_folder_from_specifier(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<PathBuf>, AnyError> {
let Some(local_path) = self.resolve_folder_for_specifier(specifier)? else {
return Ok(None);
};
let package_root_path = self.resolve_package_root(&local_path);
Ok(Some(package_root_path))
}
}
#[async_trait(?Send)]
impl NpmPackageFsResolver for LocalNpmPackageResolver {
fn node_modules_path(&self) -> Option<&Path> {
Some(self.root_node_modules_path.as_ref())
}
fn maybe_package_folder(&self, id: &NpmPackageId) -> Option<PathBuf> {
let cache_folder_id = self
.resolution
.resolve_pkg_cache_folder_id_from_pkg_id(id)?;
// package is stored at:
// node_modules/.deno/<package_cache_folder_id_folder_name>/node_modules/<package_name>
Some(
self
.root_node_modules_path
.join(".deno")
.join(get_package_folder_id_folder_name(&cache_folder_id))
.join("node_modules")
.join(&cache_folder_id.nv.name),
)
}
fn resolve_package_folder_from_package(
&self,
name: &str,
referrer: &ModuleSpecifier,
) -> Result<PathBuf, PackageFolderResolveError> {
let maybe_local_path = self
.resolve_folder_for_specifier(referrer)
.map_err(|err| PackageFolderResolveIoError {
package_name: name.to_string(),
referrer: referrer.clone(),
source: err,
})?;
let Some(local_path) = maybe_local_path else {
return Err(
ReferrerNotFoundError {
referrer: referrer.clone(),
referrer_extra: None,
}
.into(),
);
};
let package_root_path = self.resolve_package_root(&local_path);
let mut current_folder = package_root_path.as_path();
while let Some(parent_folder) = current_folder.parent() {
current_folder = parent_folder;
let node_modules_folder = if current_folder.ends_with("node_modules") {
Cow::Borrowed(current_folder)
} else {
Cow::Owned(current_folder.join("node_modules"))
};
let sub_dir = join_package_name(&node_modules_folder, name);
if self.sys.fs_is_dir_no_err(&sub_dir) {
return Ok(sub_dir);
}
if current_folder == self.root_node_modules_path {
break;
}
}
Err(
PackageNotFoundError {
package_name: name.to_string(),
referrer: referrer.clone(),
referrer_extra: None,
}
.into(),
)
}
fn resolve_package_cache_folder_id_from_specifier(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<NpmPackageCacheFolderId>, AnyError> {
let Some(folder_path) =
self.resolve_package_folder_from_specifier(specifier)?
else {
return Ok(None);
};
// ex. project/node_modules/.deno/preact@10.24.3/node_modules/preact/
let Some(node_modules_ancestor) = folder_path
.ancestors()
.find(|ancestor| ancestor.ends_with("node_modules"))
else {
return Ok(None);
};
let Some(folder_name) =
node_modules_ancestor.parent().and_then(|p| p.file_name())
else {
return Ok(None);
};
Ok(get_package_folder_id_from_folder_name(
&folder_name.to_string_lossy(),
))
}
impl NpmPackageFsInstaller for LocalNpmPackageInstaller {
async fn cache_packages<'a>(
&self,
caching: PackageCaching<'a>,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
let snapshot = match caching {
PackageCaching::All => self.resolution.snapshot(),
PackageCaching::Only(reqs) => self.resolution.subset(&reqs),
@@ -263,10 +106,12 @@ impl NpmPackageFsResolver for LocalNpmPackageResolver {
&self.progress_bar,
&self.tarball_cache,
&self.root_node_modules_path,
&self.sys,
&self.system_info,
&self.lifecycle_scripts,
)
.await
.map_err(JsErrorBox::from_err)
}
}
@@ -285,6 +130,38 @@ fn local_node_modules_package_contents_path(
.join(&package.id.nv.name)
}
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum SyncResolutionWithFsError {
#[class(inherit)]
#[error("Creating '{path}'")]
Creating {
path: PathBuf,
#[source]
#[inherit]
source: std::io::Error,
},
#[class(inherit)]
#[error(transparent)]
CopyDirRecursive(#[from] crate::util::fs::CopyDirRecursiveError),
#[class(inherit)]
#[error(transparent)]
SymlinkPackageDir(#[from] SymlinkPackageDirError),
#[class(inherit)]
#[error(transparent)]
BinEntries(#[from] bin_entries::BinEntriesError),
#[class(inherit)]
#[error(transparent)]
LifecycleScripts(
#[from] super::common::lifecycle_scripts::LifecycleScriptsError,
),
#[class(inherit)]
#[error(transparent)]
Io(#[from] std::io::Error),
#[class(inherit)]
#[error(transparent)]
Other(#[from] JsErrorBox),
}
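The `Creating { path, source }` variants above replace the old `with_context` calls while keeping the underlying `io::Error` reachable through `source()`. A std-only sketch of the pattern, including walking the cause chain the way an error reporter would (the struct, path, and message below are illustrative, not Deno's API):

```rust
use std::error::Error;
use std::fmt;
use std::path::PathBuf;

// Sketch of the `Creating { path, source }` variant pattern: the message
// names the path being created, while the io::Error stays reachable via source().
#[derive(Debug)]
struct Creating {
    path: PathBuf,
    source: std::io::Error,
}

impl fmt::Display for Creating {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "Creating '{}'", self.path.display())
    }
}

impl Error for Creating {
    fn source(&self) -> Option<&(dyn Error + 'static)> {
        Some(&self.source)
    }
}

// Render an error plus its causes, separated by ": ".
fn chain(err: &dyn Error) -> String {
    let mut out = err.to_string();
    let mut cur = err.source();
    while let Some(e) = cur {
        out.push_str(&format!(": {}", e));
        cur = e.source();
    }
    out
}

fn main() {
    let err = Creating {
        path: PathBuf::from("node_modules/.deno"),
        source: std::io::Error::new(std::io::ErrorKind::PermissionDenied, "access denied"),
    };
    println!("{}", chain(&err));
}
```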
/// Creates a pnpm style folder structure.
#[allow(clippy::too_many_arguments)]
async fn sync_resolution_with_fs(
@@ -294,9 +171,10 @@ async fn sync_resolution_with_fs(
progress_bar: &ProgressBar,
tarball_cache: &Arc<CliNpmTarballCache>,
root_node_modules_dir_path: &Path,
sys: &CliSys,
system_info: &NpmSystemInfo,
lifecycle_scripts: &LifecycleScriptsConfig,
) -> Result<(), AnyError> {
) -> Result<(), SyncResolutionWithFsError> {
if snapshot.is_empty()
&& npm_install_deps_provider.workspace_pkgs().is_empty()
{
@@ -311,12 +189,18 @@ async fn sync_resolution_with_fs(
let deno_local_registry_dir = root_node_modules_dir_path.join(".deno");
let deno_node_modules_dir = deno_local_registry_dir.join("node_modules");
fs::create_dir_all(&deno_node_modules_dir).with_context(|| {
format!("Creating '{}'", deno_local_registry_dir.display())
fs::create_dir_all(&deno_node_modules_dir).map_err(|source| {
SyncResolutionWithFsError::Creating {
path: deno_local_registry_dir.to_path_buf(),
source,
}
})?;
let bin_node_modules_dir_path = root_node_modules_dir_path.join(".bin");
fs::create_dir_all(&bin_node_modules_dir_path).with_context(|| {
format!("Creating '{}'", bin_node_modules_dir_path.display())
fs::create_dir_all(&bin_node_modules_dir_path).map_err(|source| {
SyncResolutionWithFsError::Creating {
path: bin_node_modules_dir_path.to_path_buf(),
source,
}
})?;
let single_process_lock = LaxSingleProcessFsFlag::lock(
@@ -420,7 +304,8 @@ async fn sync_resolution_with_fs(
cache_futures.push(async move {
tarball_cache
.ensure_package(&package.id.nv, &package.dist)
.await?;
.await
.map_err(JsErrorBox::from_err)?;
let pb_guard = progress_bar.update_with_prompt(
ProgressMessagePrompt::Initialize,
&package.id.nv.to_string(),
@@ -432,19 +317,18 @@ async fn sync_resolution_with_fs(
deno_core::unsync::spawn_blocking({
let package_path = package_path.clone();
let sys = sys.clone();
move || {
clone_dir_recursive(
&crate::sys::CliSys::default(),
&cache_folder,
&package_path,
)?;
clone_dir_recursive(&sys, &cache_folder, &package_path)?;
// write out a file that indicates this folder has been initialized
fs::write(initialized_file, tags)?;
Ok::<_, AnyError>(())
Ok::<_, SyncResolutionWithFsError>(())
}
})
.await??;
.await
.map_err(JsErrorBox::from_err)?
.map_err(JsErrorBox::from_err)?;
if package.bin.is_some() {
bin_entries_to_setup.borrow_mut().add(package, package_path);
@@ -458,7 +342,7 @@ async fn sync_resolution_with_fs(
// finally stop showing the progress bar
drop(pb_guard); // explicit for clarity
Ok::<_, AnyError>(())
Ok::<_, JsErrorBox>(())
});
} else if matches!(package_state, PackageFolderState::TagsOutdated) {
fs::write(initialized_file, tags)?;
@@ -494,11 +378,7 @@ async fn sync_resolution_with_fs(
&package.id.nv.name,
);
clone_dir_recursive(
&crate::sys::CliSys::default(),
&source_path,
&package_path,
)?;
clone_dir_recursive(sys, &source_path, &package_path)?;
// write out a file that indicates this folder has been initialized
fs::write(initialized_file, "")?;
}
@@ -597,8 +477,11 @@ async fn sync_resolution_with_fs(
// symlink the dep into the package's child node_modules folder
let dest_node_modules = remote.base_dir.join("node_modules");
if !existing_child_node_modules_dirs.contains(&dest_node_modules) {
fs::create_dir_all(&dest_node_modules).with_context(|| {
format!("Creating '{}'", dest_node_modules.display())
fs::create_dir_all(&dest_node_modules).map_err(|source| {
SyncResolutionWithFsError::Creating {
path: dest_node_modules.clone(),
source,
}
})?;
existing_child_node_modules_dirs.insert(dest_node_modules.clone());
}
@@ -813,7 +696,7 @@ impl<'a> super::common::lifecycle_scripts::LifecycleScriptsStrategy
fn did_run_scripts(
&self,
package: &NpmResolutionPackage,
) -> std::result::Result<(), deno_core::anyhow::Error> {
) -> std::result::Result<(), std::io::Error> {
std::fs::write(self.ran_scripts_file(package), "")?;
Ok(())
}
@@ -821,7 +704,7 @@ impl<'a> super::common::lifecycle_scripts::LifecycleScriptsStrategy
fn warn_on_scripts_not_run(
&self,
packages: &[(&NpmResolutionPackage, std::path::PathBuf)],
) -> Result<(), AnyError> {
) -> Result<(), std::io::Error> {
if !packages.is_empty() {
log::warn!("{} The following packages contained npm lifecycle scripts ({}) that were not executed:", colors::yellow("Warning"), colors::gray("preinstall/install/postinstall"));
@@ -1004,52 +887,42 @@ impl SetupCache {
}
}
fn get_package_folder_id_folder_name(
folder_id: &NpmPackageCacheFolderId,
) -> String {
let copy_str = if folder_id.copy_index == 0 {
Cow::Borrowed("")
} else {
Cow::Owned(format!("_{}", folder_id.copy_index))
};
let nv = &folder_id.nv;
let name = normalize_pkg_name_for_node_modules_deno_folder(&nv.name);
format!("{}@{}{}", name, nv.version, copy_str)
}
fn get_package_folder_id_from_folder_name(
folder_name: &str,
) -> Option<NpmPackageCacheFolderId> {
let folder_name = folder_name.replace('+', "/");
let (name, ending) = folder_name.rsplit_once('@')?;
let name: StackString = if let Some(encoded_name) = name.strip_prefix('_') {
StackString::from_string(mixed_case_package_name_decode(encoded_name)?)
} else {
name.into()
};
let (raw_version, copy_index) = match ending.split_once('_') {
Some((raw_version, copy_index)) => {
let copy_index = copy_index.parse::<u8>().ok()?;
(raw_version, copy_index)
}
None => (ending, 0),
};
let version = deno_semver::Version::parse_from_npm(raw_version).ok()?;
Some(NpmPackageCacheFolderId {
nv: PackageNv { name, version },
copy_index,
})
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum SymlinkPackageDirError {
#[class(inherit)]
#[error("Creating '{parent}'")]
Creating {
parent: PathBuf,
#[source]
#[inherit]
source: std::io::Error,
},
#[class(inherit)]
#[error(transparent)]
Other(#[from] std::io::Error),
#[cfg(windows)]
#[class(inherit)]
#[error("Creating junction in node_modules folder")]
FailedCreatingJunction {
#[source]
#[inherit]
source: std::io::Error,
},
}
fn symlink_package_dir(
old_path: &Path,
new_path: &Path,
) -> Result<(), AnyError> {
) -> Result<(), SymlinkPackageDirError> {
let new_parent = new_path.parent().unwrap();
if new_parent.file_name().unwrap() != "node_modules" {
// create the parent folder that will contain the symlink
fs::create_dir_all(new_parent)
.with_context(|| format!("Creating '{}'", new_parent.display()))?;
fs::create_dir_all(new_parent).map_err(|source| {
SymlinkPackageDirError::Creating {
parent: new_parent.to_path_buf(),
source,
}
})?;
}
// need to delete the previous symlink before creating a new one
@@ -1075,7 +948,7 @@ fn junction_or_symlink_dir(
old_path_relative: &Path,
old_path: &Path,
new_path: &Path,
) -> Result<(), AnyError> {
) -> Result<(), SymlinkPackageDirError> {
static USE_JUNCTIONS: std::sync::atomic::AtomicBool =
std::sync::atomic::AtomicBool::new(false);
@@ -1084,8 +957,9 @@ fn junction_or_symlink_dir(
// needing to elevate privileges on Windows.
// Note: junctions don't support relative paths, so we need to use the
// absolute path here.
return junction::create(old_path, new_path)
.context("Failed creating junction in node_modules folder");
return junction::create(old_path, new_path).map_err(|source| {
SymlinkPackageDirError::FailedCreatingJunction { source }
});
}
match symlink_dir(&crate::sys::CliSys::default(), old_path_relative, new_path)
@@ -1095,8 +969,9 @@ fn junction_or_symlink_dir(
if symlink_err.kind() == std::io::ErrorKind::PermissionDenied =>
{
USE_JUNCTIONS.store(true, std::sync::atomic::Ordering::Relaxed);
junction::create(old_path, new_path)
.context("Failed creating junction in node_modules folder")
junction::create(old_path, new_path).map_err(|source| {
SymlinkPackageDirError::FailedCreatingJunction { source }
})
}
Err(symlink_err) => {
log::warn!(
@@ -1104,8 +979,9 @@ fn junction_or_symlink_dir(
colors::yellow("Warning")
);
USE_JUNCTIONS.store(true, std::sync::atomic::Ordering::Relaxed);
junction::create(old_path, new_path)
.context("Failed creating junction in node_modules folder")
junction::create(old_path, new_path).map_err(|source| {
SymlinkPackageDirError::FailedCreatingJunction { source }
})
}
}
}
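`junction_or_symlink_dir` keeps a process-wide one-way switch: after the first `PermissionDenied` from symlinking, every later call goes straight to junctions. A std-only sketch of that fallback pattern, with stand-in `primary`/`fallback` functions rather than real symlink or junction calls (all names below are illustrative):

```rust
use std::io;
use std::sync::atomic::{AtomicBool, Ordering};

// One-way fallback flag, mirroring the USE_JUNCTIONS static in the diff:
// once the preferred strategy fails with PermissionDenied, every later
// call uses the fallback directly without retrying the primary.
static USE_FALLBACK: AtomicBool = AtomicBool::new(false);

fn primary(allowed: bool) -> io::Result<&'static str> {
    if allowed {
        Ok("symlink")
    } else {
        Err(io::Error::new(io::ErrorKind::PermissionDenied, "denied"))
    }
}

fn fallback() -> io::Result<&'static str> {
    Ok("junction")
}

fn link(allowed: bool) -> io::Result<&'static str> {
    if USE_FALLBACK.load(Ordering::Relaxed) {
        return fallback();
    }
    match primary(allowed) {
        Ok(v) => Ok(v),
        Err(e) if e.kind() == io::ErrorKind::PermissionDenied => {
            USE_FALLBACK.store(true, Ordering::Relaxed);
            fallback()
        }
        Err(e) => Err(e),
    }
}

fn main() {
    assert_eq!(link(true).unwrap(), "symlink"); // primary succeeds
    assert_eq!(link(false).unwrap(), "junction"); // flips the flag
    assert_eq!(link(true).unwrap(), "junction"); // stays on the fallback
    println!("ok");
}
```

`Ordering::Relaxed` is enough here because the flag is an independent boolean: no other memory accesses need to be ordered relative to it.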
@@ -1121,37 +997,10 @@ fn join_package_name(path: &Path, package_name: &str) -> PathBuf {
#[cfg(test)]
mod test {
use deno_npm::NpmPackageCacheFolderId;
use deno_semver::package::PackageNv;
use test_util::TempDir;
use super::*;
#[test]
fn test_get_package_folder_id_folder_name() {
let cases = vec![
(
NpmPackageCacheFolderId {
nv: PackageNv::from_str("@types/foo@1.2.3").unwrap(),
copy_index: 1,
},
"@types+foo@1.2.3_1".to_string(),
),
(
NpmPackageCacheFolderId {
nv: PackageNv::from_str("JSON@3.2.1").unwrap(),
copy_index: 0,
},
"_jjju6tq@3.2.1".to_string(),
),
];
for (input, output) in cases {
assert_eq!(get_package_folder_id_folder_name(&input), output);
let folder_id = get_package_folder_id_from_folder_name(&output).unwrap();
assert_eq!(folder_id, input);
}
}
#[test]
fn test_setup_cache() {
let temp_dir = TempDir::new();

cli/npm/installer/mod.rs (new file)

@@ -0,0 +1,283 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::path::PathBuf;
use std::sync::Arc;
use deno_core::error::AnyError;
use deno_core::unsync::sync::AtomicFlag;
use deno_error::JsErrorBox;
use deno_npm::registry::NpmPackageInfo;
use deno_npm::registry::NpmRegistryPackageInfoLoadError;
use deno_npm::NpmSystemInfo;
use deno_resolver::npm::managed::NpmResolutionCell;
use deno_runtime::colors;
use deno_semver::package::PackageReq;
pub use self::common::NpmPackageFsInstaller;
use self::global::GlobalNpmPackageInstaller;
use self::local::LocalNpmPackageInstaller;
pub use self::resolution::AddPkgReqsResult;
pub use self::resolution::NpmResolutionInstaller;
use super::NpmResolutionInitializer;
use crate::args::CliLockfile;
use crate::args::LifecycleScriptsConfig;
use crate::args::NpmInstallDepsProvider;
use crate::args::PackageJsonDepValueParseWithLocationError;
use crate::npm::CliNpmCache;
use crate::npm::CliNpmTarballCache;
use crate::sys::CliSys;
use crate::util::progress_bar::ProgressBar;
mod common;
mod global;
mod local;
mod resolution;
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum PackageCaching<'a> {
Only(Cow<'a, [PackageReq]>),
All,
}
#[derive(Debug)]
pub struct NpmInstaller {
fs_installer: Arc<dyn NpmPackageFsInstaller>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
npm_resolution_initializer: Arc<NpmResolutionInitializer>,
npm_resolution_installer: Arc<NpmResolutionInstaller>,
maybe_lockfile: Option<Arc<CliLockfile>>,
npm_resolution: Arc<NpmResolutionCell>,
top_level_install_flag: AtomicFlag,
}
impl NpmInstaller {
#[allow(clippy::too_many_arguments)]
pub fn new(
npm_cache: Arc<CliNpmCache>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
npm_resolution: Arc<NpmResolutionCell>,
npm_resolution_initializer: Arc<NpmResolutionInitializer>,
npm_resolution_installer: Arc<NpmResolutionInstaller>,
progress_bar: &ProgressBar,
sys: CliSys,
tarball_cache: Arc<CliNpmTarballCache>,
maybe_lockfile: Option<Arc<CliLockfile>>,
maybe_node_modules_path: Option<PathBuf>,
lifecycle_scripts: LifecycleScriptsConfig,
system_info: NpmSystemInfo,
) -> Self {
let fs_installer: Arc<dyn NpmPackageFsInstaller> =
match maybe_node_modules_path {
Some(node_modules_folder) => Arc::new(LocalNpmPackageInstaller::new(
npm_cache,
npm_install_deps_provider.clone(),
progress_bar.clone(),
npm_resolution.clone(),
sys,
tarball_cache,
node_modules_folder,
lifecycle_scripts,
system_info,
)),
None => Arc::new(GlobalNpmPackageInstaller::new(
npm_cache,
tarball_cache,
npm_resolution.clone(),
lifecycle_scripts,
system_info,
)),
};
Self {
fs_installer,
npm_install_deps_provider,
npm_resolution,
npm_resolution_initializer,
npm_resolution_installer,
maybe_lockfile,
top_level_install_flag: Default::default(),
}
}
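The constructor above selects the filesystem strategy once, behind `Arc<dyn NpmPackageFsInstaller>`, based on whether a `node_modules` path is configured. A minimal synchronous stand-in for that dispatch (the real trait is async via `async_trait`; all names below are illustrative):

```rust
use std::path::PathBuf;
use std::sync::Arc;

// Synchronous stand-in for the NpmPackageFsInstaller trait-object dispatch.
trait FsInstaller {
    fn strategy(&self) -> &'static str;
}

struct LocalInstaller {
    #[allow(dead_code)]
    node_modules: PathBuf,
}
struct GlobalInstaller;

impl FsInstaller for LocalInstaller {
    fn strategy(&self) -> &'static str {
        "local"
    }
}
impl FsInstaller for GlobalInstaller {
    fn strategy(&self) -> &'static str {
        "global"
    }
}

// Mirrors the match on maybe_node_modules_path: a node_modules path selects
// the local (pnpm-style folder) strategy, otherwise the global cache is used.
fn pick(maybe_node_modules: Option<PathBuf>) -> Arc<dyn FsInstaller> {
    match maybe_node_modules {
        Some(node_modules) => Arc::new(LocalInstaller { node_modules }),
        None => Arc::new(GlobalInstaller),
    }
}

fn main() {
    assert_eq!(pick(Some(PathBuf::from("node_modules"))).strategy(), "local");
    assert_eq!(pick(None).strategy(), "global");
    println!("ok");
}
```

Deciding the strategy once at construction keeps every later `cache_packages` call free of per-call branching.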
/// Adds package requirements to the resolver and ensures everything is setup.
/// This includes setting up the `node_modules` directory, if applicable.
pub async fn add_and_cache_package_reqs(
&self,
packages: &[PackageReq],
) -> Result<(), JsErrorBox> {
self.npm_resolution_initializer.ensure_initialized().await?;
self
.add_package_reqs_raw(
packages,
Some(PackageCaching::Only(packages.into())),
)
.await
.dependencies_result
}
pub async fn add_package_reqs_no_cache(
&self,
packages: &[PackageReq],
) -> Result<(), JsErrorBox> {
self.npm_resolution_initializer.ensure_initialized().await?;
self
.add_package_reqs_raw(packages, None)
.await
.dependencies_result
}
pub async fn add_package_reqs(
&self,
packages: &[PackageReq],
caching: PackageCaching<'_>,
) -> Result<(), JsErrorBox> {
self
.add_package_reqs_raw(packages, Some(caching))
.await
.dependencies_result
}
pub async fn add_package_reqs_raw<'a>(
&self,
packages: &[PackageReq],
caching: Option<PackageCaching<'a>>,
) -> AddPkgReqsResult {
if packages.is_empty() {
return AddPkgReqsResult {
dependencies_result: Ok(()),
results: vec![],
};
}
#[cfg(debug_assertions)]
self.npm_resolution_initializer.debug_assert_initialized();
let mut result = self
.npm_resolution_installer
.add_package_reqs(packages)
.await;
if result.dependencies_result.is_ok() {
if let Some(lockfile) = self.maybe_lockfile.as_ref() {
result.dependencies_result = lockfile.error_if_changed();
}
}
if result.dependencies_result.is_ok() {
if let Some(caching) = caching {
result.dependencies_result = self.cache_packages(caching).await;
}
}
result
}
/// Sets package requirements to the resolver, removing old requirements and adding new ones.
///
/// This will retrieve and resolve package information, but not cache any package files.
pub async fn set_package_reqs(
&self,
packages: &[PackageReq],
) -> Result<(), AnyError> {
self
.npm_resolution_installer
.set_package_reqs(packages)
.await
}
pub async fn inject_synthetic_types_node_package(
&self,
) -> Result<(), JsErrorBox> {
self.npm_resolution_initializer.ensure_initialized().await?;
let reqs = &[PackageReq::from_str("@types/node").unwrap()];
// add and ensure this isn't added to the lockfile
self
.add_package_reqs(reqs, PackageCaching::Only(reqs.into()))
.await?;
Ok(())
}
pub async fn cache_package_info(
&self,
package_name: &str,
) -> Result<Arc<NpmPackageInfo>, NpmRegistryPackageInfoLoadError> {
self
.npm_resolution_installer
.cache_package_info(package_name)
.await
}
pub async fn cache_packages(
&self,
caching: PackageCaching<'_>,
) -> Result<(), JsErrorBox> {
self.npm_resolution_initializer.ensure_initialized().await?;
self.fs_installer.cache_packages(caching).await
}
pub fn ensure_no_pkg_json_dep_errors(
&self,
) -> Result<(), Box<PackageJsonDepValueParseWithLocationError>> {
for err in self.npm_install_deps_provider.pkg_json_dep_errors() {
match err.source.as_kind() {
deno_package_json::PackageJsonDepValueParseErrorKind::VersionReq(_) => {
return Err(Box::new(err.clone()));
}
deno_package_json::PackageJsonDepValueParseErrorKind::Unsupported {
..
} => {
// only warn for this one
log::warn!(
"{} {}\n at {}",
colors::yellow("Warning"),
err.source,
err.location,
)
}
}
}
Ok(())
}
/// Ensures that the top level `package.json` dependencies are installed.
/// This may set up the `node_modules` directory.
///
/// Returns `true` if the top level packages are already installed. A
/// return value of `false` means that new packages were added to the NPM resolution.
pub async fn ensure_top_level_package_json_install(
&self,
) -> Result<bool, JsErrorBox> {
if !self.top_level_install_flag.raise() {
return Ok(true); // already did this
}
self.npm_resolution_initializer.ensure_initialized().await?;
let pkg_json_remote_pkgs = self.npm_install_deps_provider.remote_pkgs();
if pkg_json_remote_pkgs.is_empty() {
return Ok(true);
}
// check if something needs resolving before bothering to load all
// the package information (which is slow)
if pkg_json_remote_pkgs.iter().all(|pkg| {
self
.npm_resolution
.resolve_pkg_id_from_pkg_req(&pkg.req)
.is_ok()
}) {
log::debug!(
"All package.json deps resolvable. Skipping top level install."
);
return Ok(true); // everything is already resolvable
}
let pkg_reqs = pkg_json_remote_pkgs
.iter()
.map(|pkg| pkg.req.clone())
.collect::<Vec<_>>();
self.add_package_reqs_no_cache(&pkg_reqs).await?;
Ok(false)
}
}
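The `top_level_install_flag.raise()` call above is what makes `ensure_top_level_package_json_install` a run-once operation. A minimal sketch of that guard pattern, assuming an `AtomicFlag` built on `AtomicBool` (the real `crate::util::sync::AtomicFlag` may differ in detail):

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Illustrative stand-in for `crate::util::sync::AtomicFlag`:
// the first caller to `raise()` gets `true` and performs the install;
// every later caller gets `false` and returns early.
#[derive(Default)]
struct AtomicFlag(AtomicBool);

impl AtomicFlag {
    /// Returns `true` only for the first caller to raise the flag.
    fn raise(&self) -> bool {
        // `swap` returns the previous value, so the first call sees `false`.
        !self.0.swap(true, Ordering::SeqCst)
    }
}

fn main() {
    let flag = AtomicFlag::default();
    assert!(flag.raise()); // first caller does the work
    assert!(!flag.raise()); // subsequent callers skip
    println!("ok");
}
```

Because the flag is raised before any awaiting happens, concurrent callers that lose the race simply report "already installed" rather than duplicating the install.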


@@ -1,27 +1,21 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::collections::HashMap;
use std::collections::HashSet;
use std::sync::Arc;
use capacity_builder::StringBuilder;
use deno_core::error::AnyError;
use deno_error::JsErrorBox;
use deno_lockfile::NpmPackageDependencyLockfileInfo;
use deno_lockfile::NpmPackageLockfileInfo;
use deno_npm::registry::NpmPackageInfo;
use deno_npm::registry::NpmRegistryApi;
use deno_npm::registry::NpmRegistryPackageInfoLoadError;
use deno_npm::resolution::AddPkgReqsOptions;
use deno_npm::resolution::NpmPackagesPartitioned;
use deno_npm::resolution::NpmResolutionError;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_npm::resolution::PackageCacheFolderIdNotFoundError;
use deno_npm::resolution::PackageNotFoundFromReferrerError;
use deno_npm::resolution::PackageNvNotFoundError;
use deno_npm::resolution::PackageReqNotFoundError;
use deno_npm::resolution::ValidSerializedNpmResolutionSnapshot;
use deno_npm::NpmPackageCacheFolderId;
use deno_npm::NpmPackageId;
use deno_npm::NpmResolutionPackage;
use deno_npm::NpmSystemInfo;
use deno_resolver::npm::managed::NpmResolutionCell;
use deno_semver::jsr::JsrDepPackageReq;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
@@ -30,7 +24,7 @@ use deno_semver::VersionReq;
use crate::args::CliLockfile;
use crate::npm::CliNpmRegistryInfoProvider;
use crate::util::sync::SyncReadAsyncWriteLock;
use crate::util::sync::TaskQueue;
pub struct AddPkgReqsResult {
/// Results from adding the individual packages.
@@ -39,63 +33,51 @@ pub struct AddPkgReqsResult {
/// package requirements.
pub results: Vec<Result<PackageNv, NpmResolutionError>>,
/// The final result of resolving and caching all the package requirements.
pub dependencies_result: Result<(), AnyError>,
pub dependencies_result: Result<(), JsErrorBox>,
}
/// Handles updating and storing npm resolution in memory where the underlying
/// snapshot can be updated concurrently. Additionally handles updating the lockfile
/// based on changes to the resolution.
///
/// This does not interact with the file system.
pub struct NpmResolution {
/// Updates the npm resolution with the provided package requirements.
#[derive(Debug)]
pub struct NpmResolutionInstaller {
registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
snapshot: SyncReadAsyncWriteLock<NpmResolutionSnapshot>,
resolution: Arc<NpmResolutionCell>,
maybe_lockfile: Option<Arc<CliLockfile>>,
update_queue: TaskQueue,
}
impl std::fmt::Debug for NpmResolution {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let snapshot = self.snapshot.read();
f.debug_struct("NpmResolution")
.field("snapshot", &snapshot.as_valid_serialized().as_serialized())
.finish()
}
}
impl NpmResolution {
pub fn from_serialized(
registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
initial_snapshot: Option<ValidSerializedNpmResolutionSnapshot>,
maybe_lockfile: Option<Arc<CliLockfile>>,
) -> Self {
let snapshot =
NpmResolutionSnapshot::new(initial_snapshot.unwrap_or_default());
Self::new(registry_info_provider, snapshot, maybe_lockfile)
}
impl NpmResolutionInstaller {
pub fn new(
registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
initial_snapshot: NpmResolutionSnapshot,
resolution: Arc<NpmResolutionCell>,
maybe_lockfile: Option<Arc<CliLockfile>>,
) -> Self {
Self {
registry_info_provider,
snapshot: SyncReadAsyncWriteLock::new(initial_snapshot),
resolution,
maybe_lockfile,
update_queue: Default::default(),
}
}
pub async fn cache_package_info(
&self,
package_name: &str,
) -> Result<Arc<NpmPackageInfo>, NpmRegistryPackageInfoLoadError> {
// this will internally cache the package information
self.registry_info_provider.package_info(package_name).await
}
pub async fn add_package_reqs(
&self,
package_reqs: &[PackageReq],
) -> AddPkgReqsResult {
// only allow one thread in here at a time
let snapshot_lock = self.snapshot.acquire().await;
let _snapshot_lock = self.update_queue.acquire().await;
let result = add_package_reqs_to_snapshot(
&self.registry_info_provider,
package_reqs,
self.maybe_lockfile.clone(),
|| snapshot_lock.read().clone(),
|| self.resolution.snapshot(),
)
.await;
@@ -103,10 +85,10 @@ impl NpmResolution {
results: result.results,
dependencies_result: match result.dep_graph_result {
Ok(snapshot) => {
*snapshot_lock.write() = snapshot;
self.resolution.set_snapshot(snapshot);
Ok(())
}
Err(err) => Err(err.into()),
Err(err) => Err(JsErrorBox::from_err(err)),
},
}
}
@@ -116,7 +98,7 @@ impl NpmResolution {
package_reqs: &[PackageReq],
) -> Result<(), AnyError> {
// only allow one thread in here at a time
let snapshot_lock = self.snapshot.acquire().await;
let _snapshot_lock = self.update_queue.acquire().await;
let reqs_set = package_reqs.iter().collect::<HashSet<_>>();
let snapshot = add_package_reqs_to_snapshot(
@@ -124,7 +106,7 @@ impl NpmResolution {
package_reqs,
self.maybe_lockfile.clone(),
|| {
let snapshot = snapshot_lock.read().clone();
let snapshot = self.resolution.snapshot();
let has_removed_package = !snapshot
.package_reqs()
.keys()
@@ -140,127 +122,10 @@ impl NpmResolution {
.await
.into_result()?;
*snapshot_lock.write() = snapshot;
self.resolution.set_snapshot(snapshot);
Ok(())
}
pub fn resolve_pkg_cache_folder_id_from_pkg_id(
&self,
id: &NpmPackageId,
) -> Option<NpmPackageCacheFolderId> {
self
.snapshot
.read()
.package_from_id(id)
.map(|p| p.get_package_cache_folder_id())
}
pub fn resolve_pkg_id_from_pkg_cache_folder_id(
&self,
id: &NpmPackageCacheFolderId,
) -> Result<NpmPackageId, PackageCacheFolderIdNotFoundError> {
self
.snapshot
.read()
.resolve_pkg_from_pkg_cache_folder_id(id)
.map(|pkg| pkg.id.clone())
}
pub fn resolve_package_from_package(
&self,
name: &str,
referrer: &NpmPackageCacheFolderId,
) -> Result<NpmResolutionPackage, Box<PackageNotFoundFromReferrerError>> {
self
.snapshot
.read()
.resolve_package_from_package(name, referrer)
.cloned()
}
/// Resolve a node package from a deno module.
pub fn resolve_pkg_id_from_pkg_req(
&self,
req: &PackageReq,
) -> Result<NpmPackageId, PackageReqNotFoundError> {
self
.snapshot
.read()
.resolve_pkg_from_pkg_req(req)
.map(|pkg| pkg.id.clone())
}
pub fn resolve_pkg_reqs_from_pkg_id(
&self,
id: &NpmPackageId,
) -> Vec<PackageReq> {
let snapshot = self.snapshot.read();
let mut pkg_reqs = snapshot
.package_reqs()
.iter()
.filter(|(_, nv)| *nv == &id.nv)
.map(|(req, _)| req.clone())
.collect::<Vec<_>>();
pkg_reqs.sort(); // be deterministic
pkg_reqs
}
pub fn resolve_pkg_id_from_deno_module(
&self,
id: &PackageNv,
) -> Result<NpmPackageId, PackageNvNotFoundError> {
self
.snapshot
.read()
.resolve_package_from_deno_module(id)
.map(|pkg| pkg.id.clone())
}
pub fn package_reqs(&self) -> HashMap<PackageReq, PackageNv> {
self.snapshot.read().package_reqs().clone()
}
pub fn all_system_packages(
&self,
system_info: &NpmSystemInfo,
) -> Vec<NpmResolutionPackage> {
self.snapshot.read().all_system_packages(system_info)
}
pub fn all_system_packages_partitioned(
&self,
system_info: &NpmSystemInfo,
) -> NpmPackagesPartitioned {
self
.snapshot
.read()
.all_system_packages_partitioned(system_info)
}
pub fn snapshot(&self) -> NpmResolutionSnapshot {
self.snapshot.read().clone()
}
pub fn serialized_valid_snapshot(
&self,
) -> ValidSerializedNpmResolutionSnapshot {
self.snapshot.read().as_valid_serialized()
}
pub fn serialized_valid_snapshot_for_system(
&self,
system_info: &NpmSystemInfo,
) -> ValidSerializedNpmResolutionSnapshot {
self
.snapshot
.read()
.as_valid_serialized_for_system(system_info)
}
pub fn subset(&self, package_reqs: &[PackageReq]) -> NpmResolutionSnapshot {
self.snapshot.read().subset(package_reqs)
}
}
async fn add_package_reqs_to_snapshot(
@@ -333,6 +198,25 @@ fn populate_lockfile_from_snapshot(
lockfile: &CliLockfile,
snapshot: &NpmResolutionSnapshot,
) {
fn npm_package_to_lockfile_info(
pkg: &NpmResolutionPackage,
) -> NpmPackageLockfileInfo {
let dependencies = pkg
.dependencies
.iter()
.map(|(name, id)| NpmPackageDependencyLockfileInfo {
name: name.clone(),
id: id.as_serialized(),
})
.collect();
NpmPackageLockfileInfo {
serialized_id: pkg.id.as_serialized(),
integrity: pkg.dist.integrity().for_lockfile(),
dependencies,
}
}
let mut lockfile = lockfile.lock();
for (package_req, nv) in snapshot.package_reqs() {
let id = &snapshot.resolve_package_from_deno_module(nv).unwrap().id;
@@ -351,22 +235,3 @@ fn populate_lockfile_from_snapshot(
lockfile.insert_npm_package(npm_package_to_lockfile_info(package));
}
}
fn npm_package_to_lockfile_info(
pkg: &NpmResolutionPackage,
) -> NpmPackageLockfileInfo {
let dependencies = pkg
.dependencies
.iter()
.map(|(name, id)| NpmPackageDependencyLockfileInfo {
name: name.clone(),
id: id.as_serialized(),
})
.collect();
NpmPackageLockfileInfo {
serialized_id: pkg.id.as_serialized(),
integrity: pkg.dist.integrity().for_lockfile(),
dependencies,
}
}

cli/npm/managed.rs (new file, 203 lines)

@@ -0,0 +1,203 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::path::PathBuf;
use std::sync::Arc;
use deno_core::parking_lot::Mutex;
use deno_error::JsError;
use deno_error::JsErrorBox;
use deno_npm::registry::NpmRegistryApi;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_npm::resolution::ValidSerializedNpmResolutionSnapshot;
use deno_resolver::npm::managed::ManagedNpmResolverCreateOptions;
use deno_resolver::npm::managed::NpmResolutionCell;
use thiserror::Error;
use super::CliNpmRegistryInfoProvider;
use crate::args::CliLockfile;
use crate::sys::CliSys;
pub type CliManagedNpmResolverCreateOptions =
ManagedNpmResolverCreateOptions<CliSys>;
#[derive(Debug, Clone)]
pub enum CliNpmResolverManagedSnapshotOption {
ResolveFromLockfile(Arc<CliLockfile>),
Specified(Option<ValidSerializedNpmResolutionSnapshot>),
}
#[derive(Debug)]
enum SyncState {
Pending(Option<CliNpmResolverManagedSnapshotOption>),
Err(ResolveSnapshotError),
Success,
}
#[derive(Debug)]
pub struct NpmResolutionInitializer {
npm_registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
npm_resolution: Arc<NpmResolutionCell>,
queue: tokio::sync::Mutex<()>,
sync_state: Mutex<SyncState>,
}
impl NpmResolutionInitializer {
pub fn new(
npm_registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
npm_resolution: Arc<NpmResolutionCell>,
snapshot_option: CliNpmResolverManagedSnapshotOption,
) -> Self {
Self {
npm_registry_info_provider,
npm_resolution,
queue: tokio::sync::Mutex::new(()),
sync_state: Mutex::new(SyncState::Pending(Some(snapshot_option))),
}
}
#[cfg(debug_assertions)]
pub fn debug_assert_initialized(&self) {
if !matches!(*self.sync_state.lock(), SyncState::Success) {
panic!("debug assert: npm resolution must be initialized before calling this code");
}
}
pub async fn ensure_initialized(&self) -> Result<(), JsErrorBox> {
// fast exit if not pending
{
match &*self.sync_state.lock() {
SyncState::Pending(_) => {}
SyncState::Err(err) => return Err(JsErrorBox::from_err(err.clone())),
SyncState::Success => return Ok(()),
}
}
// only allow one task in here at a time
let _guard = self.queue.lock().await;
let snapshot_option = {
let mut sync_state = self.sync_state.lock();
match &mut *sync_state {
SyncState::Pending(snapshot_option) => {
// this should never panic, but if it does it means that a
// previous future was dropped while initialization occurred...
// that should never happen because this is initialized during
// startup
snapshot_option.take().unwrap()
}
// another thread updated the state while we were waiting
SyncState::Err(resolve_snapshot_error) => {
return Err(JsErrorBox::from_err(resolve_snapshot_error.clone()));
}
SyncState::Success => {
return Ok(());
}
}
};
match resolve_snapshot(&self.npm_registry_info_provider, snapshot_option)
.await
{
Ok(maybe_snapshot) => {
if let Some(snapshot) = maybe_snapshot {
self
.npm_resolution
.set_snapshot(NpmResolutionSnapshot::new(snapshot));
}
let mut sync_state = self.sync_state.lock();
*sync_state = SyncState::Success;
Ok(())
}
Err(err) => {
let mut sync_state = self.sync_state.lock();
*sync_state = SyncState::Err(err.clone());
Err(JsErrorBox::from_err(err))
}
}
}
}
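`ensure_initialized` above combines a fast-path check, a single-flight queue, and a cached terminal state (`Success` or `Err`). A simplified, synchronous sketch of that state machine, using std-only types and an illustrative `u32` payload in place of the snapshot option (names are assumptions, not the real API):

```rust
use std::sync::Mutex;

// Pending(input) -> Ready or Failed; the input is consumed exactly once.
#[derive(Debug)]
enum InitState {
    Pending(Option<u32>), // payload stands in for the snapshot option
    Failed(String),
    Ready,
}

struct Initializer {
    state: Mutex<InitState>,
}

impl Initializer {
    fn new(input: u32) -> Self {
        Self {
            state: Mutex::new(InitState::Pending(Some(input))),
        }
    }

    fn ensure_initialized(&self) -> Result<(), String> {
        // Fast exit if a previous call already resolved the state,
        // otherwise take the one-shot input out of `Pending`.
        let input = {
            let mut state = self.state.lock().unwrap();
            match &mut *state {
                InitState::Pending(input) => {
                    input.take().expect("init input consumed twice")
                }
                InitState::Failed(err) => return Err(err.clone()),
                InitState::Ready => return Ok(()),
            }
        };
        // ... the expensive resolution work happens here ...
        let result: Result<(), String> =
            if input == 0 { Err("bad input".into()) } else { Ok(()) };
        // Record the terminal state so later callers take the fast path.
        let mut state = self.state.lock().unwrap();
        match &result {
            Ok(()) => *state = InitState::Ready,
            Err(err) => *state = InitState::Failed(err.clone()),
        }
        result
    }
}

fn main() {
    let init = Initializer::new(42);
    assert!(init.ensure_initialized().is_ok());
    assert!(init.ensure_initialized().is_ok()); // second call is a fast no-op
    println!("ok");
}
```

The real implementation additionally holds a `tokio::sync::Mutex` queue across the await so only one task performs the resolution at a time; the sketch omits that because it is synchronous.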
#[derive(Debug, Error, Clone, JsError)]
#[error("failed reading lockfile '{}'", lockfile_path.display())]
#[class(inherit)]
pub struct ResolveSnapshotError {
lockfile_path: PathBuf,
#[inherit]
#[source]
source: SnapshotFromLockfileError,
}
impl ResolveSnapshotError {
pub fn maybe_integrity_check_error(
&self,
) -> Option<&deno_npm::resolution::IntegrityCheckFailedError> {
match &self.source {
SnapshotFromLockfileError::SnapshotFromLockfile(
deno_npm::resolution::SnapshotFromLockfileError::IntegrityCheckFailed(
err,
),
) => Some(err),
_ => None,
}
}
}
async fn resolve_snapshot(
registry_info_provider: &Arc<CliNpmRegistryInfoProvider>,
snapshot: CliNpmResolverManagedSnapshotOption,
) -> Result<Option<ValidSerializedNpmResolutionSnapshot>, ResolveSnapshotError>
{
match snapshot {
CliNpmResolverManagedSnapshotOption::ResolveFromLockfile(lockfile) => {
if !lockfile.overwrite() {
let snapshot = snapshot_from_lockfile(
lockfile.clone(),
&registry_info_provider.as_npm_registry_api(),
)
.await
.map_err(|source| ResolveSnapshotError {
lockfile_path: lockfile.filename.clone(),
source,
})?;
Ok(Some(snapshot))
} else {
Ok(None)
}
}
CliNpmResolverManagedSnapshotOption::Specified(snapshot) => Ok(snapshot),
}
}
#[derive(Debug, Error, Clone, JsError)]
pub enum SnapshotFromLockfileError {
#[error(transparent)]
#[class(inherit)]
IncompleteError(
#[from] deno_npm::resolution::IncompleteSnapshotFromLockfileError,
),
#[error(transparent)]
#[class(inherit)]
SnapshotFromLockfile(#[from] deno_npm::resolution::SnapshotFromLockfileError),
}
async fn snapshot_from_lockfile(
lockfile: Arc<CliLockfile>,
api: &dyn NpmRegistryApi,
) -> Result<ValidSerializedNpmResolutionSnapshot, SnapshotFromLockfileError> {
let (incomplete_snapshot, skip_integrity_check) = {
let lock = lockfile.lock();
(
deno_npm::resolution::incomplete_snapshot_from_lockfile(&lock)?,
lock.overwrite,
)
};
let snapshot = deno_npm::resolution::snapshot_from_lockfile(
deno_npm::resolution::SnapshotFromLockfileParams {
incomplete_snapshot,
api,
skip_integrity_check,
},
)
.await?;
Ok(snapshot)
}


@@ -1,776 +0,0 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use deno_ast::ModuleSpecifier;
use deno_cache_dir::npm::NpmCacheDir;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::serde_json;
use deno_core::url::Url;
use deno_npm::npm_rc::ResolvedNpmRc;
use deno_npm::registry::NpmPackageInfo;
use deno_npm::registry::NpmRegistryApi;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_npm::resolution::PackageReqNotFoundError;
use deno_npm::resolution::ValidSerializedNpmResolutionSnapshot;
use deno_npm::NpmPackageId;
use deno_npm::NpmResolutionPackage;
use deno_npm::NpmSystemInfo;
use deno_npm_cache::NpmCacheSetting;
use deno_path_util::fs::canonicalize_path_maybe_not_exists;
use deno_resolver::npm::CliNpmReqResolver;
use deno_runtime::colors;
use deno_runtime::ops::process::NpmProcessStateProvider;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use node_resolver::errors::PackageFolderResolveError;
use node_resolver::errors::PackageFolderResolveIoError;
use node_resolver::InNpmPackageChecker;
use node_resolver::NpmPackageFolderResolver;
use resolution::AddPkgReqsResult;
use self::resolution::NpmResolution;
use self::resolvers::create_npm_fs_resolver;
use self::resolvers::NpmPackageFsResolver;
use super::CliNpmCache;
use super::CliNpmCacheHttpClient;
use super::CliNpmRegistryInfoProvider;
use super::CliNpmResolver;
use super::CliNpmTarballCache;
use super::InnerCliNpmResolverRef;
use super::ResolvePkgFolderFromDenoReqError;
use crate::args::CliLockfile;
use crate::args::LifecycleScriptsConfig;
use crate::args::NpmInstallDepsProvider;
use crate::args::NpmProcessState;
use crate::args::NpmProcessStateKind;
use crate::args::PackageJsonDepValueParseWithLocationError;
use crate::cache::FastInsecureHasher;
use crate::sys::CliSys;
use crate::util::progress_bar::ProgressBar;
use crate::util::sync::AtomicFlag;
mod resolution;
mod resolvers;
pub enum CliNpmResolverManagedSnapshotOption {
ResolveFromLockfile(Arc<CliLockfile>),
Specified(Option<ValidSerializedNpmResolutionSnapshot>),
}
pub struct CliManagedNpmResolverCreateOptions {
pub snapshot: CliNpmResolverManagedSnapshotOption,
pub maybe_lockfile: Option<Arc<CliLockfile>>,
pub http_client_provider: Arc<crate::http_util::HttpClientProvider>,
pub npm_cache_dir: Arc<NpmCacheDir>,
pub sys: CliSys,
pub cache_setting: deno_cache_dir::file_fetcher::CacheSetting,
pub text_only_progress_bar: crate::util::progress_bar::ProgressBar,
pub maybe_node_modules_path: Option<PathBuf>,
pub npm_system_info: NpmSystemInfo,
pub npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
pub npmrc: Arc<ResolvedNpmRc>,
pub lifecycle_scripts: LifecycleScriptsConfig,
}
pub async fn create_managed_npm_resolver_for_lsp(
options: CliManagedNpmResolverCreateOptions,
) -> Arc<dyn CliNpmResolver> {
let npm_cache = create_cache(&options);
let http_client = Arc::new(CliNpmCacheHttpClient::new(
options.http_client_provider.clone(),
options.text_only_progress_bar.clone(),
));
let npm_api = create_api(npm_cache.clone(), http_client.clone(), &options);
// spawn due to the lsp's `Send` requirement
deno_core::unsync::spawn(async move {
let snapshot = match resolve_snapshot(&npm_api, options.snapshot).await {
Ok(snapshot) => snapshot,
Err(err) => {
log::warn!("failed to resolve snapshot: {}", err);
None
}
};
create_inner(
http_client,
npm_cache,
options.npm_install_deps_provider,
npm_api,
options.sys,
options.text_only_progress_bar,
options.maybe_lockfile,
options.npmrc,
options.maybe_node_modules_path,
options.npm_system_info,
snapshot,
options.lifecycle_scripts,
)
})
.await
.unwrap()
}
pub async fn create_managed_npm_resolver(
options: CliManagedNpmResolverCreateOptions,
) -> Result<Arc<dyn CliNpmResolver>, AnyError> {
let npm_cache = create_cache(&options);
let http_client = Arc::new(CliNpmCacheHttpClient::new(
options.http_client_provider.clone(),
options.text_only_progress_bar.clone(),
));
let api = create_api(npm_cache.clone(), http_client.clone(), &options);
let snapshot = resolve_snapshot(&api, options.snapshot).await?;
Ok(create_inner(
http_client,
npm_cache,
options.npm_install_deps_provider,
api,
options.sys,
options.text_only_progress_bar,
options.maybe_lockfile,
options.npmrc,
options.maybe_node_modules_path,
options.npm_system_info,
snapshot,
options.lifecycle_scripts,
))
}
#[allow(clippy::too_many_arguments)]
fn create_inner(
http_client: Arc<CliNpmCacheHttpClient>,
npm_cache: Arc<CliNpmCache>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
sys: CliSys,
text_only_progress_bar: crate::util::progress_bar::ProgressBar,
maybe_lockfile: Option<Arc<CliLockfile>>,
npm_rc: Arc<ResolvedNpmRc>,
node_modules_dir_path: Option<PathBuf>,
npm_system_info: NpmSystemInfo,
snapshot: Option<ValidSerializedNpmResolutionSnapshot>,
lifecycle_scripts: LifecycleScriptsConfig,
) -> Arc<dyn CliNpmResolver> {
let resolution = Arc::new(NpmResolution::from_serialized(
registry_info_provider.clone(),
snapshot,
maybe_lockfile.clone(),
));
let tarball_cache = Arc::new(CliNpmTarballCache::new(
npm_cache.clone(),
http_client,
sys.clone(),
npm_rc.clone(),
));
let fs_resolver = create_npm_fs_resolver(
npm_cache.clone(),
&npm_install_deps_provider,
&text_only_progress_bar,
resolution.clone(),
sys.clone(),
tarball_cache.clone(),
node_modules_dir_path,
npm_system_info.clone(),
lifecycle_scripts.clone(),
);
Arc::new(ManagedCliNpmResolver::new(
fs_resolver,
maybe_lockfile,
registry_info_provider,
npm_cache,
npm_install_deps_provider,
resolution,
sys,
tarball_cache,
text_only_progress_bar,
npm_system_info,
lifecycle_scripts,
))
}
fn create_cache(
options: &CliManagedNpmResolverCreateOptions,
) -> Arc<CliNpmCache> {
Arc::new(CliNpmCache::new(
options.npm_cache_dir.clone(),
options.sys.clone(),
NpmCacheSetting::from_cache_setting(&options.cache_setting),
options.npmrc.clone(),
))
}
fn create_api(
cache: Arc<CliNpmCache>,
http_client: Arc<CliNpmCacheHttpClient>,
options: &CliManagedNpmResolverCreateOptions,
) -> Arc<CliNpmRegistryInfoProvider> {
Arc::new(CliNpmRegistryInfoProvider::new(
cache,
http_client,
options.npmrc.clone(),
))
}
async fn resolve_snapshot(
registry_info_provider: &Arc<CliNpmRegistryInfoProvider>,
snapshot: CliNpmResolverManagedSnapshotOption,
) -> Result<Option<ValidSerializedNpmResolutionSnapshot>, AnyError> {
match snapshot {
CliNpmResolverManagedSnapshotOption::ResolveFromLockfile(lockfile) => {
if !lockfile.overwrite() {
let snapshot = snapshot_from_lockfile(
lockfile.clone(),
&registry_info_provider.as_npm_registry_api(),
)
.await
.with_context(|| {
format!("failed reading lockfile '{}'", lockfile.filename.display())
})?;
Ok(Some(snapshot))
} else {
Ok(None)
}
}
CliNpmResolverManagedSnapshotOption::Specified(snapshot) => Ok(snapshot),
}
}
async fn snapshot_from_lockfile(
lockfile: Arc<CliLockfile>,
api: &dyn NpmRegistryApi,
) -> Result<ValidSerializedNpmResolutionSnapshot, AnyError> {
let (incomplete_snapshot, skip_integrity_check) = {
let lock = lockfile.lock();
(
deno_npm::resolution::incomplete_snapshot_from_lockfile(&lock)?,
lock.overwrite,
)
};
let snapshot = deno_npm::resolution::snapshot_from_lockfile(
deno_npm::resolution::SnapshotFromLockfileParams {
incomplete_snapshot,
api,
skip_integrity_check,
},
)
.await?;
Ok(snapshot)
}
#[derive(Debug)]
struct ManagedInNpmPackageChecker {
root_dir: Url,
}
impl InNpmPackageChecker for ManagedInNpmPackageChecker {
fn in_npm_package(&self, specifier: &Url) -> bool {
specifier.as_ref().starts_with(self.root_dir.as_str())
}
}
pub struct CliManagedInNpmPkgCheckerCreateOptions<'a> {
pub root_cache_dir_url: &'a Url,
pub maybe_node_modules_path: Option<&'a Path>,
}
pub fn create_managed_in_npm_pkg_checker(
options: CliManagedInNpmPkgCheckerCreateOptions,
) -> Arc<dyn InNpmPackageChecker> {
let root_dir = match options.maybe_node_modules_path {
Some(node_modules_folder) => {
deno_path_util::url_from_directory_path(node_modules_folder).unwrap()
}
None => options.root_cache_dir_url.clone(),
};
debug_assert!(root_dir.as_str().ends_with('/'));
Arc::new(ManagedInNpmPackageChecker { root_dir })
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum PackageCaching<'a> {
Only(Cow<'a, [PackageReq]>),
All,
}
/// An npm resolver where the resolution is managed by Deno rather than
/// the user bringing their own node_modules (BYONM) on the file system.
pub struct ManagedCliNpmResolver {
fs_resolver: Arc<dyn NpmPackageFsResolver>,
maybe_lockfile: Option<Arc<CliLockfile>>,
registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
npm_cache: Arc<CliNpmCache>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
sys: CliSys,
resolution: Arc<NpmResolution>,
tarball_cache: Arc<CliNpmTarballCache>,
text_only_progress_bar: ProgressBar,
npm_system_info: NpmSystemInfo,
top_level_install_flag: AtomicFlag,
lifecycle_scripts: LifecycleScriptsConfig,
}
impl std::fmt::Debug for ManagedCliNpmResolver {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("ManagedNpmResolver")
.field("<omitted>", &"<omitted>")
.finish()
}
}
impl ManagedCliNpmResolver {
#[allow(clippy::too_many_arguments)]
pub fn new(
fs_resolver: Arc<dyn NpmPackageFsResolver>,
maybe_lockfile: Option<Arc<CliLockfile>>,
registry_info_provider: Arc<CliNpmRegistryInfoProvider>,
npm_cache: Arc<CliNpmCache>,
npm_install_deps_provider: Arc<NpmInstallDepsProvider>,
resolution: Arc<NpmResolution>,
sys: CliSys,
tarball_cache: Arc<CliNpmTarballCache>,
text_only_progress_bar: ProgressBar,
npm_system_info: NpmSystemInfo,
lifecycle_scripts: LifecycleScriptsConfig,
) -> Self {
Self {
fs_resolver,
maybe_lockfile,
registry_info_provider,
npm_cache,
npm_install_deps_provider,
text_only_progress_bar,
resolution,
sys,
tarball_cache,
npm_system_info,
top_level_install_flag: Default::default(),
lifecycle_scripts,
}
}
pub fn resolve_pkg_folder_from_pkg_id(
&self,
pkg_id: &NpmPackageId,
) -> Result<PathBuf, AnyError> {
let path = self.fs_resolver.package_folder(pkg_id)?;
let path = canonicalize_path_maybe_not_exists(&self.sys, &path)?;
log::debug!(
"Resolved package folder of {} to {}",
pkg_id.as_serialized(),
path.display()
);
Ok(path)
}
/// Resolves the package id from the provided specifier.
pub fn resolve_pkg_id_from_specifier(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<NpmPackageId>, AnyError> {
let Some(cache_folder_id) = self
.fs_resolver
.resolve_package_cache_folder_id_from_specifier(specifier)?
else {
return Ok(None);
};
Ok(Some(
self
.resolution
.resolve_pkg_id_from_pkg_cache_folder_id(&cache_folder_id)?,
))
}
pub fn resolve_pkg_reqs_from_pkg_id(
&self,
id: &NpmPackageId,
) -> Vec<PackageReq> {
self.resolution.resolve_pkg_reqs_from_pkg_id(id)
}
/// Attempts to get the package size in bytes.
pub fn package_size(
&self,
package_id: &NpmPackageId,
) -> Result<u64, AnyError> {
let package_folder = self.fs_resolver.package_folder(package_id)?;
Ok(crate::util::fs::dir_size(&package_folder)?)
}
pub fn all_system_packages(
&self,
system_info: &NpmSystemInfo,
) -> Vec<NpmResolutionPackage> {
self.resolution.all_system_packages(system_info)
}
/// Checks if the provided package req's folder is cached.
pub fn is_pkg_req_folder_cached(&self, req: &PackageReq) -> bool {
self
.resolve_pkg_id_from_pkg_req(req)
.ok()
.and_then(|id| self.fs_resolver.package_folder(&id).ok())
.map(|folder| folder.exists())
.unwrap_or(false)
}
  /// Adds package requirements to the resolver and ensures everything is set up.
/// This includes setting up the `node_modules` directory, if applicable.
pub async fn add_and_cache_package_reqs(
&self,
packages: &[PackageReq],
) -> Result<(), AnyError> {
self
.add_package_reqs_raw(
packages,
Some(PackageCaching::Only(packages.into())),
)
.await
.dependencies_result
}
pub async fn add_package_reqs_no_cache(
&self,
packages: &[PackageReq],
) -> Result<(), AnyError> {
self
.add_package_reqs_raw(packages, None)
.await
.dependencies_result
}
pub async fn add_package_reqs(
&self,
packages: &[PackageReq],
caching: PackageCaching<'_>,
) -> Result<(), AnyError> {
self
.add_package_reqs_raw(packages, Some(caching))
.await
.dependencies_result
}
pub async fn add_package_reqs_raw<'a>(
&self,
packages: &[PackageReq],
caching: Option<PackageCaching<'a>>,
) -> AddPkgReqsResult {
if packages.is_empty() {
return AddPkgReqsResult {
dependencies_result: Ok(()),
results: vec![],
};
}
let mut result = self.resolution.add_package_reqs(packages).await;
if result.dependencies_result.is_ok() {
if let Some(lockfile) = self.maybe_lockfile.as_ref() {
result.dependencies_result = lockfile.error_if_changed();
}
}
if result.dependencies_result.is_ok() {
if let Some(caching) = caching {
result.dependencies_result = self.cache_packages(caching).await;
}
}
result
}
/// Sets package requirements to the resolver, removing old requirements and adding new ones.
///
/// This will retrieve and resolve package information, but not cache any package files.
pub async fn set_package_reqs(
&self,
packages: &[PackageReq],
) -> Result<(), AnyError> {
self.resolution.set_package_reqs(packages).await
}
pub fn snapshot(&self) -> NpmResolutionSnapshot {
self.resolution.snapshot()
}
pub fn top_package_req_for_name(&self, name: &str) -> Option<PackageReq> {
let package_reqs = self.resolution.package_reqs();
let mut entries = package_reqs
.iter()
.filter(|(_, nv)| nv.name == name)
.collect::<Vec<_>>();
entries.sort_by_key(|(_, nv)| &nv.version);
Some(entries.last()?.0.clone())
}
pub fn serialized_valid_snapshot_for_system(
&self,
system_info: &NpmSystemInfo,
) -> ValidSerializedNpmResolutionSnapshot {
self
.resolution
.serialized_valid_snapshot_for_system(system_info)
}
pub async fn inject_synthetic_types_node_package(
&self,
) -> Result<(), AnyError> {
let reqs = &[PackageReq::from_str("@types/node").unwrap()];
// add and ensure this isn't added to the lockfile
self
.add_package_reqs(reqs, PackageCaching::Only(reqs.into()))
.await?;
Ok(())
}
pub async fn cache_packages(
&self,
caching: PackageCaching<'_>,
) -> Result<(), AnyError> {
self.fs_resolver.cache_packages(caching).await
}
pub fn resolve_pkg_folder_from_deno_module(
&self,
nv: &PackageNv,
) -> Result<PathBuf, AnyError> {
let pkg_id = self.resolution.resolve_pkg_id_from_deno_module(nv)?;
self.resolve_pkg_folder_from_pkg_id(&pkg_id)
}
pub fn resolve_pkg_id_from_pkg_req(
&self,
req: &PackageReq,
) -> Result<NpmPackageId, PackageReqNotFoundError> {
self.resolution.resolve_pkg_id_from_pkg_req(req)
}
pub fn ensure_no_pkg_json_dep_errors(
&self,
) -> Result<(), Box<PackageJsonDepValueParseWithLocationError>> {
for err in self.npm_install_deps_provider.pkg_json_dep_errors() {
match err.source.as_kind() {
deno_package_json::PackageJsonDepValueParseErrorKind::VersionReq(_) => {
return Err(Box::new(err.clone()));
}
deno_package_json::PackageJsonDepValueParseErrorKind::Unsupported {
..
} => {
// only warn for this one
log::warn!(
"{} {}\n at {}",
colors::yellow("Warning"),
err.source,
err.location,
)
}
}
}
Ok(())
}
/// Ensures that the top level `package.json` dependencies are installed.
/// This may set up the `node_modules` directory.
///
/// Returns `true` if the top level packages are already installed. A
/// return value of `false` means that new packages were added to the NPM resolution.
pub async fn ensure_top_level_package_json_install(
&self,
) -> Result<bool, AnyError> {
if !self.top_level_install_flag.raise() {
return Ok(true); // already did this
}
let pkg_json_remote_pkgs = self.npm_install_deps_provider.remote_pkgs();
if pkg_json_remote_pkgs.is_empty() {
return Ok(true);
}
// check if something needs resolving before bothering to load all
// the package information (which is slow)
if pkg_json_remote_pkgs.iter().all(|pkg| {
self
.resolution
.resolve_pkg_id_from_pkg_req(&pkg.req)
.is_ok()
}) {
log::debug!(
"All package.json deps resolvable. Skipping top level install."
);
return Ok(true); // everything is already resolvable
}
let pkg_reqs = pkg_json_remote_pkgs
.iter()
.map(|pkg| pkg.req.clone())
.collect::<Vec<_>>();
self.add_package_reqs_no_cache(&pkg_reqs).await?;
Ok(false)
}
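The early return at the top of `ensure_top_level_package_json_install` depends on `top_level_install_flag.raise()` returning `true` only for the first caller, so the install runs at most once. A minimal sketch of that raise-once semantics, hypothetically reduced to a std `AtomicBool` (the CLI's `AtomicFlag` type is assumed to behave like this):

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// `raise` returns true only for the first caller to flip the flag,
// mirroring the assumed `AtomicFlag::raise` semantics.
struct OnceFlag(AtomicBool);

impl OnceFlag {
    fn new() -> Self {
        Self(AtomicBool::new(false))
    }

    fn raise(&self) -> bool {
        // swap returns the previous value: false exactly once
        !self.0.swap(true, Ordering::SeqCst)
    }
}

fn main() {
    let flag = OnceFlag::new();
    assert!(flag.raise()); // first call wins and performs the work
    assert!(!flag.raise()); // subsequent calls short-circuit
}
```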
pub async fn cache_package_info(
&self,
package_name: &str,
) -> Result<Arc<NpmPackageInfo>, AnyError> {
// this will internally cache the package information
self
.registry_info_provider
.package_info(package_name)
.await
.map_err(|err| err.into())
}
pub fn maybe_node_modules_path(&self) -> Option<&Path> {
self.fs_resolver.node_modules_path()
}
pub fn global_cache_root_path(&self) -> &Path {
self.npm_cache.root_dir_path()
}
pub fn global_cache_root_url(&self) -> &Url {
self.npm_cache.root_dir_url()
}
}
fn npm_process_state(
snapshot: ValidSerializedNpmResolutionSnapshot,
node_modules_path: Option<&Path>,
) -> String {
serde_json::to_string(&NpmProcessState {
kind: NpmProcessStateKind::Snapshot(snapshot.into_serialized()),
local_node_modules_path: node_modules_path
.map(|p| p.to_string_lossy().to_string()),
})
.unwrap()
}
impl NpmPackageFolderResolver for ManagedCliNpmResolver {
fn resolve_package_folder_from_package(
&self,
name: &str,
referrer: &ModuleSpecifier,
) -> Result<PathBuf, PackageFolderResolveError> {
let path = self
.fs_resolver
.resolve_package_folder_from_package(name, referrer)?;
let path =
canonicalize_path_maybe_not_exists(&self.sys, &path).map_err(|err| {
PackageFolderResolveIoError {
package_name: name.to_string(),
referrer: referrer.clone(),
source: err,
}
})?;
log::debug!("Resolved {} from {} to {}", name, referrer, path.display());
Ok(path)
}
}
impl NpmProcessStateProvider for ManagedCliNpmResolver {
fn get_npm_process_state(&self) -> String {
npm_process_state(
self.resolution.serialized_valid_snapshot(),
self.fs_resolver.node_modules_path(),
)
}
}
impl CliNpmReqResolver for ManagedCliNpmResolver {
fn resolve_pkg_folder_from_deno_module_req(
&self,
req: &PackageReq,
_referrer: &ModuleSpecifier,
) -> Result<PathBuf, ResolvePkgFolderFromDenoReqError> {
let pkg_id = self
.resolve_pkg_id_from_pkg_req(req)
.map_err(|err| ResolvePkgFolderFromDenoReqError::Managed(err.into()))?;
self
.resolve_pkg_folder_from_pkg_id(&pkg_id)
.map_err(ResolvePkgFolderFromDenoReqError::Managed)
}
}
impl CliNpmResolver for ManagedCliNpmResolver {
fn into_npm_pkg_folder_resolver(
self: Arc<Self>,
) -> Arc<dyn NpmPackageFolderResolver> {
self
}
fn into_npm_req_resolver(self: Arc<Self>) -> Arc<dyn CliNpmReqResolver> {
self
}
fn into_process_state_provider(
self: Arc<Self>,
) -> Arc<dyn NpmProcessStateProvider> {
self
}
fn clone_snapshotted(&self) -> Arc<dyn CliNpmResolver> {
// create a new snapshotted npm resolution and resolver
let npm_resolution = Arc::new(NpmResolution::new(
self.registry_info_provider.clone(),
self.resolution.snapshot(),
self.maybe_lockfile.clone(),
));
Arc::new(ManagedCliNpmResolver::new(
create_npm_fs_resolver(
self.npm_cache.clone(),
&self.npm_install_deps_provider,
&self.text_only_progress_bar,
npm_resolution.clone(),
self.sys.clone(),
self.tarball_cache.clone(),
self.root_node_modules_path().map(ToOwned::to_owned),
self.npm_system_info.clone(),
self.lifecycle_scripts.clone(),
),
self.maybe_lockfile.clone(),
self.registry_info_provider.clone(),
self.npm_cache.clone(),
self.npm_install_deps_provider.clone(),
npm_resolution,
self.sys.clone(),
self.tarball_cache.clone(),
self.text_only_progress_bar.clone(),
self.npm_system_info.clone(),
self.lifecycle_scripts.clone(),
))
}
fn as_inner(&self) -> InnerCliNpmResolverRef {
InnerCliNpmResolverRef::Managed(self)
}
fn root_node_modules_path(&self) -> Option<&Path> {
self.fs_resolver.node_modules_path()
}
fn check_state_hash(&self) -> Option<u64> {
// We could go further and check all the individual
// npm packages, but that's probably overkill.
let mut package_reqs = self
.resolution
.package_reqs()
.into_iter()
.collect::<Vec<_>>();
package_reqs.sort_by(|a, b| a.0.cmp(&b.0)); // determinism
let mut hasher = FastInsecureHasher::new_without_deno_version();
// ensure the cache gets busted when turning nodeModulesDir on or off
// as this could cause changes in resolution
hasher.write_hashable(self.fs_resolver.node_modules_path().is_some());
for (pkg_req, pkg_nv) in package_reqs {
hasher.write_hashable(&pkg_req);
hasher.write_hashable(&pkg_nv);
}
Some(hasher.finish())
}
}
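The `check_state_hash` implementation above sorts the requirement pairs before hashing so that map iteration order cannot change the result. That determinism idea can be illustrated in isolation; this sketch substitutes std's `DefaultHasher` for the CLI's `FastInsecureHasher` (an assumption for illustration only):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hash (requirement, resolved-version) pairs deterministically:
// sort by the requirement first so input ordering is irrelevant.
fn state_hash(mut pairs: Vec<(String, String)>) -> u64 {
    pairs.sort_by(|a, b| a.0.cmp(&b.0)); // determinism
    let mut hasher = DefaultHasher::new();
    for (req, nv) in &pairs {
        req.hash(&mut hasher);
        nv.hash(&mut hasher);
    }
    hasher.finish()
}

fn main() {
    let a = state_hash(vec![
        ("chalk@5".into(), "chalk@5.3.0".into()),
        ("react@18".into(), "react@18.2.0".into()),
    ]);
    let b = state_hash(vec![
        ("react@18".into(), "react@18.2.0".into()),
        ("chalk@5".into(), "chalk@5.3.0".into()),
    ]);
    // same pairs in any order hash identically
    assert_eq!(a, b);
}
```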


@@ -1,53 +0,0 @@
// Copyright 2018-2025 the Deno authors. MIT license.
pub mod bin_entries;
pub mod lifecycle_scripts;
use std::path::Path;
use std::path::PathBuf;
use async_trait::async_trait;
use deno_ast::ModuleSpecifier;
use deno_core::error::AnyError;
use deno_npm::NpmPackageCacheFolderId;
use deno_npm::NpmPackageId;
use node_resolver::errors::PackageFolderResolveError;
use super::super::PackageCaching;
/// Part of the resolution that interacts with the file system.
#[async_trait(?Send)]
pub trait NpmPackageFsResolver: Send + Sync {
/// The local node_modules folder if it is applicable to the implementation.
fn node_modules_path(&self) -> Option<&Path>;
fn maybe_package_folder(&self, package_id: &NpmPackageId) -> Option<PathBuf>;
fn package_folder(
&self,
package_id: &NpmPackageId,
) -> Result<PathBuf, AnyError> {
self.maybe_package_folder(package_id).ok_or_else(|| {
deno_core::anyhow::anyhow!(
"Package folder not found for '{}'",
package_id.as_serialized()
)
})
}
fn resolve_package_folder_from_package(
&self,
name: &str,
referrer: &ModuleSpecifier,
) -> Result<PathBuf, PackageFolderResolveError>;
fn resolve_package_cache_folder_id_from_specifier(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<NpmPackageCacheFolderId>, AnyError>;
async fn cache_packages<'a>(
&self,
caching: PackageCaching<'a>,
) -> Result<(), AnyError>;
}


@@ -1,55 +0,0 @@
// Copyright 2018-2025 the Deno authors. MIT license.
mod common;
mod global;
mod local;
use std::path::PathBuf;
use std::sync::Arc;
use deno_npm::NpmSystemInfo;
pub use self::common::NpmPackageFsResolver;
use self::global::GlobalNpmPackageResolver;
use self::local::LocalNpmPackageResolver;
use super::resolution::NpmResolution;
use crate::args::LifecycleScriptsConfig;
use crate::args::NpmInstallDepsProvider;
use crate::npm::CliNpmCache;
use crate::npm::CliNpmTarballCache;
use crate::sys::CliSys;
use crate::util::progress_bar::ProgressBar;
#[allow(clippy::too_many_arguments)]
pub fn create_npm_fs_resolver(
npm_cache: Arc<CliNpmCache>,
npm_install_deps_provider: &Arc<NpmInstallDepsProvider>,
progress_bar: &ProgressBar,
resolution: Arc<NpmResolution>,
sys: CliSys,
tarball_cache: Arc<CliNpmTarballCache>,
maybe_node_modules_path: Option<PathBuf>,
system_info: NpmSystemInfo,
lifecycle_scripts: LifecycleScriptsConfig,
) -> Arc<dyn NpmPackageFsResolver> {
match maybe_node_modules_path {
Some(node_modules_folder) => Arc::new(LocalNpmPackageResolver::new(
npm_cache,
npm_install_deps_provider.clone(),
progress_bar.clone(),
resolution,
sys,
tarball_cache,
node_modules_folder,
system_info,
lifecycle_scripts,
)),
None => Arc::new(GlobalNpmPackageResolver::new(
npm_cache,
tarball_cache,
resolution,
system_info,
lifecycle_scripts,
)),
}
}


@@ -1,40 +1,27 @@
// Copyright 2018-2025 the Deno authors. MIT license.
mod byonm;
pub mod installer;
mod managed;
mod permission_checker;
use std::path::Path;
use std::sync::Arc;
use dashmap::DashMap;
use deno_core::error::AnyError;
use deno_core::serde_json;
use deno_core::url::Url;
use deno_error::JsErrorBox;
use deno_lib::version::DENO_VERSION_INFO;
use deno_npm::npm_rc::ResolvedNpmRc;
use deno_npm::registry::NpmPackageInfo;
use deno_resolver::npm::ByonmInNpmPackageChecker;
use deno_resolver::npm::ByonmNpmResolver;
use deno_resolver::npm::CliNpmReqResolver;
use deno_resolver::npm::ResolvePkgFolderFromDenoReqError;
use deno_runtime::ops::process::NpmProcessStateProvider;
use deno_resolver::npm::ByonmNpmResolverCreateOptions;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use http::HeaderName;
use http::HeaderValue;
use managed::create_managed_in_npm_pkg_checker;
use node_resolver::InNpmPackageChecker;
use node_resolver::NpmPackageFolderResolver;
pub use self::byonm::CliByonmNpmResolver;
pub use self::byonm::CliByonmNpmResolverCreateOptions;
pub use self::managed::CliManagedInNpmPkgCheckerCreateOptions;
pub use self::managed::CliManagedNpmResolverCreateOptions;
pub use self::managed::CliNpmResolverManagedSnapshotOption;
pub use self::managed::ManagedCliNpmResolver;
pub use self::managed::PackageCaching;
pub use self::permission_checker::NpmRegistryReadPermissionChecker;
pub use self::permission_checker::NpmRegistryReadPermissionCheckerMode;
pub use self::managed::NpmResolutionInitializer;
pub use self::managed::ResolveSnapshotError;
use crate::file_fetcher::CliFileFetcher;
use crate::http_util::HttpClientProvider;
use crate::sys::CliSys;
@@ -45,6 +32,12 @@ pub type CliNpmTarballCache =
pub type CliNpmCache = deno_npm_cache::NpmCache<CliSys>;
pub type CliNpmRegistryInfoProvider =
deno_npm_cache::RegistryInfoProvider<CliNpmCacheHttpClient, CliSys>;
pub type CliNpmResolver = deno_resolver::npm::NpmResolver<CliSys>;
pub type CliManagedNpmResolver = deno_resolver::npm::ManagedNpmResolver<CliSys>;
pub type CliNpmResolverCreateOptions =
deno_resolver::npm::NpmResolverCreateOptions<CliSys>;
pub type CliByonmNpmResolverCreateOptions =
ByonmNpmResolverCreateOptions<CliSys>;
#[derive(Debug)]
pub struct CliNpmCacheHttpClient {
@@ -90,105 +83,21 @@ impl deno_npm_cache::NpmCacheHttpClient for CliNpmCacheHttpClient {
| Json { .. }
| ToStr { .. }
| RedirectHeaderParse { .. }
| TooManyRedirects => None,
| TooManyRedirects
| NotFound
| Other(_) => None,
BadResponse(bad_response_error) => {
Some(bad_response_error.status_code)
}
};
deno_npm_cache::DownloadError {
status_code,
error: err.into(),
error: JsErrorBox::from_err(err),
}
})
}
}
pub enum CliNpmResolverCreateOptions {
Managed(CliManagedNpmResolverCreateOptions),
Byonm(CliByonmNpmResolverCreateOptions),
}
pub async fn create_cli_npm_resolver_for_lsp(
options: CliNpmResolverCreateOptions,
) -> Arc<dyn CliNpmResolver> {
use CliNpmResolverCreateOptions::*;
match options {
Managed(options) => {
managed::create_managed_npm_resolver_for_lsp(options).await
}
Byonm(options) => Arc::new(ByonmNpmResolver::new(options)),
}
}
pub async fn create_cli_npm_resolver(
options: CliNpmResolverCreateOptions,
) -> Result<Arc<dyn CliNpmResolver>, AnyError> {
use CliNpmResolverCreateOptions::*;
match options {
Managed(options) => managed::create_managed_npm_resolver(options).await,
Byonm(options) => Ok(Arc::new(ByonmNpmResolver::new(options))),
}
}
pub enum CreateInNpmPkgCheckerOptions<'a> {
Managed(CliManagedInNpmPkgCheckerCreateOptions<'a>),
Byonm,
}
pub fn create_in_npm_pkg_checker(
options: CreateInNpmPkgCheckerOptions,
) -> Arc<dyn InNpmPackageChecker> {
match options {
CreateInNpmPkgCheckerOptions::Managed(options) => {
create_managed_in_npm_pkg_checker(options)
}
CreateInNpmPkgCheckerOptions::Byonm => Arc::new(ByonmInNpmPackageChecker),
}
}
pub enum InnerCliNpmResolverRef<'a> {
Managed(&'a ManagedCliNpmResolver),
#[allow(dead_code)]
Byonm(&'a CliByonmNpmResolver),
}
pub trait CliNpmResolver: NpmPackageFolderResolver + CliNpmReqResolver {
fn into_npm_pkg_folder_resolver(
self: Arc<Self>,
) -> Arc<dyn NpmPackageFolderResolver>;
fn into_npm_req_resolver(self: Arc<Self>) -> Arc<dyn CliNpmReqResolver>;
fn into_process_state_provider(
self: Arc<Self>,
) -> Arc<dyn NpmProcessStateProvider>;
fn into_maybe_byonm(self: Arc<Self>) -> Option<Arc<CliByonmNpmResolver>> {
None
}
fn clone_snapshotted(&self) -> Arc<dyn CliNpmResolver>;
fn as_inner(&self) -> InnerCliNpmResolverRef;
fn as_managed(&self) -> Option<&ManagedCliNpmResolver> {
match self.as_inner() {
InnerCliNpmResolverRef::Managed(inner) => Some(inner),
InnerCliNpmResolverRef::Byonm(_) => None,
}
}
fn as_byonm(&self) -> Option<&CliByonmNpmResolver> {
match self.as_inner() {
InnerCliNpmResolverRef::Managed(_) => None,
InnerCliNpmResolverRef::Byonm(inner) => Some(inner),
}
}
fn root_node_modules_path(&self) -> Option<&Path>;
/// Returns a hash representing the state of the npm resolver
/// or `None` if the state currently can't be determined.
fn check_state_hash(&self) -> Option<u64>;
}
#[derive(Debug)]
pub struct NpmFetchResolver {
nv_by_req: DashMap<PackageReq, Option<PackageNv>>,
@@ -274,8 +183,8 @@ pub const NPM_CONFIG_USER_AGENT_ENV_VAR: &str = "npm_config_user_agent";
pub fn get_npm_config_user_agent() -> String {
format!(
"deno/{} npm/? deno/{} {} {}",
env!("CARGO_PKG_VERSION"),
env!("CARGO_PKG_VERSION"),
DENO_VERSION_INFO.deno,
DENO_VERSION_INFO.deno,
std::env::consts::OS,
std::env::consts::ARCH
)
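The `get_npm_config_user_agent` change above swaps the compile-time `env!("CARGO_PKG_VERSION")` for the runtime `DENO_VERSION_INFO.deno`; the produced string keeps the same shape. A hypothetical standalone reimplementation, with the version passed in as a plain parameter:

```rust
// Illustrative only; `deno_version` stands in for DENO_VERSION_INFO.deno.
fn npm_config_user_agent(deno_version: &str) -> String {
    format!(
        "deno/{} npm/? deno/{} {} {}",
        deno_version,
        deno_version,
        std::env::consts::OS,
        std::env::consts::ARCH
    )
}

fn main() {
    // shape: "deno/<ver> npm/? deno/<ver> <os> <arch>"
    println!("{}", npm_config_user_agent("2.1.5"));
}
```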


@@ -3,13 +3,11 @@
use std::sync::atomic::AtomicUsize;
use std::sync::atomic::Ordering;
use deno_core::error::generic_error;
use deno_core::error::type_error;
use deno_core::error::AnyError;
use deno_core::op2;
use deno_core::v8;
use deno_core::ModuleSpecifier;
use deno_core::OpState;
use deno_error::JsErrorBox;
use deno_runtime::deno_permissions::ChildPermissionsArg;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_web::StartTime;
@@ -78,7 +76,7 @@ pub fn op_pledge_test_permissions(
pub fn op_restore_test_permissions(
state: &mut OpState,
#[serde] token: Uuid,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
if let Some(permissions_holder) = state.try_take::<PermissionsHolder>() {
if token != permissions_holder.0 {
panic!("restore test permissions token does not match the stored token");
@@ -88,7 +86,7 @@ pub fn op_restore_test_permissions(
state.put::<PermissionsContainer>(permissions);
Ok(())
} else {
Err(generic_error("no permissions to restore"))
Err(JsErrorBox::generic("no permissions to restore"))
}
}
@@ -106,9 +104,9 @@ fn op_register_bench(
only: bool,
warmup: bool,
#[buffer] ret_buf: &mut [u8],
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
if ret_buf.len() != 4 {
return Err(type_error(format!(
return Err(JsErrorBox::type_error(format!(
"Invalid ret_buf length: {}",
ret_buf.len()
)));


@@ -94,10 +94,12 @@ pub fn op_jupyter_input(
None
}
#[derive(Debug, thiserror::Error)]
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum JupyterBroadcastError {
#[class(inherit)]
#[error(transparent)]
SerdeJson(serde_json::Error),
#[class(generic)]
#[error(transparent)]
ZeroMq(AnyError),
}


@@ -2,25 +2,36 @@
use deno_ast::MediaType;
use deno_ast::ModuleSpecifier;
use deno_core::error::generic_error;
use deno_core::error::AnyError;
use deno_ast::ParseDiagnostic;
use deno_core::op2;
use crate::tools::lint;
deno_core::extension!(deno_lint, ops = [op_lint_create_serialized_ast,],);
#[derive(Debug, thiserror::Error, deno_error::JsError)]
pub enum LintError {
#[class(inherit)]
#[error(transparent)]
Io(#[from] std::io::Error),
#[class(inherit)]
#[error(transparent)]
ParseDiagnostic(#[from] ParseDiagnostic),
#[class(type)]
#[error("Failed to parse path as URL: {0}")]
PathParse(std::path::PathBuf),
}
#[op2]
#[buffer]
fn op_lint_create_serialized_ast(
#[string] file_name: &str,
#[string] source: String,
) -> Result<Vec<u8>, AnyError> {
) -> Result<Vec<u8>, LintError> {
let file_text = deno_ast::strip_bom(source);
let path = std::env::current_dir()?.join(file_name);
let specifier = ModuleSpecifier::from_file_path(&path).map_err(|_| {
generic_error(format!("Failed to parse path as URL: {}", path.display()))
})?;
let specifier = ModuleSpecifier::from_file_path(&path)
.map_err(|_| LintError::PathParse(path))?;
let media_type = MediaType::from_specifier(&specifier);
let parsed_source = deno_ast::parse_program(deno_ast::ParseParams {
specifier,


@@ -3,13 +3,11 @@
use std::sync::atomic::AtomicUsize;
use std::sync::atomic::Ordering;
use deno_core::error::generic_error;
use deno_core::error::type_error;
use deno_core::error::AnyError;
use deno_core::op2;
use deno_core::v8;
use deno_core::ModuleSpecifier;
use deno_core::OpState;
use deno_error::JsErrorBox;
use deno_runtime::deno_permissions::ChildPermissionsArg;
use deno_runtime::deno_permissions::PermissionsContainer;
use uuid::Uuid;
@@ -73,7 +71,7 @@ pub fn op_pledge_test_permissions(
pub fn op_restore_test_permissions(
state: &mut OpState,
#[serde] token: Uuid,
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
if let Some(permissions_holder) = state.try_take::<PermissionsHolder>() {
if token != permissions_holder.0 {
panic!("restore test permissions token does not match the stored token");
@@ -83,7 +81,7 @@ pub fn op_restore_test_permissions(
state.put::<PermissionsContainer>(permissions);
Ok(())
} else {
Err(generic_error("no permissions to restore"))
Err(JsErrorBox::generic("no permissions to restore"))
}
}
@@ -103,9 +101,9 @@ fn op_register_test(
#[smi] line_number: u32,
#[smi] column_number: u32,
#[buffer] ret_buf: &mut [u8],
) -> Result<(), AnyError> {
) -> Result<(), JsErrorBox> {
if ret_buf.len() != 4 {
return Err(type_error(format!(
return Err(JsErrorBox::type_error(format!(
"Invalid ret_buf length: {}",
ret_buf.len()
)));


@@ -1,213 +1,81 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use async_trait::async_trait;
use dashmap::DashMap;
use dashmap::DashSet;
use deno_ast::MediaType;
use deno_config::workspace::MappedResolutionDiagnostic;
use deno_config::workspace::MappedResolutionError;
use deno_core::anyhow::anyhow;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::url::Url;
use deno_core::ModuleSourceCode;
use deno_core::ModuleSpecifier;
use deno_error::JsErrorBox;
use deno_graph::source::ResolveError;
use deno_graph::source::UnknownBuiltInNodeModuleError;
use deno_graph::NpmLoadError;
use deno_graph::NpmResolvePkgReqsResult;
use deno_npm::resolution::NpmResolutionError;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_resolver::sloppy_imports::SloppyImportsCachedFs;
use deno_resolver::sloppy_imports::SloppyImportsResolver;
use deno_runtime::colors;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::is_builtin_node_module;
use deno_runtime::deno_node::RealIsBuiltInNodeModuleChecker;
use deno_semver::package::PackageReq;
use node_resolver::NodeResolutionKind;
use node_resolver::ResolutionMode;
use sys_traits::FsMetadata;
use sys_traits::FsMetadataValue;
use thiserror::Error;
use crate::args::NpmCachingStrategy;
use crate::args::DENO_DISABLE_PEDANTIC_NODE_WARNINGS;
use crate::node::CliNodeCodeTranslator;
use crate::npm::installer::NpmInstaller;
use crate::npm::installer::PackageCaching;
use crate::npm::CliNpmResolver;
use crate::npm::InnerCliNpmResolverRef;
use crate::sys::CliSys;
use crate::util::sync::AtomicFlag;
use crate::util::text_encoding::from_utf8_lossy_cow;
pub type CjsTracker = deno_resolver::cjs::CjsTracker<CliSys>;
pub type IsCjsResolver = deno_resolver::cjs::IsCjsResolver<CliSys>;
pub type CliCjsTracker =
deno_resolver::cjs::CjsTracker<DenoInNpmPackageChecker, CliSys>;
pub type CliIsCjsResolver =
deno_resolver::cjs::IsCjsResolver<DenoInNpmPackageChecker, CliSys>;
pub type CliSloppyImportsCachedFs = SloppyImportsCachedFs<CliSys>;
pub type CliSloppyImportsResolver =
SloppyImportsResolver<SloppyImportsCachedFs>;
SloppyImportsResolver<CliSloppyImportsCachedFs>;
pub type CliDenoResolver = deno_resolver::DenoResolver<
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
SloppyImportsCachedFs,
CliNpmResolver,
CliSloppyImportsCachedFs,
CliSys,
>;
pub type CliNpmReqResolver = deno_resolver::npm::NpmReqResolver<
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
CliNpmResolver,
CliSys,
>;
pub type CliNpmReqResolver =
deno_resolver::npm::NpmReqResolver<RealIsBuiltInNodeModuleChecker, CliSys>;
pub struct ModuleCodeStringSource {
pub code: ModuleSourceCode,
pub found_url: ModuleSpecifier,
pub media_type: MediaType,
}
#[derive(Debug, Error)]
#[error("{media_type} files are not supported in npm packages: {specifier}")]
pub struct NotSupportedKindInNpmError {
pub media_type: MediaType,
pub specifier: Url,
}
// todo(dsherret): move to module_loader.rs (it seems to be here due to use in standalone)
#[derive(Clone)]
pub struct NpmModuleLoader {
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
node_code_translator: Arc<CliNodeCodeTranslator>,
}
impl NpmModuleLoader {
pub fn new(
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
node_code_translator: Arc<CliNodeCodeTranslator>,
) -> Self {
Self {
cjs_tracker,
node_code_translator,
fs,
}
}
pub async fn load(
&self,
specifier: &ModuleSpecifier,
maybe_referrer: Option<&ModuleSpecifier>,
) -> Result<ModuleCodeStringSource, AnyError> {
let file_path = specifier.to_file_path().unwrap();
let code = self
.fs
.read_file_async(file_path.clone(), None)
.await
.map_err(AnyError::from)
.with_context(|| {
if file_path.is_dir() {
// directory imports are not allowed when importing from an
// ES module, so provide the user with a helpful error message
let dir_path = file_path;
let mut msg = "Directory import ".to_string();
msg.push_str(&dir_path.to_string_lossy());
if let Some(referrer) = &maybe_referrer {
msg.push_str(" is not supported resolving import from ");
msg.push_str(referrer.as_str());
let entrypoint_name = ["index.mjs", "index.js", "index.cjs"]
.iter()
.find(|e| dir_path.join(e).is_file());
if let Some(entrypoint_name) = entrypoint_name {
msg.push_str("\nDid you mean to import ");
msg.push_str(entrypoint_name);
msg.push_str(" within the directory?");
}
}
msg
} else {
let mut msg = "Unable to load ".to_string();
msg.push_str(&file_path.to_string_lossy());
if let Some(referrer) = &maybe_referrer {
msg.push_str(" imported from ");
msg.push_str(referrer.as_str());
}
msg
}
})?;
let media_type = MediaType::from_specifier(specifier);
if media_type.is_emittable() {
return Err(AnyError::from(NotSupportedKindInNpmError {
media_type,
specifier: specifier.clone(),
}));
}
let code = if self.cjs_tracker.is_maybe_cjs(specifier, media_type)? {
// translate cjs to esm if it's cjs and inject node globals
let code = from_utf8_lossy_cow(code);
ModuleSourceCode::String(
self
.node_code_translator
.translate_cjs_to_esm(specifier, Some(code))
.await?
.into_owned()
.into(),
)
} else {
// esm and json code is untouched
ModuleSourceCode::Bytes(match code {
Cow::Owned(bytes) => bytes.into_boxed_slice().into(),
Cow::Borrowed(bytes) => bytes.into(),
})
};
Ok(ModuleCodeStringSource {
code,
found_url: specifier.clone(),
media_type: MediaType::from_specifier(specifier),
})
}
}
pub struct CliResolverOptions {
pub deno_resolver: Arc<CliDenoResolver>,
pub npm_resolver: Option<Arc<dyn CliNpmResolver>>,
pub bare_node_builtins_enabled: bool,
}
#[derive(Debug, Default)]
pub struct FoundPackageJsonDepFlag(AtomicFlag);
/// A resolver that takes care of resolution, taking into account loaded
/// import map, JSX settings.
#[derive(Debug)]
pub struct CliResolver {
deno_resolver: Arc<CliDenoResolver>,
npm_resolver: Option<Arc<dyn CliNpmResolver>>,
found_package_json_dep_flag: AtomicFlag,
bare_node_builtins_enabled: bool,
found_package_json_dep_flag: Arc<FoundPackageJsonDepFlag>,
warned_pkgs: DashSet<PackageReq>,
}
impl CliResolver {
pub fn new(options: CliResolverOptions) -> Self {
pub fn new(
deno_resolver: Arc<CliDenoResolver>,
found_package_json_dep_flag: Arc<FoundPackageJsonDepFlag>,
) -> Self {
Self {
deno_resolver: options.deno_resolver,
npm_resolver: options.npm_resolver,
found_package_json_dep_flag: Default::default(),
bare_node_builtins_enabled: options.bare_node_builtins_enabled,
deno_resolver,
found_package_json_dep_flag,
warned_pkgs: Default::default(),
}
}
// todo(dsherret): move this off CliResolver as CliResolver is acting
// like a factory by doing this (it's beyond its responsibility)
pub fn create_graph_npm_resolver(
&self,
npm_caching: NpmCachingStrategy,
) -> WorkerCliNpmGraphResolver {
WorkerCliNpmGraphResolver {
npm_resolver: self.npm_resolver.as_ref(),
found_package_json_dep_flag: &self.found_package_json_dep_flag,
bare_node_builtins_enabled: self.bare_node_builtins_enabled,
npm_caching,
}
}
pub fn resolve(
&self,
raw_specifier: &str,
@@ -225,15 +93,17 @@ impl CliResolver {
) => match mapped_resolution_error {
MappedResolutionError::Specifier(e) => ResolveError::Specifier(e),
// deno_graph checks specifically for an ImportMapError
MappedResolutionError::ImportMap(e) => ResolveError::Other(e.into()),
err => ResolveError::Other(err.into()),
MappedResolutionError::ImportMap(e) => ResolveError::ImportMap(e),
MappedResolutionError::Workspace(e) => {
ResolveError::Other(JsErrorBox::from_err(e))
}
},
err => ResolveError::Other(err.into()),
err => ResolveError::Other(JsErrorBox::from_err(err)),
})?;
if resolution.found_package_json_dep {
// mark that we need to do an "npm install" later
self.found_package_json_dep_flag.raise();
self.found_package_json_dep_flag.0.raise();
}
if let Some(diagnostic) = resolution.maybe_diagnostic {
@@ -260,15 +130,31 @@ impl CliResolver {
}
#[derive(Debug)]
pub struct WorkerCliNpmGraphResolver<'a> {
npm_resolver: Option<&'a Arc<dyn CliNpmResolver>>,
found_package_json_dep_flag: &'a AtomicFlag,
pub struct CliNpmGraphResolver {
npm_installer: Option<Arc<NpmInstaller>>,
found_package_json_dep_flag: Arc<FoundPackageJsonDepFlag>,
bare_node_builtins_enabled: bool,
npm_caching: NpmCachingStrategy,
}
impl CliNpmGraphResolver {
pub fn new(
npm_installer: Option<Arc<NpmInstaller>>,
found_package_json_dep_flag: Arc<FoundPackageJsonDepFlag>,
bare_node_builtins_enabled: bool,
npm_caching: NpmCachingStrategy,
) -> Self {
Self {
npm_installer,
found_package_json_dep_flag,
bare_node_builtins_enabled,
npm_caching,
}
}
}
#[async_trait(?Send)]
impl<'a> deno_graph::source::NpmResolver for WorkerCliNpmGraphResolver<'a> {
impl deno_graph::source::NpmResolver for CliNpmGraphResolver {
fn resolve_builtin_node_module(
&self,
specifier: &ModuleSpecifier,
@@ -298,35 +184,24 @@ impl<'a> deno_graph::source::NpmResolver for WorkerCliNpmGraphResolver<'a> {
}
fn load_and_cache_npm_package_info(&self, package_name: &str) {
match self.npm_resolver {
Some(npm_resolver) if npm_resolver.as_managed().is_some() => {
let npm_resolver = npm_resolver.clone();
if let Some(npm_installer) = &self.npm_installer {
let npm_installer = npm_installer.clone();
let package_name = package_name.to_string();
deno_core::unsync::spawn(async move {
if let Some(managed) = npm_resolver.as_managed() {
let _ignore = managed.cache_package_info(&package_name).await;
}
let _ignore = npm_installer.cache_package_info(&package_name).await;
});
}
_ => {}
}
}
async fn resolve_pkg_reqs(
&self,
package_reqs: &[PackageReq],
) -> NpmResolvePkgReqsResult {
match &self.npm_resolver {
Some(npm_resolver) => {
let npm_resolver = match npm_resolver.as_inner() {
InnerCliNpmResolverRef::Managed(npm_resolver) => npm_resolver,
// if we are using byonm, then this should never be called because
// we don't use deno_graph's npm resolution in this case
InnerCliNpmResolverRef::Byonm(_) => unreachable!(),
};
let top_level_result = if self.found_package_json_dep_flag.is_raised() {
npm_resolver
match &self.npm_installer {
Some(npm_installer) => {
let top_level_result = if self.found_package_json_dep_flag.0.is_raised()
{
npm_installer
.ensure_top_level_package_json_install()
.await
.map(|_| ())
@@ -334,15 +209,13 @@ impl<'a> deno_graph::source::NpmResolver for WorkerCliNpmGraphResolver<'a> {
Ok(())
};
let result = npm_resolver
let result = npm_installer
.add_package_reqs_raw(
package_reqs,
match self.npm_caching {
NpmCachingStrategy::Eager => {
Some(crate::npm::PackageCaching::All)
}
NpmCachingStrategy::Eager => Some(PackageCaching::All),
NpmCachingStrategy::Lazy => {
Some(crate::npm::PackageCaching::Only(package_reqs.into()))
Some(PackageCaching::Only(package_reqs.into()))
}
NpmCachingStrategy::Manual => None,
},
@@ -356,26 +229,28 @@ impl<'a> deno_graph::source::NpmResolver for WorkerCliNpmGraphResolver<'a> {
.map(|r| {
r.map_err(|err| match err {
NpmResolutionError::Registry(e) => {
NpmLoadError::RegistryInfo(Arc::new(e.into()))
NpmLoadError::RegistryInfo(Arc::new(e))
}
NpmResolutionError::Resolution(e) => {
NpmLoadError::PackageReqResolution(Arc::new(e.into()))
NpmLoadError::PackageReqResolution(Arc::new(e))
}
NpmResolutionError::DependencyEntry(e) => {
NpmLoadError::PackageReqResolution(Arc::new(e.into()))
NpmLoadError::PackageReqResolution(Arc::new(e))
}
})
})
.collect(),
dep_graph_result: match top_level_result {
Ok(()) => result.dependencies_result.map_err(Arc::new),
Ok(()) => result
.dependencies_result
.map_err(|e| Arc::new(e) as Arc<dyn deno_error::JsErrorClass>),
Err(err) => Err(Arc::new(err)),
},
}
}
None => {
let err = Arc::new(anyhow!(
"npm specifiers were requested; but --no-npm is specified"
let err = Arc::new(JsErrorBox::generic(
"npm specifiers were requested; but --no-npm is specified",
));
NpmResolvePkgReqsResult {
results: package_reqs
@@ -392,60 +267,3 @@ impl<'a> deno_graph::source::NpmResolver for WorkerCliNpmGraphResolver<'a> {
self.bare_node_builtins_enabled
}
}
#[derive(Debug)]
pub struct SloppyImportsCachedFs {
sys: CliSys,
cache: Option<
DashMap<
PathBuf,
Option<deno_resolver::sloppy_imports::SloppyImportsFsEntry>,
>,
>,
}
impl SloppyImportsCachedFs {
pub fn new(sys: CliSys) -> Self {
Self {
sys,
cache: Some(Default::default()),
}
}
pub fn new_without_stat_cache(fs: CliSys) -> Self {
Self {
sys: fs,
cache: None,
}
}
}
impl deno_resolver::sloppy_imports::SloppyImportResolverFs
for SloppyImportsCachedFs
{
fn stat_sync(
&self,
path: &Path,
) -> Option<deno_resolver::sloppy_imports::SloppyImportsFsEntry> {
if let Some(cache) = &self.cache {
if let Some(entry) = cache.get(path) {
return *entry;
}
}
let entry = self.sys.fs_metadata(path).ok().and_then(|stat| {
if stat.file_type().is_file() {
Some(deno_resolver::sloppy_imports::SloppyImportsFsEntry::File)
} else if stat.file_type().is_dir() {
Some(deno_resolver::sloppy_imports::SloppyImportsFsEntry::Dir)
} else {
None
}
});
if let Some(cache) = &self.cache {
cache.insert(path.to_owned(), entry);
}
entry
}
}
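The removed `SloppyImportsCachedFs` above memoized per-path stat results, including negative results, so each path is stat'ed at most once. As a hypothetical standalone sketch of that caching pattern, using a `HashMap` behind a `Mutex` instead of `dashmap`:

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};
use std::sync::Mutex;

#[derive(Clone, Copy, PartialEq, Debug)]
enum FsEntry {
    File,
    Dir,
}

// Caches stat results per path, including `None` for missing paths,
// so repeated lookups never hit the filesystem twice.
struct CachedStat {
    cache: Mutex<HashMap<PathBuf, Option<FsEntry>>>,
}

impl CachedStat {
    fn new() -> Self {
        Self { cache: Mutex::new(HashMap::new()) }
    }

    fn stat(&self, path: &Path) -> Option<FsEntry> {
        let mut cache = self.cache.lock().unwrap();
        if let Some(entry) = cache.get(path) {
            return *entry; // cache hit (possibly a cached negative)
        }
        let entry = std::fs::metadata(path).ok().and_then(|m| {
            if m.is_file() {
                Some(FsEntry::File)
            } else if m.is_dir() {
                Some(FsEntry::Dir)
            } else {
                None
            }
        });
        cache.insert(path.to_owned(), entry);
        entry
    }
}

fn main() {
    let fs = CachedStat::new();
    // "." always exists and is a directory; second call is served from cache.
    assert_eq!(fs.stat(Path::new(".")), Some(FsEntry::Dir));
    assert_eq!(fs.stat(Path::new(".")), Some(FsEntry::Dir));
}
```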

cli/rt/Cargo.toml Normal file

@@ -0,0 +1,63 @@
# Copyright 2018-2025 the Deno authors. MIT license.
[package]
name = "denort"
version = "2.1.5"
authors.workspace = true
default-run = "denort"
edition.workspace = true
license.workspace = true
publish = false
repository.workspace = true
description = "Provides the denort executable"
[[bin]]
name = "denort"
path = "main.rs"
doc = false
[[test]]
name = "integration"
path = "integration_tests_runner.rs"
harness = false
[build-dependencies]
deno_runtime = { workspace = true, features = ["include_js_files_for_snapshotting", "only_snapshotted_js_sources"] }
deno_core = { workspace = true, features = ["include_js_files_for_snapshotting"] }
[dependencies]
deno_cache_dir.workspace = true
deno_config.workspace = true
deno_core = { workspace = true, features = ["include_js_files_for_snapshotting"] }
deno_error.workspace = true
deno_lib.workspace = true
deno_media_type = { workspace = true, features = ["data_url", "decoding"] }
deno_npm.workspace = true
deno_package_json.workspace = true
deno_path_util.workspace = true
deno_resolver = { workspace = true, features = ["sync"] }
deno_runtime = { workspace = true, features = ["include_js_files_for_snapshotting"] }
deno_semver.workspace = true
deno_snapshots.workspace = true
deno_terminal.workspace = true
libsui = "0.5.0"
node_resolver.workspace = true
async-trait.workspace = true
bincode = "=1.3.3"
import_map = { version = "=0.21.0", features = ["ext"] }
indexmap.workspace = true
log = { workspace = true, features = ["serde"] }
serde.workspace = true
serde_json.workspace = true
sys_traits = { workspace = true, features = ["getrandom", "filetime", "libc", "real", "strip_unc", "winapi"] }
thiserror.workspace = true
tokio.workspace = true
tokio-util.workspace = true
twox-hash.workspace = true
url.workspace = true
[dev-dependencies]
pretty_assertions.workspace = true
sys_traits = { workspace = true, features = ["memory"] }
test_util.workspace = true

cli/rt/binary.rs (new file)
@ -0,0 +1,682 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::collections::HashMap;
use std::ffi::OsString;
use std::io::ErrorKind;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use deno_core::anyhow::bail;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::serde_json;
use deno_core::url::Url;
use deno_core::FastString;
use deno_core::ModuleSourceCode;
use deno_core::ModuleType;
use deno_error::JsError;
use deno_error::JsErrorBox;
use deno_lib::standalone::binary::DenoRtDeserializable;
use deno_lib::standalone::binary::Metadata;
use deno_lib::standalone::binary::RemoteModuleEntry;
use deno_lib::standalone::binary::SpecifierDataStore;
use deno_lib::standalone::binary::SpecifierId;
use deno_lib::standalone::binary::MAGIC_BYTES;
use deno_lib::standalone::virtual_fs::VirtualDirectory;
use deno_lib::standalone::virtual_fs::VirtualDirectoryEntries;
use deno_media_type::MediaType;
use deno_npm::resolution::SerializedNpmResolutionSnapshot;
use deno_npm::resolution::SerializedNpmResolutionSnapshotPackage;
use deno_npm::resolution::ValidSerializedNpmResolutionSnapshot;
use deno_npm::NpmPackageId;
use deno_runtime::deno_fs::FileSystem;
use deno_runtime::deno_fs::RealFs;
use deno_runtime::deno_io::fs::FsError;
use deno_semver::package::PackageReq;
use deno_semver::StackString;
use indexmap::IndexMap;
use thiserror::Error;
use crate::file_system::FileBackedVfs;
use crate::file_system::VfsRoot;
pub struct StandaloneData {
pub metadata: Metadata,
pub modules: Arc<StandaloneModules>,
pub npm_snapshot: Option<ValidSerializedNpmResolutionSnapshot>,
pub root_path: PathBuf,
pub vfs: Arc<FileBackedVfs>,
}
/// This function will try to run this binary as a standalone binary
/// produced by `deno compile`. It determines if this is a standalone
/// binary by skipping over the trailer width at the end of the file,
/// then checking for the magic trailer string `d3n0l4nd`. If found,
/// the bundle is executed. If not, this function returns `Ok(None)`.
pub fn extract_standalone(
cli_args: Cow<Vec<OsString>>,
) -> Result<Option<StandaloneData>, AnyError> {
let Some(data) = libsui::find_section("d3n0l4nd") else {
return Ok(None);
};
let root_path = {
let maybe_current_exe = std::env::current_exe().ok();
let current_exe_name = maybe_current_exe
.as_ref()
.and_then(|p| p.file_name())
.map(|p| p.to_string_lossy())
// should never happen
.unwrap_or_else(|| Cow::Borrowed("binary"));
std::env::temp_dir().join(format!("deno-compile-{}", current_exe_name))
};
let root_url = deno_path_util::url_from_directory_path(&root_path)?;
let DeserializedDataSection {
mut metadata,
npm_snapshot,
modules_store: remote_modules,
vfs_root_entries,
vfs_files_data,
} = match deserialize_binary_data_section(&root_url, data)? {
Some(data_section) => data_section,
None => return Ok(None),
};
let cli_args = cli_args.into_owned();
metadata.argv.reserve(cli_args.len() - 1);
for arg in cli_args.into_iter().skip(1) {
metadata.argv.push(arg.into_string().unwrap());
}
let vfs = {
let fs_root = VfsRoot {
dir: VirtualDirectory {
// align the name of the directory with the root dir
name: root_path.file_name().unwrap().to_string_lossy().to_string(),
entries: vfs_root_entries,
},
root_path: root_path.clone(),
start_file_offset: 0,
};
Arc::new(FileBackedVfs::new(
Cow::Borrowed(vfs_files_data),
fs_root,
metadata.vfs_case_sensitivity,
))
};
Ok(Some(StandaloneData {
metadata,
modules: Arc::new(StandaloneModules {
modules: remote_modules,
vfs: vfs.clone(),
}),
npm_snapshot,
root_path,
vfs,
}))
}
pub struct DeserializedDataSection {
pub metadata: Metadata,
pub npm_snapshot: Option<ValidSerializedNpmResolutionSnapshot>,
pub modules_store: RemoteModulesStore,
pub vfs_root_entries: VirtualDirectoryEntries,
pub vfs_files_data: &'static [u8],
}
pub fn deserialize_binary_data_section(
root_dir_url: &Url,
data: &'static [u8],
) -> Result<Option<DeserializedDataSection>, AnyError> {
fn read_magic_bytes(input: &[u8]) -> Result<(&[u8], bool), AnyError> {
if input.len() < MAGIC_BYTES.len() {
bail!("Unexpected end of data. Could not find magic bytes.");
}
let (magic_bytes, input) = input.split_at(MAGIC_BYTES.len());
if magic_bytes != MAGIC_BYTES {
return Ok((input, false));
}
Ok((input, true))
}
let (input, found) = read_magic_bytes(data)?;
if !found {
return Ok(None);
}
// 1. Metadata
let (input, data) =
read_bytes_with_u64_len(input).context("reading metadata")?;
let metadata: Metadata =
serde_json::from_slice(data).context("deserializing metadata")?;
// 2. Npm snapshot
let (input, data) =
read_bytes_with_u64_len(input).context("reading npm snapshot")?;
let npm_snapshot = if data.is_empty() {
None
} else {
Some(deserialize_npm_snapshot(data).context("deserializing npm snapshot")?)
};
// 3. Specifiers
let (input, specifiers_store) =
SpecifierStore::deserialize(root_dir_url, input)
.context("deserializing specifiers")?;
// 4. Redirects
let (input, redirects_store) =
SpecifierDataStore::<SpecifierId>::deserialize(input)
.context("deserializing redirects")?;
// 5. Remote modules
let (input, remote_modules_store) =
SpecifierDataStore::<RemoteModuleEntry<'static>>::deserialize(input)
.context("deserializing remote modules")?;
// 6. VFS
let (input, data) = read_bytes_with_u64_len(input).context("vfs")?;
let vfs_root_entries: VirtualDirectoryEntries =
serde_json::from_slice(data).context("deserializing vfs data")?;
let (input, vfs_files_data) =
read_bytes_with_u64_len(input).context("reading vfs files data")?;
// finally ensure we read the magic bytes at the end
let (_input, found) = read_magic_bytes(input)?;
if !found {
bail!("Could not find magic bytes at the end of the data.");
}
let modules_store = RemoteModulesStore::new(
specifiers_store,
redirects_store,
remote_modules_store,
);
Ok(Some(DeserializedDataSection {
metadata,
npm_snapshot,
modules_store,
vfs_root_entries,
vfs_files_data,
}))
}
struct SpecifierStore {
data: IndexMap<Arc<Url>, SpecifierId>,
reverse: IndexMap<SpecifierId, Arc<Url>>,
}
impl SpecifierStore {
pub fn deserialize<'a>(
root_dir_url: &Url,
input: &'a [u8],
) -> std::io::Result<(&'a [u8], Self)> {
let (input, len) = read_u32_as_usize(input)?;
let mut data = IndexMap::with_capacity(len);
let mut reverse = IndexMap::with_capacity(len);
let mut input = input;
for _ in 0..len {
let (new_input, specifier_str) = read_string_lossy(input)?;
let specifier = match Url::parse(&specifier_str) {
Ok(url) => url,
Err(err) => match root_dir_url.join(&specifier_str) {
Ok(url) => url,
Err(_) => {
return Err(std::io::Error::new(
std::io::ErrorKind::InvalidData,
err,
));
}
},
};
let (new_input, id) = SpecifierId::deserialize(new_input)?;
let specifier = Arc::new(specifier);
data.insert(specifier.clone(), id);
reverse.insert(id, specifier);
input = new_input;
}
Ok((input, Self { data, reverse }))
}
pub fn get_id(&self, specifier: &Url) -> Option<SpecifierId> {
self.data.get(specifier).cloned()
}
pub fn get_specifier(&self, specifier_id: SpecifierId) -> Option<&Url> {
self.reverse.get(&specifier_id).map(|url| url.as_ref())
}
}
pub struct StandaloneModules {
modules: RemoteModulesStore,
vfs: Arc<FileBackedVfs>,
}
impl StandaloneModules {
pub fn resolve_specifier<'a>(
&'a self,
specifier: &'a Url,
) -> Result<Option<&'a Url>, TooManyRedirectsError> {
if specifier.scheme() == "file" {
Ok(Some(specifier))
} else {
self.modules.resolve_specifier(specifier)
}
}
pub fn has_file(&self, path: &Path) -> bool {
self.vfs.file_entry(path).is_ok()
}
pub fn read<'a>(
&'a self,
specifier: &'a Url,
) -> Result<Option<DenoCompileModuleData<'a>>, JsErrorBox> {
if specifier.scheme() == "file" {
let path = deno_path_util::url_to_file_path(specifier)
.map_err(JsErrorBox::from_err)?;
let mut transpiled = None;
let mut source_map = None;
let mut cjs_export_analysis = None;
let bytes = match self.vfs.file_entry(&path) {
Ok(entry) => {
let bytes = self
.vfs
.read_file_all(entry)
.map_err(JsErrorBox::from_err)?;
transpiled = entry
.transpiled_offset
.and_then(|t| self.vfs.read_file_offset_with_len(t).ok());
source_map = entry
.source_map_offset
.and_then(|t| self.vfs.read_file_offset_with_len(t).ok());
cjs_export_analysis = entry
.cjs_export_analysis_offset
.and_then(|t| self.vfs.read_file_offset_with_len(t).ok());
bytes
}
Err(err) if err.kind() == ErrorKind::NotFound => {
match RealFs.read_file_sync(&path, None) {
Ok(bytes) => bytes,
Err(FsError::Io(err)) if err.kind() == ErrorKind::NotFound => {
return Ok(None)
}
Err(err) => return Err(JsErrorBox::from_err(err)),
}
}
Err(err) => return Err(JsErrorBox::from_err(err)),
};
Ok(Some(DenoCompileModuleData {
media_type: MediaType::from_specifier(specifier),
specifier,
data: bytes,
transpiled,
source_map,
cjs_export_analysis,
}))
} else {
self.modules.read(specifier).map_err(JsErrorBox::from_err)
}
}
}
pub struct DenoCompileModuleData<'a> {
pub specifier: &'a Url,
pub media_type: MediaType,
pub data: Cow<'static, [u8]>,
pub transpiled: Option<Cow<'static, [u8]>>,
pub source_map: Option<Cow<'static, [u8]>>,
pub cjs_export_analysis: Option<Cow<'static, [u8]>>,
}
impl<'a> DenoCompileModuleData<'a> {
pub fn into_parts(self) -> (&'a Url, ModuleType, DenoCompileModuleSource) {
fn into_string_unsafe(data: Cow<'static, [u8]>) -> DenoCompileModuleSource {
match data {
Cow::Borrowed(d) => DenoCompileModuleSource::String(
// SAFETY: we know this is a valid utf8 string
unsafe { std::str::from_utf8_unchecked(d) },
),
Cow::Owned(d) => DenoCompileModuleSource::Bytes(Cow::Owned(d)),
}
}
let data = self.transpiled.unwrap_or(self.data);
let (media_type, source) = match self.media_type {
MediaType::JavaScript
| MediaType::Jsx
| MediaType::Mjs
| MediaType::Cjs
| MediaType::TypeScript
| MediaType::Mts
| MediaType::Cts
| MediaType::Dts
| MediaType::Dmts
| MediaType::Dcts
| MediaType::Tsx => (ModuleType::JavaScript, into_string_unsafe(data)),
MediaType::Json => (ModuleType::Json, into_string_unsafe(data)),
MediaType::Wasm => {
(ModuleType::Wasm, DenoCompileModuleSource::Bytes(data))
}
// just assume javascript if we made it here
MediaType::Css | MediaType::SourceMap | MediaType::Unknown => {
(ModuleType::JavaScript, DenoCompileModuleSource::Bytes(data))
}
};
(self.specifier, media_type, source)
}
}
pub enum DenoCompileModuleSource {
String(&'static str),
Bytes(Cow<'static, [u8]>),
}
impl DenoCompileModuleSource {
pub fn into_for_v8(self) -> ModuleSourceCode {
fn into_bytes(data: Cow<'static, [u8]>) -> ModuleSourceCode {
ModuleSourceCode::Bytes(match data {
Cow::Borrowed(d) => d.into(),
Cow::Owned(d) => d.into_boxed_slice().into(),
})
}
match self {
// todo(https://github.com/denoland/deno_core/pull/943): store whether
// the string is ascii or not ahead of time so we can avoid the is_ascii()
// check in FastString::from_static
Self::String(s) => ModuleSourceCode::String(FastString::from_static(s)),
Self::Bytes(b) => into_bytes(b),
}
}
}
#[derive(Debug, Error, JsError)]
#[class(generic)]
#[error("Too many redirects resolving: {0}")]
pub struct TooManyRedirectsError(Url);
pub struct RemoteModulesStore {
specifiers: SpecifierStore,
redirects: SpecifierDataStore<SpecifierId>,
remote_modules: SpecifierDataStore<RemoteModuleEntry<'static>>,
}
impl RemoteModulesStore {
fn new(
specifiers: SpecifierStore,
redirects: SpecifierDataStore<SpecifierId>,
remote_modules: SpecifierDataStore<RemoteModuleEntry<'static>>,
) -> Self {
Self {
specifiers,
redirects,
remote_modules,
}
}
pub fn resolve_specifier<'a>(
&'a self,
specifier: &'a Url,
) -> Result<Option<&'a Url>, TooManyRedirectsError> {
let Some(mut current) = self.specifiers.get_id(specifier) else {
return Ok(None);
};
let mut count = 0;
loop {
if count > 10 {
return Err(TooManyRedirectsError(specifier.clone()));
}
match self.redirects.get(current) {
Some(to) => {
current = *to;
count += 1;
}
None => {
if count == 0 {
return Ok(Some(specifier));
} else {
return Ok(self.specifiers.get_specifier(current));
}
}
}
}
}
pub fn read<'a>(
&'a self,
original_specifier: &'a Url,
) -> Result<Option<DenoCompileModuleData<'a>>, TooManyRedirectsError> {
#[allow(clippy::ptr_arg)]
fn handle_cow_ref(data: &Cow<'static, [u8]>) -> Cow<'static, [u8]> {
match data {
Cow::Borrowed(data) => Cow::Borrowed(data),
Cow::Owned(data) => {
// this variant should never happen because the data
// should always be borrowed static in denort
debug_assert!(false);
Cow::Owned(data.clone())
}
}
}
let mut count = 0;
let Some(mut specifier) = self.specifiers.get_id(original_specifier) else {
return Ok(None);
};
loop {
if count > 10 {
return Err(TooManyRedirectsError(original_specifier.clone()));
}
match self.redirects.get(specifier) {
Some(to) => {
specifier = *to;
count += 1;
}
None => {
let Some(entry) = self.remote_modules.get(specifier) else {
return Ok(None);
};
return Ok(Some(DenoCompileModuleData {
specifier: if count == 0 {
original_specifier
} else {
self.specifiers.get_specifier(specifier).unwrap()
},
media_type: entry.media_type,
data: handle_cow_ref(&entry.data),
transpiled: entry.maybe_transpiled.as_ref().map(handle_cow_ref),
source_map: entry.maybe_source_map.as_ref().map(handle_cow_ref),
cjs_export_analysis: entry
.maybe_cjs_export_analysis
.as_ref()
.map(handle_cow_ref),
}));
}
}
}
}
}
fn deserialize_npm_snapshot(
input: &[u8],
) -> Result<ValidSerializedNpmResolutionSnapshot, AnyError> {
fn parse_id(input: &[u8]) -> Result<(&[u8], NpmPackageId), AnyError> {
let (input, id) = read_string_lossy(input)?;
let id = NpmPackageId::from_serialized(&id)?;
Ok((input, id))
}
#[allow(clippy::needless_lifetimes)] // clippy bug
fn parse_root_package<'a>(
id_to_npm_id: &'a impl Fn(usize) -> Result<NpmPackageId, AnyError>,
) -> impl Fn(&[u8]) -> Result<(&[u8], (PackageReq, NpmPackageId)), AnyError> + 'a
{
|input| {
let (input, req) = read_string_lossy(input)?;
let req = PackageReq::from_str(&req)?;
let (input, id) = read_u32_as_usize(input)?;
Ok((input, (req, id_to_npm_id(id)?)))
}
}
#[allow(clippy::needless_lifetimes)] // clippy bug
fn parse_package_dep<'a>(
id_to_npm_id: &'a impl Fn(usize) -> Result<NpmPackageId, AnyError>,
) -> impl Fn(&[u8]) -> Result<(&[u8], (StackString, NpmPackageId)), AnyError> + 'a
{
|input| {
let (input, req) = read_string_lossy(input)?;
let (input, id) = read_u32_as_usize(input)?;
let req = StackString::from_cow(req);
Ok((input, (req, id_to_npm_id(id)?)))
}
}
fn parse_package<'a>(
input: &'a [u8],
id: NpmPackageId,
id_to_npm_id: &impl Fn(usize) -> Result<NpmPackageId, AnyError>,
) -> Result<(&'a [u8], SerializedNpmResolutionSnapshotPackage), AnyError> {
let (input, deps_len) = read_u32_as_usize(input)?;
let (input, dependencies) =
parse_hashmap_n_times(input, deps_len, parse_package_dep(id_to_npm_id))?;
Ok((
input,
SerializedNpmResolutionSnapshotPackage {
id,
system: Default::default(),
dist: Default::default(),
dependencies,
optional_dependencies: Default::default(),
bin: None,
scripts: Default::default(),
deprecated: Default::default(),
},
))
}
let (input, packages_len) = read_u32_as_usize(input)?;
// get a hashmap of all the npm package ids to their serialized ids
let (input, data_ids_to_npm_ids) =
parse_vec_n_times(input, packages_len, parse_id)
.context("deserializing id")?;
let data_id_to_npm_id = |id: usize| {
data_ids_to_npm_ids
.get(id)
.cloned()
.ok_or_else(|| deno_core::anyhow::anyhow!("Invalid npm package id"))
};
let (input, root_packages_len) = read_u32_as_usize(input)?;
let (input, root_packages) = parse_hashmap_n_times(
input,
root_packages_len,
parse_root_package(&data_id_to_npm_id),
)
.context("deserializing root package")?;
let (input, packages) =
parse_vec_n_times_with_index(input, packages_len, |input, index| {
parse_package(input, data_id_to_npm_id(index)?, &data_id_to_npm_id)
})
.context("deserializing package")?;
if !input.is_empty() {
bail!("Unexpected data left over");
}
Ok(
SerializedNpmResolutionSnapshot {
packages,
root_packages,
}
// this is ok because we have already verified that all the
// identifiers found in the snapshot are valid via the
// npm package id -> npm package id mapping
.into_valid_unsafe(),
)
}
fn parse_hashmap_n_times<TKey: std::cmp::Eq + std::hash::Hash, TValue>(
mut input: &[u8],
times: usize,
parse: impl Fn(&[u8]) -> Result<(&[u8], (TKey, TValue)), AnyError>,
) -> Result<(&[u8], HashMap<TKey, TValue>), AnyError> {
let mut results = HashMap::with_capacity(times);
for _ in 0..times {
let result = parse(input);
let (new_input, (key, value)) = result?;
results.insert(key, value);
input = new_input;
}
Ok((input, results))
}
fn parse_vec_n_times<TResult>(
input: &[u8],
times: usize,
parse: impl Fn(&[u8]) -> Result<(&[u8], TResult), AnyError>,
) -> Result<(&[u8], Vec<TResult>), AnyError> {
parse_vec_n_times_with_index(input, times, |input, _index| parse(input))
}
fn parse_vec_n_times_with_index<TResult>(
mut input: &[u8],
times: usize,
parse: impl Fn(&[u8], usize) -> Result<(&[u8], TResult), AnyError>,
) -> Result<(&[u8], Vec<TResult>), AnyError> {
let mut results = Vec::with_capacity(times);
for i in 0..times {
let result = parse(input, i);
let (new_input, result) = result?;
results.push(result);
input = new_input;
}
Ok((input, results))
}
fn read_bytes_with_u64_len(input: &[u8]) -> std::io::Result<(&[u8], &[u8])> {
let (input, len) = read_u64(input)?;
let (input, data) = read_bytes(input, len as usize)?;
Ok((input, data))
}
fn read_bytes_with_u32_len(input: &[u8]) -> std::io::Result<(&[u8], &[u8])> {
let (input, len) = read_u32_as_usize(input)?;
let (input, data) = read_bytes(input, len)?;
Ok((input, data))
}
fn read_bytes(input: &[u8], len: usize) -> std::io::Result<(&[u8], &[u8])> {
check_has_len(input, len)?;
let (len_bytes, input) = input.split_at(len);
Ok((input, len_bytes))
}
#[inline(always)]
fn check_has_len(input: &[u8], len: usize) -> std::io::Result<()> {
if input.len() < len {
Err(std::io::Error::new(
std::io::ErrorKind::InvalidData,
"Unexpected end of data",
))
} else {
Ok(())
}
}
fn read_string_lossy(input: &[u8]) -> std::io::Result<(&[u8], Cow<str>)> {
let (input, data_bytes) = read_bytes_with_u32_len(input)?;
Ok((input, String::from_utf8_lossy(data_bytes)))
}
fn read_u32_as_usize(input: &[u8]) -> std::io::Result<(&[u8], usize)> {
let (input, len_bytes) = read_bytes(input, 4)?;
let len = u32::from_le_bytes(len_bytes.try_into().unwrap());
Ok((input, len as usize))
}
fn read_u64(input: &[u8]) -> std::io::Result<(&[u8], u64)> {
let (input, len_bytes) = read_bytes(input, 8)?;
let len = u64::from_le_bytes(len_bytes.try_into().unwrap());
Ok((input, len))
}

cli/rt/build.rs (new file)

@ -0,0 +1,11 @@
// Copyright 2018-2025 the Deno authors. MIT license.
fn main() {
// Skip building from docs.rs.
if std::env::var_os("DOCS_RS").is_some() {
return;
}
deno_runtime::deno_napi::print_linker_flags("denort");
deno_runtime::deno_webgpu::print_linker_flags("denort");
}


@ -1,6 +1,5 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::io::BufReader;
use std::io::BufWriter;
@ -10,17 +9,15 @@ use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use deno_ast::ModuleSpecifier;
use deno_core::anyhow::bail;
use deno_core::error::AnyError;
use deno_core::parking_lot::Mutex;
use deno_core::unsync::sync::AtomicFlag;
use deno_lib::util::hash::FastInsecureHasher;
use deno_path_util::get_atomic_path;
use deno_runtime::code_cache::CodeCache;
use deno_runtime::code_cache::CodeCacheType;
use crate::cache::FastInsecureHasher;
use crate::worker::CliCodeCache;
use url::Url;
enum CodeCacheStrategy {
FirstRun(FirstRunCodeCacheStrategy),
@ -76,12 +73,27 @@ impl DenoCompileCodeCache {
}
}
}
pub fn for_deno_core(self: Arc<Self>) -> Arc<dyn CodeCache> {
self.clone()
}
pub fn enabled(&self) -> bool {
match &self.strategy {
CodeCacheStrategy::FirstRun(strategy) => {
!strategy.is_finished.is_raised()
}
CodeCacheStrategy::SubsequentRun(strategy) => {
!strategy.is_finished.is_raised()
}
}
}
}
impl CodeCache for DenoCompileCodeCache {
fn get_sync(
&self,
specifier: &ModuleSpecifier,
specifier: &Url,
code_cache_type: CodeCacheType,
source_hash: u64,
) -> Option<Vec<u8>> {
@ -106,7 +118,7 @@ impl CodeCache for DenoCompileCodeCache {
fn set_sync(
&self,
specifier: ModuleSpecifier,
specifier: Url,
code_cache_type: CodeCacheType,
source_hash: u64,
bytes: &[u8],
@ -152,23 +164,6 @@ impl CodeCache for DenoCompileCodeCache {
}
}
impl CliCodeCache for DenoCompileCodeCache {
fn enabled(&self) -> bool {
match &self.strategy {
CodeCacheStrategy::FirstRun(strategy) => {
!strategy.is_finished.is_raised()
}
CodeCacheStrategy::SubsequentRun(strategy) => {
!strategy.is_finished.is_raised()
}
}
}
fn as_code_cache(self: Arc<Self>) -> Arc<dyn CodeCache> {
self
}
}
type CodeCacheKey = (String, CodeCacheType);
struct FirstRunCodeCacheData {
@ -216,7 +211,7 @@ struct SubsequentRunCodeCacheStrategy {
impl SubsequentRunCodeCacheStrategy {
fn take_from_cache(
&self,
specifier: &ModuleSpecifier,
specifier: &Url,
code_cache_type: CodeCacheType,
source_hash: u64,
) -> Option<Vec<u8>> {
@ -395,8 +390,6 @@ fn deserialize_with_reader<T: Read>(
#[cfg(test)]
mod test {
use std::fs::File;
use test_util::TempDir;
use super::*;
@ -463,8 +456,8 @@ mod test {
fn code_cache() {
let temp_dir = TempDir::new();
let file_path = temp_dir.path().join("cache.bin").to_path_buf();
let url1 = ModuleSpecifier::parse("https://deno.land/example1.js").unwrap();
let url2 = ModuleSpecifier::parse("https://deno.land/example2.js").unwrap();
let url1 = Url::parse("https://deno.land/example1.js").unwrap();
let url2 = Url::parse("https://deno.land/example2.js").unwrap();
// first run
{
let code_cache = DenoCompileCodeCache::new(file_path.clone(), 1234);

cli/rt/file_system.rs (new file)
File diff suppressed because it is too large


@ -0,0 +1,5 @@
// Copyright 2018-2025 the Deno authors. MIT license.
pub fn main() {
// this file exists to cause the executable to be built when running cargo test
}


@ -1,46 +1,27 @@
// Copyright 2018-2025 the Deno authors. MIT license.
// Allow unused code warnings because we share
// code between the two bin targets.
#![allow(dead_code)]
#![allow(unused_imports)]
mod standalone;
mod args;
mod cache;
mod emit;
mod errors;
mod file_fetcher;
mod http_util;
mod js;
mod node;
mod npm;
mod resolver;
mod shared;
mod sys;
mod task_runner;
mod util;
mod version;
mod worker;
use std::borrow::Cow;
use std::collections::HashMap;
use std::env;
use std::env::current_exe;
use std::sync::Arc;
use deno_core::error::generic_error;
use deno_core::error::AnyError;
use deno_core::error::JsError;
use deno_core::error::CoreError;
use deno_lib::util::result::any_and_jserrorbox_downcast_ref;
use deno_lib::version::otel_runtime_config;
use deno_runtime::deno_telemetry::OtelConfig;
use deno_runtime::fmt_errors::format_js_error;
use deno_runtime::tokio_util::create_and_run_current_thread_with_maybe_metrics;
pub use deno_runtime::UNSTABLE_GRANULAR_FLAGS;
use deno_terminal::colors;
use indexmap::IndexMap;
use standalone::DenoCompileFileSystem;
use crate::args::Flags;
use self::binary::extract_standalone;
use self::file_system::DenoRtSys;
mod binary;
mod code_cache;
mod file_system;
mod node;
mod run;
pub(crate) fn unstable_exit_cb(feature: &str, api_name: &str) {
log::error!(
@ -65,8 +46,10 @@ fn unwrap_or_exit<T>(result: Result<T, AnyError>) -> T {
Err(error) => {
let mut error_string = format!("{:?}", error);
if let Some(e) = error.downcast_ref::<JsError>() {
error_string = format_js_error(e);
if let Some(CoreError::Js(js_error)) =
any_and_jserrorbox_downcast_ref::<CoreError>(&error)
{
error_string = format_js_error(js_error);
}
exit_with_message(&error_string, 1);
@ -85,27 +68,26 @@ fn load_env_vars(env_vars: &IndexMap<String, String>) {
fn main() {
deno_runtime::deno_permissions::mark_standalone();
let args: Vec<_> = env::args_os().collect();
let standalone = standalone::extract_standalone(Cow::Owned(args));
let standalone = extract_standalone(Cow::Owned(args));
let future = async move {
match standalone {
Ok(Some(data)) => {
deno_telemetry::init(
crate::args::otel_runtime_config(),
deno_runtime::deno_telemetry::init(
otel_runtime_config(),
&data.metadata.otel_config,
)?;
util::logger::init(
init_logging(
data.metadata.log_level,
Some(data.metadata.otel_config.clone()),
);
load_env_vars(&data.metadata.env_vars_from_env_file);
let fs = DenoCompileFileSystem::new(data.vfs.clone());
let sys = crate::sys::CliSys::DenoCompile(fs.clone());
let exit_code = standalone::run(Arc::new(fs), sys, data).await?;
let sys = DenoRtSys::new(data.vfs.clone());
let exit_code = run::run(Arc::new(sys.clone()), sys, data).await?;
deno_runtime::exit(exit_code);
}
Ok(None) => Ok(()),
Err(err) => {
util::logger::init(None, None);
init_logging(None, None);
Err(err)
}
}
@ -113,3 +95,15 @@ fn main() {
unwrap_or_exit(create_and_run_current_thread_with_maybe_metrics(future));
}
fn init_logging(
maybe_level: Option<log::Level>,
otel_config: Option<OtelConfig>,
) {
deno_lib::util::logger::init(deno_lib::util::logger::InitLoggingOptions {
maybe_level,
otel_config,
on_log_start: || {},
on_log_end: || {},
})
}

cli/rt/node.rs (new file)

@ -0,0 +1,165 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::sync::Arc;
use deno_core::url::Url;
use deno_error::JsErrorBox;
use deno_lib::loader::NpmModuleLoader;
use deno_lib::standalone::binary::CjsExportAnalysisEntry;
use deno_media_type::MediaType;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_resolver::npm::NpmReqResolver;
use deno_runtime::deno_fs::FileSystem;
use deno_runtime::deno_node::RealIsBuiltInNodeModuleChecker;
use node_resolver::analyze::CjsAnalysis;
use node_resolver::analyze::CjsAnalysisExports;
use node_resolver::analyze::NodeCodeTranslator;
use crate::binary::StandaloneModules;
use crate::file_system::DenoRtSys;
pub type DenoRtCjsTracker =
deno_resolver::cjs::CjsTracker<DenoInNpmPackageChecker, DenoRtSys>;
pub type DenoRtNpmResolver = deno_resolver::npm::NpmResolver<DenoRtSys>;
pub type DenoRtNpmModuleLoader = NpmModuleLoader<
CjsCodeAnalyzer,
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
DenoRtNpmResolver,
DenoRtSys,
>;
pub type DenoRtNodeCodeTranslator = NodeCodeTranslator<
CjsCodeAnalyzer,
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
DenoRtNpmResolver,
DenoRtSys,
>;
pub type DenoRtNodeResolver = deno_runtime::deno_node::NodeResolver<
DenoInNpmPackageChecker,
DenoRtNpmResolver,
DenoRtSys,
>;
pub type DenoRtNpmReqResolver = NpmReqResolver<
DenoInNpmPackageChecker,
RealIsBuiltInNodeModuleChecker,
DenoRtNpmResolver,
DenoRtSys,
>;
pub struct CjsCodeAnalyzer {
cjs_tracker: Arc<DenoRtCjsTracker>,
modules: Arc<StandaloneModules>,
sys: DenoRtSys,
}
impl CjsCodeAnalyzer {
pub fn new(
cjs_tracker: Arc<DenoRtCjsTracker>,
modules: Arc<StandaloneModules>,
sys: DenoRtSys,
) -> Self {
Self {
cjs_tracker,
modules,
sys,
}
}
fn inner_cjs_analysis<'a>(
&self,
specifier: &Url,
source: Cow<'a, str>,
) -> Result<CjsAnalysis<'a>, JsErrorBox> {
let media_type = MediaType::from_specifier(specifier);
if media_type == MediaType::Json {
return Ok(CjsAnalysis::Cjs(CjsAnalysisExports {
exports: vec![],
reexports: vec![],
}));
}
let cjs_tracker = self.cjs_tracker.clone();
let is_maybe_cjs = cjs_tracker
.is_maybe_cjs(specifier, media_type)
.map_err(JsErrorBox::from_err)?;
let analysis = if is_maybe_cjs {
let data = self
.modules
.read(specifier)?
.and_then(|d| d.cjs_export_analysis);
match data {
Some(data) => {
let data: CjsExportAnalysisEntry = bincode::deserialize(&data)
.map_err(|err| JsErrorBox::generic(err.to_string()))?;
match data {
CjsExportAnalysisEntry::Esm => {
cjs_tracker.set_is_known_script(specifier, false);
CjsAnalysis::Esm(source)
}
CjsExportAnalysisEntry::Cjs(analysis) => {
cjs_tracker.set_is_known_script(specifier, true);
CjsAnalysis::Cjs(analysis)
}
}
}
None => {
if log::log_enabled!(log::Level::Debug) {
if self.sys.is_specifier_in_vfs(specifier) {
log::debug!(
"No CJS export analysis was stored for '{}'. Assuming ESM. This might indicate a bug in Deno.",
specifier
);
} else {
log::debug!(
"Analyzing potentially CommonJS files is not supported at runtime in a compiled executable ({}). Assuming ESM.",
specifier
);
}
}
// assume ESM as we don't have access to swc here
CjsAnalysis::Esm(source)
}
}
} else {
CjsAnalysis::Esm(source)
};
Ok(analysis)
}
}
#[async_trait::async_trait(?Send)]
impl node_resolver::analyze::CjsCodeAnalyzer for CjsCodeAnalyzer {
async fn analyze_cjs<'a>(
&self,
specifier: &Url,
source: Option<Cow<'a, str>>,
) -> Result<CjsAnalysis<'a>, JsErrorBox> {
let source = match source {
Some(source) => source,
None => {
if let Ok(path) = deno_path_util::url_to_file_path(specifier) {
// todo(dsherret): should this use the sync method instead?
if let Ok(source_from_file) =
self.sys.read_text_file_lossy_async(path, None).await
{
source_from_file
} else {
return Ok(CjsAnalysis::Cjs(CjsAnalysisExports {
exports: vec![],
reexports: vec![],
}));
}
} else {
return Ok(CjsAnalysis::Cjs(CjsAnalysisExports {
exports: vec![],
reexports: vec![],
}));
}
}
};
self.inner_cjs_analysis(specifier, source)
}
}

cli/rt/run.rs (new file)

@ -0,0 +1,990 @@
// Copyright 2018-2025 the Deno authors. MIT license.
use std::borrow::Cow;
use std::path::PathBuf;
use std::rc::Rc;
use std::sync::Arc;
use std::sync::OnceLock;
use deno_cache_dir::npm::NpmCacheDir;
use deno_config::workspace::MappedResolution;
use deno_config::workspace::ResolverWorkspaceJsrPackage;
use deno_config::workspace::WorkspaceResolver;
use deno_core::error::AnyError;
use deno_core::error::ModuleLoaderError;
use deno_core::futures::future::LocalBoxFuture;
use deno_core::futures::FutureExt;
use deno_core::url::Url;
use deno_core::v8_set_flags;
use deno_core::FastString;
use deno_core::FeatureChecker;
use deno_core::ModuleLoader;
use deno_core::ModuleSourceCode;
use deno_core::ModuleType;
use deno_core::RequestedModuleType;
use deno_core::ResolutionKind;
use deno_core::SourceCodeCacheInfo;
use deno_error::JsErrorBox;
use deno_lib::args::get_root_cert_store;
use deno_lib::args::npm_pkg_req_ref_to_binary_command;
use deno_lib::args::CaData;
use deno_lib::args::RootCertStoreLoadError;
use deno_lib::loader::NpmModuleLoader;
use deno_lib::npm::create_npm_process_state_provider;
use deno_lib::npm::NpmRegistryReadPermissionChecker;
use deno_lib::npm::NpmRegistryReadPermissionCheckerMode;
use deno_lib::standalone::binary::NodeModules;
use deno_lib::util::hash::FastInsecureHasher;
use deno_lib::util::text_encoding::from_utf8_lossy_cow;
use deno_lib::util::text_encoding::from_utf8_lossy_owned;
use deno_lib::util::v8::construct_v8_flags;
use deno_lib::worker::CreateModuleLoaderResult;
use deno_lib::worker::LibMainWorkerFactory;
use deno_lib::worker::LibMainWorkerOptions;
use deno_lib::worker::ModuleLoaderFactory;
use deno_lib::worker::StorageKeyResolver;
use deno_media_type::MediaType;
use deno_npm::npm_rc::ResolvedNpmRc;
use deno_npm::resolution::NpmResolutionSnapshot;
use deno_package_json::PackageJsonDepValue;
use deno_resolver::cjs::CjsTracker;
use deno_resolver::cjs::IsCjsResolutionMode;
use deno_resolver::npm::managed::ManagedInNpmPkgCheckerCreateOptions;
use deno_resolver::npm::managed::ManagedNpmResolverCreateOptions;
use deno_resolver::npm::managed::NpmResolutionCell;
use deno_resolver::npm::ByonmNpmResolverCreateOptions;
use deno_resolver::npm::CreateInNpmPkgCheckerOptions;
use deno_resolver::npm::DenoInNpmPackageChecker;
use deno_resolver::npm::NpmReqResolver;
use deno_resolver::npm::NpmReqResolverOptions;
use deno_resolver::npm::NpmResolver;
use deno_resolver::npm::NpmResolverCreateOptions;
use deno_runtime::code_cache::CodeCache;
use deno_runtime::deno_fs::FileSystem;
use deno_runtime::deno_node::create_host_defined_options;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_node::RealIsBuiltInNodeModuleChecker;
use deno_runtime::deno_permissions::Permissions;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_tls::rustls::RootCertStore;
use deno_runtime::deno_tls::RootCertStoreProvider;
use deno_runtime::deno_web::BlobStore;
use deno_runtime::permissions::RuntimePermissionDescriptorParser;
use deno_runtime::WorkerExecutionMode;
use deno_runtime::WorkerLogLevel;
use deno_semver::npm::NpmPackageReqReference;
use node_resolver::analyze::NodeCodeTranslator;
use node_resolver::errors::ClosestPkgJsonError;
use node_resolver::NodeResolutionKind;
use node_resolver::NodeResolver;
use node_resolver::PackageJsonResolver;
use node_resolver::ResolutionMode;
use crate::binary::DenoCompileModuleSource;
use crate::binary::StandaloneData;
use crate::binary::StandaloneModules;
use crate::code_cache::DenoCompileCodeCache;
use crate::file_system::DenoRtSys;
use crate::file_system::FileBackedVfs;
use crate::node::CjsCodeAnalyzer;
use crate::node::DenoRtCjsTracker;
use crate::node::DenoRtNodeCodeTranslator;
use crate::node::DenoRtNodeResolver;
use crate::node::DenoRtNpmModuleLoader;
use crate::node::DenoRtNpmReqResolver;
struct SharedModuleLoaderState {
cjs_tracker: Arc<DenoRtCjsTracker>,
code_cache: Option<Arc<DenoCompileCodeCache>>,
modules: Arc<StandaloneModules>,
node_code_translator: Arc<DenoRtNodeCodeTranslator>,
node_resolver: Arc<DenoRtNodeResolver>,
npm_module_loader: Arc<DenoRtNpmModuleLoader>,
npm_registry_permission_checker: NpmRegistryReadPermissionChecker<DenoRtSys>,
npm_req_resolver: Arc<DenoRtNpmReqResolver>,
vfs: Arc<FileBackedVfs>,
workspace_resolver: WorkspaceResolver,
}
impl SharedModuleLoaderState {
fn get_code_cache(
&self,
specifier: &Url,
source: &[u8],
) -> Option<SourceCodeCacheInfo> {
let Some(code_cache) = &self.code_cache else {
return None;
};
if !code_cache.enabled() {
return None;
}
// deno version is already included in the root cache key
let hash = FastInsecureHasher::new_without_deno_version()
.write_hashable(source)
.finish();
let data = code_cache.get_sync(
specifier,
deno_runtime::code_cache::CodeCacheType::EsModule,
hash,
);
Some(SourceCodeCacheInfo {
hash,
data: data.map(Cow::Owned),
})
}
}
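`get_code_cache` above keys v8 code-cache entries on a non-cryptographic hash of the module source, so an entry is reused only while the source is byte-identical. A minimal standalone sketch of that lookup pattern, using `DefaultHasher` and a `HashMap` as hypothetical stand-ins for `FastInsecureHasher` and the on-disk cache:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

// Stand-in for FastInsecureHasher: the key only needs to change when the
// source changes, not to be cryptographically strong.
fn source_hash(source: &[u8]) -> u64 {
  let mut hasher = DefaultHasher::new();
  source.hash(&mut hasher);
  hasher.finish()
}

fn main() {
  // Hypothetical cache keyed on (specifier, source hash).
  let mut cache: HashMap<(String, u64), Vec<u8>> = HashMap::new();
  let source = b"export const a = 1;";
  let hash = source_hash(source);
  cache.insert(("file:///main.ts".to_string(), hash), vec![0xde, 0xad]);
  // Unchanged source hits; changed source misses, which is how stale
  // code-cache entries are implicitly invalidated.
  assert!(cache.contains_key(&("file:///main.ts".to_string(), hash)));
  let changed = source_hash(b"export const a = 2;");
  assert!(!cache.contains_key(&("file:///main.ts".to_string(), changed)));
}
```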
#[derive(Clone)]
struct EmbeddedModuleLoader {
shared: Arc<SharedModuleLoaderState>,
}
impl std::fmt::Debug for EmbeddedModuleLoader {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("EmbeddedModuleLoader").finish()
}
}
impl ModuleLoader for EmbeddedModuleLoader {
fn resolve(
&self,
raw_specifier: &str,
referrer: &str,
kind: ResolutionKind,
) -> Result<Url, ModuleLoaderError> {
let referrer = if referrer == "." {
if kind != ResolutionKind::MainModule {
return Err(
JsErrorBox::generic(format!(
"Expected to resolve main module, got {:?} instead.",
kind
))
.into(),
);
}
let current_dir = std::env::current_dir().unwrap();
deno_core::resolve_path(".", &current_dir)?
} else {
Url::parse(referrer).map_err(|err| {
JsErrorBox::type_error(format!(
"Referrer uses invalid specifier: {}",
err
))
})?
};
let referrer_kind = if self
.shared
.cjs_tracker
.is_maybe_cjs(&referrer, MediaType::from_specifier(&referrer))
.map_err(JsErrorBox::from_err)?
{
ResolutionMode::Require
} else {
ResolutionMode::Import
};
if self.shared.node_resolver.in_npm_package(&referrer) {
return Ok(
self
.shared
.node_resolver
.resolve(
raw_specifier,
&referrer,
referrer_kind,
NodeResolutionKind::Execution,
)
.map_err(JsErrorBox::from_err)?
.into_url(),
);
}
let mapped_resolution = self
.shared
.workspace_resolver
.resolve(raw_specifier, &referrer);
match mapped_resolution {
Ok(MappedResolution::WorkspaceJsrPackage { specifier, .. }) => {
Ok(specifier)
}
Ok(MappedResolution::WorkspaceNpmPackage {
target_pkg_json: pkg_json,
sub_path,
..
}) => Ok(
self
.shared
.node_resolver
.resolve_package_subpath_from_deno_module(
pkg_json.dir_path(),
sub_path.as_deref(),
Some(&referrer),
referrer_kind,
NodeResolutionKind::Execution,
)
.map_err(JsErrorBox::from_err)?,
),
Ok(MappedResolution::PackageJson {
dep_result,
sub_path,
alias,
..
}) => match dep_result
.as_ref()
.map_err(|e| JsErrorBox::from_err(e.clone()))?
{
PackageJsonDepValue::Req(req) => self
.shared
.npm_req_resolver
.resolve_req_with_sub_path(
req,
sub_path.as_deref(),
&referrer,
referrer_kind,
NodeResolutionKind::Execution,
)
.map_err(|e| JsErrorBox::from_err(e).into()),
PackageJsonDepValue::Workspace(version_req) => {
let pkg_folder = self
.shared
.workspace_resolver
.resolve_workspace_pkg_json_folder_for_pkg_json_dep(
alias,
version_req,
)
.map_err(JsErrorBox::from_err)?;
Ok(
self
.shared
.node_resolver
.resolve_package_subpath_from_deno_module(
pkg_folder,
sub_path.as_deref(),
Some(&referrer),
referrer_kind,
NodeResolutionKind::Execution,
)
.map_err(JsErrorBox::from_err)?,
)
}
},
Ok(MappedResolution::Normal { specifier, .. })
| Ok(MappedResolution::ImportMap { specifier, .. }) => {
if let Ok(reference) =
NpmPackageReqReference::from_specifier(&specifier)
{
return Ok(
self
.shared
.npm_req_resolver
.resolve_req_reference(
&reference,
&referrer,
referrer_kind,
NodeResolutionKind::Execution,
)
.map_err(JsErrorBox::from_err)?,
);
}
if specifier.scheme() == "jsr" {
if let Some(specifier) = self
.shared
.modules
.resolve_specifier(&specifier)
.map_err(JsErrorBox::from_err)?
{
return Ok(specifier.clone());
}
}
Ok(
self
.shared
.node_resolver
.handle_if_in_node_modules(&specifier)
.unwrap_or(specifier),
)
}
Err(err)
if err.is_unmapped_bare_specifier() && referrer.scheme() == "file" =>
{
let maybe_res = self
.shared
.npm_req_resolver
.resolve_if_for_npm_pkg(
raw_specifier,
&referrer,
referrer_kind,
NodeResolutionKind::Execution,
)
.map_err(JsErrorBox::from_err)?;
if let Some(res) = maybe_res {
return Ok(res.into_url());
}
Err(JsErrorBox::from_err(err).into())
}
Err(err) => Err(JsErrorBox::from_err(err).into()),
}
}
fn get_host_defined_options<'s>(
&self,
scope: &mut deno_core::v8::HandleScope<'s>,
name: &str,
) -> Option<deno_core::v8::Local<'s, deno_core::v8::Data>> {
let name = Url::parse(name).ok()?;
if self.shared.node_resolver.in_npm_package(&name) {
Some(create_host_defined_options(scope))
} else {
None
}
}
fn load(
&self,
original_specifier: &Url,
maybe_referrer: Option<&Url>,
_is_dynamic: bool,
_requested_module_type: RequestedModuleType,
) -> deno_core::ModuleLoadResponse {
if original_specifier.scheme() == "data" {
let data_url_text =
match deno_media_type::data_url::RawDataUrl::parse(original_specifier)
.and_then(|url| url.decode())
{
Ok(response) => response,
Err(err) => {
return deno_core::ModuleLoadResponse::Sync(Err(
JsErrorBox::type_error(format!("{:#}", err)).into(),
));
}
};
return deno_core::ModuleLoadResponse::Sync(Ok(
deno_core::ModuleSource::new(
deno_core::ModuleType::JavaScript,
ModuleSourceCode::String(data_url_text.into()),
original_specifier,
None,
),
));
}
if self.shared.node_resolver.in_npm_package(original_specifier) {
let shared = self.shared.clone();
let original_specifier = original_specifier.clone();
let maybe_referrer = maybe_referrer.cloned();
return deno_core::ModuleLoadResponse::Async(
async move {
let code_source = shared
.npm_module_loader
.load(&original_specifier, maybe_referrer.as_ref())
.await
.map_err(JsErrorBox::from_err)?;
let code_cache_entry = shared.get_code_cache(
&code_source.found_url,
code_source.code.as_bytes(),
);
Ok(deno_core::ModuleSource::new_with_redirect(
match code_source.media_type {
MediaType::Json => ModuleType::Json,
_ => ModuleType::JavaScript,
},
code_source.code,
&original_specifier,
&code_source.found_url,
code_cache_entry,
))
}
.boxed_local(),
);
}
match self.shared.modules.read(original_specifier) {
Ok(Some(module)) => {
let media_type = module.media_type;
let (module_specifier, module_type, module_source) =
module.into_parts();
let is_maybe_cjs = match self
.shared
.cjs_tracker
.is_maybe_cjs(original_specifier, media_type)
{
Ok(is_maybe_cjs) => is_maybe_cjs,
Err(err) => {
return deno_core::ModuleLoadResponse::Sync(Err(
JsErrorBox::type_error(format!("{:?}", err)).into(),
));
}
};
if is_maybe_cjs {
let original_specifier = original_specifier.clone();
let module_specifier = module_specifier.clone();
let shared = self.shared.clone();
deno_core::ModuleLoadResponse::Async(
async move {
let source = match module_source {
DenoCompileModuleSource::String(string) => {
Cow::Borrowed(string)
}
DenoCompileModuleSource::Bytes(module_code_bytes) => {
match module_code_bytes {
Cow::Owned(bytes) => {
Cow::Owned(from_utf8_lossy_owned(bytes))
}
Cow::Borrowed(bytes) => String::from_utf8_lossy(bytes),
}
}
};
let source = shared
.node_code_translator
.translate_cjs_to_esm(&module_specifier, Some(source))
.await
.map_err(JsErrorBox::from_err)?;
let module_source = match source {
Cow::Owned(source) => ModuleSourceCode::String(source.into()),
Cow::Borrowed(source) => {
ModuleSourceCode::String(FastString::from_static(source))
}
};
let code_cache_entry = shared
.get_code_cache(&module_specifier, module_source.as_bytes());
Ok(deno_core::ModuleSource::new_with_redirect(
module_type,
module_source,
&original_specifier,
&module_specifier,
code_cache_entry,
))
}
.boxed_local(),
)
} else {
let module_source = module_source.into_for_v8();
let code_cache_entry = self
.shared
.get_code_cache(module_specifier, module_source.as_bytes());
deno_core::ModuleLoadResponse::Sync(Ok(
deno_core::ModuleSource::new_with_redirect(
module_type,
module_source,
original_specifier,
module_specifier,
code_cache_entry,
),
))
}
}
Ok(None) => deno_core::ModuleLoadResponse::Sync(Err(
JsErrorBox::type_error(format!(
"Module not found: {}",
original_specifier
))
.into(),
)),
Err(err) => deno_core::ModuleLoadResponse::Sync(Err(
JsErrorBox::type_error(format!("{:?}", err)).into(),
)),
}
}
fn code_cache_ready(
&self,
specifier: Url,
source_hash: u64,
code_cache_data: &[u8],
) -> LocalBoxFuture<'static, ()> {
if let Some(code_cache) = &self.shared.code_cache {
code_cache.set_sync(
specifier,
deno_runtime::code_cache::CodeCacheType::EsModule,
source_hash,
code_cache_data,
);
}
std::future::ready(()).boxed_local()
}
fn get_source_map(&self, file_name: &str) -> Option<Cow<[u8]>> {
let url = Url::parse(file_name).ok()?;
let data = self.shared.modules.read(&url).ok()??;
data.source_map
}
fn get_source_mapped_source_line(
&self,
file_name: &str,
line_number: usize,
) -> Option<String> {
let specifier = Url::parse(file_name).ok()?;
let data = self.shared.modules.read(&specifier).ok()??;
let source = String::from_utf8_lossy(&data.data);
// Do NOT use .lines(): it skips the terminating empty line
// (because it internally uses .split_terminator() instead of .split()).
let lines: Vec<&str> = source.split('\n').collect();
if line_number >= lines.len() {
Some(format!(
"{} Couldn't format source line: Line {} is out of bounds (source may have changed at runtime)",
crate::colors::yellow("Warning"), line_number + 1,
))
} else {
Some(lines[line_number].to_string())
}
}
}
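The comment in `get_source_mapped_source_line` above is the key detail: `str::lines()` has `split_terminator` semantics, so a trailing `"\n"` yields no final empty element, which would throw off 0-based line indexing against source maps. A standalone sketch of the difference:

```rust
fn main() {
  let source = "line0\nline1\n";
  // .lines() drops the element after the terminating newline.
  let via_lines: Vec<&str> = source.lines().collect();
  assert_eq!(via_lines, vec!["line0", "line1"]);
  // .split('\n') keeps the trailing empty line, so indexes stay aligned
  // with editors and source maps that count it.
  let via_split: Vec<&str> = source.split('\n').collect();
  assert_eq!(via_split, vec!["line0", "line1", ""]);
}
```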
impl NodeRequireLoader for EmbeddedModuleLoader {
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn deno_runtime::deno_node::NodePermissions,
path: &'a std::path::Path,
) -> Result<Cow<'a, std::path::Path>, JsErrorBox> {
if self.shared.modules.has_file(path) {
// allow reading if the file is in the snapshot
return Ok(Cow::Borrowed(path));
}
self
.shared
.npm_registry_permission_checker
.ensure_read_permission(permissions, path)
.map_err(JsErrorBox::from_err)
}
fn load_text_file_lossy(
&self,
path: &std::path::Path,
) -> Result<Cow<'static, str>, JsErrorBox> {
let file_entry = self
.shared
.vfs
.file_entry(path)
.map_err(JsErrorBox::from_err)?;
let file_bytes = self
.shared
.vfs
.read_file_offset_with_len(
file_entry.transpiled_offset.unwrap_or(file_entry.offset),
)
.map_err(JsErrorBox::from_err)?;
Ok(from_utf8_lossy_cow(file_bytes))
}
fn is_maybe_cjs(&self, specifier: &Url) -> Result<bool, ClosestPkgJsonError> {
let media_type = MediaType::from_specifier(specifier);
self.shared.cjs_tracker.is_maybe_cjs(specifier, media_type)
}
}
struct StandaloneModuleLoaderFactory {
shared: Arc<SharedModuleLoaderState>,
}
impl StandaloneModuleLoaderFactory {
pub fn create_result(&self) -> CreateModuleLoaderResult {
let loader = Rc::new(EmbeddedModuleLoader {
shared: self.shared.clone(),
});
CreateModuleLoaderResult {
module_loader: loader.clone(),
node_require_loader: loader,
}
}
}
impl ModuleLoaderFactory for StandaloneModuleLoaderFactory {
fn create_for_main(
&self,
_root_permissions: PermissionsContainer,
) -> CreateModuleLoaderResult {
self.create_result()
}
fn create_for_worker(
&self,
_parent_permissions: PermissionsContainer,
_permissions: PermissionsContainer,
) -> CreateModuleLoaderResult {
self.create_result()
}
}
struct StandaloneRootCertStoreProvider {
ca_stores: Option<Vec<String>>,
ca_data: Option<CaData>,
cell: OnceLock<Result<RootCertStore, RootCertStoreLoadError>>,
}
impl RootCertStoreProvider for StandaloneRootCertStoreProvider {
fn get_or_try_init(&self) -> Result<&RootCertStore, JsErrorBox> {
self
.cell
// get_or_try_init was not stable yet when this was written
.get_or_init(|| {
get_root_cert_store(None, self.ca_stores.clone(), self.ca_data.clone())
})
.as_ref()
.map_err(|err| JsErrorBox::from_err(err.clone()))
}
}
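`StandaloneRootCertStoreProvider` stores a whole `Result` inside a `OnceLock` and calls `get_or_init`, emulating the then-unstable `OnceLock::get_or_try_init`. A minimal sketch of the pattern with a hypothetical fallible initializer (`Lazy` and the string value are illustrative, not from the source):

```rust
use std::sync::OnceLock;

struct Lazy {
  cell: OnceLock<Result<String, String>>,
}

impl Lazy {
  // Emulates get_or_try_init by caching the whole Result. Note one
  // difference: a failure is also computed once and cached, whereas the
  // real get_or_try_init leaves the cell empty on error and would retry.
  fn get_or_try_init(&self) -> Result<&String, &String> {
    self
      .cell
      .get_or_init(|| Ok("loaded root certs".to_string()))
      .as_ref()
  }
}

fn main() {
  let lazy = Lazy { cell: OnceLock::new() };
  assert_eq!(lazy.get_or_try_init().unwrap().as_str(), "loaded root certs");
  // A second call returns the cached value without re-running the closure.
  assert!(lazy.get_or_try_init().is_ok());
}
```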
pub async fn run(
fs: Arc<dyn FileSystem>,
sys: DenoRtSys,
data: StandaloneData,
) -> Result<i32, AnyError> {
let StandaloneData {
metadata,
modules,
npm_snapshot,
root_path,
vfs,
} = data;
let root_cert_store_provider = Arc::new(StandaloneRootCertStoreProvider {
ca_stores: metadata.ca_stores,
ca_data: metadata.ca_data.map(CaData::Bytes),
cell: Default::default(),
});
// use a dummy npm registry url
let npm_registry_url = Url::parse("https://localhost/").unwrap();
let root_dir_url = Arc::new(Url::from_directory_path(&root_path).unwrap());
let main_module = root_dir_url.join(&metadata.entrypoint_key).unwrap();
let npm_global_cache_dir = root_path.join(".deno_compile_node_modules");
let pkg_json_resolver = Arc::new(PackageJsonResolver::new(sys.clone()));
let npm_registry_permission_checker = {
let mode = match &metadata.node_modules {
Some(NodeModules::Managed {
node_modules_dir: Some(path),
}) => NpmRegistryReadPermissionCheckerMode::Local(PathBuf::from(path)),
Some(NodeModules::Byonm { .. }) => {
NpmRegistryReadPermissionCheckerMode::Byonm
}
Some(NodeModules::Managed {
node_modules_dir: None,
})
| None => NpmRegistryReadPermissionCheckerMode::Global(
npm_global_cache_dir.clone(),
),
};
NpmRegistryReadPermissionChecker::new(sys.clone(), mode)
};
let (in_npm_pkg_checker, npm_resolver) = match metadata.node_modules {
Some(NodeModules::Managed { node_modules_dir }) => {
// create an npmrc that uses the fake npm_registry_url to resolve packages
let npmrc = Arc::new(ResolvedNpmRc {
default_config: deno_npm::npm_rc::RegistryConfigWithUrl {
registry_url: npm_registry_url.clone(),
config: Default::default(),
},
scopes: Default::default(),
registry_configs: Default::default(),
});
let npm_cache_dir = Arc::new(NpmCacheDir::new(
&sys,
npm_global_cache_dir,
npmrc.get_all_known_registries_urls(),
));
let snapshot = npm_snapshot.unwrap();
let maybe_node_modules_path = node_modules_dir
.map(|node_modules_dir| root_path.join(node_modules_dir));
let in_npm_pkg_checker =
DenoInNpmPackageChecker::new(CreateInNpmPkgCheckerOptions::Managed(
ManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: npm_cache_dir.root_dir_url(),
maybe_node_modules_path: maybe_node_modules_path.as_deref(),
},
));
let npm_resolution =
Arc::new(NpmResolutionCell::new(NpmResolutionSnapshot::new(snapshot)));
let npm_resolver = NpmResolver::<DenoRtSys>::new::<DenoRtSys>(
NpmResolverCreateOptions::Managed(ManagedNpmResolverCreateOptions {
npm_resolution,
npm_cache_dir,
sys: sys.clone(),
maybe_node_modules_path,
npm_system_info: Default::default(),
npmrc,
}),
);
(in_npm_pkg_checker, npm_resolver)
}
Some(NodeModules::Byonm {
root_node_modules_dir,
}) => {
let root_node_modules_dir =
root_node_modules_dir.map(|p| vfs.root().join(p));
let in_npm_pkg_checker =
DenoInNpmPackageChecker::new(CreateInNpmPkgCheckerOptions::Byonm);
let npm_resolver = NpmResolver::<DenoRtSys>::new::<DenoRtSys>(
NpmResolverCreateOptions::Byonm(ByonmNpmResolverCreateOptions {
sys: sys.clone(),
pkg_json_resolver: pkg_json_resolver.clone(),
root_node_modules_dir,
}),
);
(in_npm_pkg_checker, npm_resolver)
}
None => {
// Packages from different registries are already inlined in the binary,
// so no need to create actual `.npmrc` configuration.
let npmrc = create_default_npmrc();
let npm_cache_dir = Arc::new(NpmCacheDir::new(
&sys,
npm_global_cache_dir,
npmrc.get_all_known_registries_urls(),
));
let in_npm_pkg_checker =
DenoInNpmPackageChecker::new(CreateInNpmPkgCheckerOptions::Managed(
ManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: npm_cache_dir.root_dir_url(),
maybe_node_modules_path: None,
},
));
let npm_resolution = Arc::new(NpmResolutionCell::default());
let npm_resolver = NpmResolver::<DenoRtSys>::new::<DenoRtSys>(
NpmResolverCreateOptions::Managed(ManagedNpmResolverCreateOptions {
npm_resolution,
sys: sys.clone(),
npm_cache_dir,
maybe_node_modules_path: None,
npm_system_info: Default::default(),
npmrc: create_default_npmrc(),
}),
);
(in_npm_pkg_checker, npm_resolver)
}
};
let has_node_modules_dir = npm_resolver.root_node_modules_path().is_some();
let node_resolver = Arc::new(NodeResolver::new(
in_npm_pkg_checker.clone(),
RealIsBuiltInNodeModuleChecker,
npm_resolver.clone(),
pkg_json_resolver.clone(),
sys.clone(),
node_resolver::ConditionsFromResolutionMode::default(),
));
let cjs_tracker = Arc::new(CjsTracker::new(
in_npm_pkg_checker.clone(),
pkg_json_resolver.clone(),
if metadata.unstable_config.detect_cjs {
IsCjsResolutionMode::ImplicitTypeCommonJs
} else if metadata.workspace_resolver.package_jsons.is_empty() {
IsCjsResolutionMode::Disabled
} else {
IsCjsResolutionMode::ExplicitTypeCommonJs
},
));
let npm_req_resolver = Arc::new(NpmReqResolver::new(NpmReqResolverOptions {
sys: sys.clone(),
in_npm_pkg_checker: in_npm_pkg_checker.clone(),
node_resolver: node_resolver.clone(),
npm_resolver: npm_resolver.clone(),
}));
let cjs_esm_code_analyzer =
CjsCodeAnalyzer::new(cjs_tracker.clone(), modules.clone(), sys.clone());
let node_code_translator = Arc::new(NodeCodeTranslator::new(
cjs_esm_code_analyzer,
in_npm_pkg_checker,
node_resolver.clone(),
npm_resolver.clone(),
pkg_json_resolver.clone(),
sys.clone(),
));
let workspace_resolver = {
let import_map = match metadata.workspace_resolver.import_map {
Some(import_map) => Some(
import_map::parse_from_json_with_options(
root_dir_url.join(&import_map.specifier).unwrap(),
&import_map.json,
import_map::ImportMapOptions {
address_hook: None,
expand_imports: true,
},
)?
.import_map,
),
None => None,
};
let pkg_jsons = metadata
.workspace_resolver
.package_jsons
.into_iter()
.map(|(relative_path, json)| {
let path = root_dir_url
.join(&relative_path)
.unwrap()
.to_file_path()
.unwrap();
let pkg_json =
deno_package_json::PackageJson::load_from_value(path, json);
Arc::new(pkg_json)
})
.collect();
WorkspaceResolver::new_raw(
root_dir_url.clone(),
import_map,
metadata
.workspace_resolver
.jsr_pkgs
.iter()
.map(|pkg| ResolverWorkspaceJsrPackage {
is_patch: false, // only used for enhancing the diagnostic, which isn't shown in deno compile
base: root_dir_url.join(&pkg.relative_base).unwrap(),
name: pkg.name.clone(),
version: pkg.version.clone(),
exports: pkg.exports.clone(),
})
.collect(),
pkg_jsons,
metadata.workspace_resolver.pkg_json_resolution,
)
};
let code_cache = match metadata.code_cache_key {
Some(code_cache_key) => Some(Arc::new(DenoCompileCodeCache::new(
root_path.with_file_name(format!(
"{}.cache",
root_path.file_name().unwrap().to_string_lossy()
)),
code_cache_key,
))),
None => {
log::debug!("Code cache disabled.");
None
}
};
let module_loader_factory = StandaloneModuleLoaderFactory {
shared: Arc::new(SharedModuleLoaderState {
cjs_tracker: cjs_tracker.clone(),
code_cache: code_cache.clone(),
modules,
node_code_translator: node_code_translator.clone(),
node_resolver: node_resolver.clone(),
npm_module_loader: Arc::new(NpmModuleLoader::new(
cjs_tracker.clone(),
node_code_translator,
sys.clone(),
)),
npm_registry_permission_checker,
npm_req_resolver,
vfs: vfs.clone(),
workspace_resolver,
}),
};
let permissions = {
let mut permissions = metadata.permissions;
// grant read access to the vfs
match &mut permissions.allow_read {
Some(vec) if vec.is_empty() => {
// do nothing, already granted
}
Some(vec) => {
vec.push(root_path.to_string_lossy().to_string());
}
None => {
permissions.allow_read =
Some(vec![root_path.to_string_lossy().to_string()]);
}
}
let desc_parser =
Arc::new(RuntimePermissionDescriptorParser::new(sys.clone()));
let permissions =
Permissions::from_options(desc_parser.as_ref(), &permissions)?;
PermissionsContainer::new(desc_parser, permissions)
};
let feature_checker = Arc::new({
let mut checker = FeatureChecker::default();
checker.set_exit_cb(Box::new(crate::unstable_exit_cb));
for feature in metadata.unstable_config.features {
// `metadata` is valid for the whole lifetime of the program, so we
// can leak the string here.
checker.enable_feature(feature.leak());
}
checker
});
let lib_main_worker_options = LibMainWorkerOptions {
argv: metadata.argv,
log_level: WorkerLogLevel::Info,
enable_op_summary_metrics: false,
enable_testing_features: false,
has_node_modules_dir,
inspect_brk: false,
inspect_wait: false,
strace_ops: None,
is_inspecting: false,
skip_op_registration: true,
location: metadata.location,
argv0: NpmPackageReqReference::from_specifier(&main_module)
.ok()
.map(|req_ref| npm_pkg_req_ref_to_binary_command(&req_ref))
.or(std::env::args().next()),
node_debug: std::env::var("NODE_DEBUG").ok(),
origin_data_folder_path: None,
seed: metadata.seed,
unsafely_ignore_certificate_errors: metadata
.unsafely_ignore_certificate_errors,
node_ipc: None,
serve_port: None,
serve_host: None,
otel_config: metadata.otel_config,
startup_snapshot: deno_snapshots::CLI_SNAPSHOT,
};
let worker_factory = LibMainWorkerFactory::new(
Arc::new(BlobStore::default()),
code_cache.map(|c| c.for_deno_core()),
feature_checker,
fs,
None,
Box::new(module_loader_factory),
node_resolver.clone(),
create_npm_process_state_provider(&npm_resolver),
pkg_json_resolver,
root_cert_store_provider,
StorageKeyResolver::empty(),
sys.clone(),
lib_main_worker_options,
);
// Initialize v8 once from the main thread.
v8_set_flags(construct_v8_flags(&[], &metadata.v8_flags, vec![]));
// TODO(bartlomieju): remove last argument once Deploy no longer needs it
deno_core::JsRuntime::init_platform(None, true);
let main_module = match NpmPackageReqReference::from_specifier(&main_module) {
Ok(package_ref) => {
let pkg_folder = npm_resolver.resolve_pkg_folder_from_deno_module_req(
package_ref.req(),
&deno_path_util::url_from_file_path(&vfs.root().join("package.json"))?,
)?;
worker_factory
.resolve_npm_binary_entrypoint(&pkg_folder, package_ref.sub_path())?
}
Err(_) => main_module,
};
let mut worker = worker_factory.create_main_worker(
WorkerExecutionMode::Run,
permissions,
main_module,
)?;
let exit_code = worker.run().await?;
Ok(exit_code)
}
fn create_default_npmrc() -> Arc<ResolvedNpmRc> {
// this is fine because multiple registries are combined into
// one when compiling the binary
Arc::new(ResolvedNpmRc {
default_config: deno_npm::npm_rc::RegistryConfigWithUrl {
registry_url: Url::parse("https://registry.npmjs.org").unwrap(),
config: Default::default(),
},
scopes: Default::default(),
registry_configs: Default::default(),
})
}

cli/snapshot/Cargo.toml (new file)

@@ -0,0 +1,20 @@
# Copyright 2018-2025 the Deno authors. MIT license.
[package]
name = "deno_snapshots"
version = "0.1.0"
authors.workspace = true
edition.workspace = true
license.workspace = true
readme = "README.md"
repository.workspace = true
description = "v8 snapshot used by the Deno CLI"
[lib]
path = "lib.rs"
[features]
disable = []
[build-dependencies]
deno_runtime = { workspace = true, features = ["include_js_files_for_snapshotting", "only_snapshotted_js_sources", "snapshotting"] }

cli/snapshot/README.md (new file)

@@ -0,0 +1,3 @@
# deno_snapshots
v8 snapshot used in the Deno CLI.

Some files were not shown because too many files have changed in this diff.