mirror of https://github.com/denoland/deno.git synced 2025-02-12 16:59:32 -05:00

Compare commits


75 commits

Author SHA1 Message Date
Yoshiya Hinosawa
5da84658db
Merge branch 'main' into addCtimeToStats 2024-11-13 11:53:16 +09:00
Divy Srivastava
43812ee8ff
fix(ext/node): process.getBuiltinModule (#26833)
Closes https://github.com/denoland/deno/issues/26832
2024-11-13 08:02:09 +05:30
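
A minimal sketch of the API this restores (the `node:os` argument is just an illustrative choice; any built-in specifier works the same way):

```ts
// Loads a built-in module at runtime; returns undefined for unknown names.
import process from "node:process";

const os = process.getBuiltinModule("node:os");
console.log(os?.platform());
```
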
David Sherret
119910f339
fix(permissions): say to use --allow-run instead of --allow-all (#26842)
For https://github.com/denoland/deno/issues/26839
2024-11-12 17:14:19 -05:00
Richard Carson
01f3451869
chore: make fields public on PermissionDeniedError and deno_kv::KvConfig (#26798)
A few small changes to avoid needing unsafe mem transmutes to
instantiate the extensions

---------

Signed-off-by: Richard Carson <Rscarson@rogers.com>
2024-11-12 17:49:49 +00:00
Nathan Whitaker
c371b2a492
fix(install): re-setup bin entries after running lifecycle scripts (#26752)
Fixes #26677

Some packages (like supabase) declare bin entries that don't exist until
lifecycle scripts are run. For instance, the lifecycle script downloads
a binary file which serves as a bin entrypoint.

Unfortunately you can't just defer setting up the bin entries until
after lifecycle scripts have run, because the scripts may rely on them.

I looked into this, and PNPM just re-links bin entries after running
lifecycle scripts. I think that's about the best we can do as well.

Note that we'll only re-setup bin entries for packages whose lifecycle
scripts we run. This should limit the performance cost, as typically a
given project will not have many lifecycle scripts (and of those, many
of them probably don't have bin entries to set up).
2024-11-12 09:23:39 -08:00
Soc Virnyl S. Estela
15b6baff33
chore: update zeromq to 0.4.1 (#26811)
Closes: #26810

---------

Signed-off-by: Soc Virnyl Estela <contact@uncomfyhalomacro.pl>
Co-authored-by: Divy Srivastava <dj.srivastava23@gmail.com>
Co-authored-by: Bartek Iwańczuk <biwanczuk@gmail.com>
2024-11-12 17:25:59 +01:00
Divy Srivastava
7179bdcc77
fix(ext/node): handle --allow-sys=inspector (#26836)
`op_inspector_open` checks for "inspector" as one of the allowed sys
values.
2024-11-12 16:55:49 +01:00
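
A hedged sketch of what the narrower flag now permits, assuming `node:inspector`'s `open()` is what reaches `op_inspector_open` (see the inspector commit further down); the file name and port are illustrative:

```ts
// Run with: deno run --allow-sys=inspector main.ts
import inspector from "node:inspector";

inspector.open(9229, "127.0.0.1"); // "inspector" is now recognized as an --allow-sys value
console.log(inspector.url());
inspector.close();
```
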
Nayeem Rahman
7d326c269c
fix(lsp): skip code action edits that can't be converted (#26831) 2024-11-12 13:15:32 +00:00
Divy Srivastava
3b99f6833c
fix(ext/websocket): initialize error attribute of WebSocket ErrorEvent (#26796)
Fixes https://github.com/denoland/deno/issues/26216

Not required by the spec but Discord.js depends on it, see
https://github.com/denoland/deno/issues/26216#issuecomment-2466060306
2024-11-12 17:10:07 +05:30
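
A small sketch of what the fix makes observable (the endpoint URL is a placeholder; before the fix, `error` on the event was left unset):

```ts
const ws = new WebSocket("ws://localhost:8080");
ws.addEventListener("error", (ev) => {
  // The underlying error is now attached to the ErrorEvent.
  console.log((ev as ErrorEvent).error);
});
```
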
Yoshiya Hinosawa
c3c2b37966
fix(ext/node): add autoSelectFamily option to net.createConnection (#26661) 2024-11-12 19:54:47 +09:00
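
A sketch of the newly plumbed-through option (host and port are placeholders):

```ts
import net from "node:net";

const socket = net.createConnection({
  host: "example.com",
  port: 80,
  autoSelectFamily: true, // Happy Eyeballs: try IPv6 and IPv4 when both resolve
});
socket.on("connect", () => socket.end());
socket.on("error", (err) => console.error(err.message));
```
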
Satya Rohith
90236d67c5
fix(ext/http): prefer brotli for accept-encoding: gzip, deflate, br, zstd (#26814)
Closes https://github.com/denoland/deno/issues/26813
2024-11-12 12:10:41 +05:30
Yoshiya Hinosawa
b955d03740
test(ext/node): prevent running the same test cases twice (#26812) 2024-11-11 17:16:33 +09:00
Leo Kettmeir
bfc143a5ac
fix(ext/webstorage): use error class for sqlite error case (#26806)
Fixes #26797
2024-11-10 02:43:04 -08:00
denobot
e1b40a69c0
chore: forward v2.0.6 release commit to main (#26804)
This is the release commit being forwarded back to main for 2.0.6

Signed-off-by: Divy Srivastava <dj.srivastava23@gmail.com>
Co-authored-by: Divy Srivastava <dj.srivastava23@gmail.com>
2024-11-10 13:12:18 +05:30
Divy Srivastava
ce778a947e
Revert "perf(upgrade): cache downloaded binaries in DENO_DIR" (#26799)
Reverts denoland/deno#26108

Tests are flaky on main
01de331742
2024-11-10 09:00:44 +05:30
Bartek Iwańczuk
01de331742
perf(upgrade): cache downloaded binaries in DENO_DIR (#26108)
Co-authored-by: Divy Srivastava <dj.srivastava23@gmail.com>
2024-11-09 15:19:46 +00:00
snek
73fbd61bd0
fix: performance.timeOrigin (#26787)
`performance.timeOrigin` was being set from when JS started executing,
but `op_now` measures from an `std::time::Instant` stored in `OpState`,
which is created at a completely different time. This caused
`performance.timeOrigin` to be very incorrect. This PR corrects the
origin and also cleans up some of the timer code.

Compared to `Date.now()`, `performance`'s time origin is now
consistently within 5us (0.005ms) of system time.


![image](https://github.com/user-attachments/assets/0a7be04a-4f6d-4816-bd25-38a2e6136926)
2024-11-08 23:20:24 +01:00
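
A quick way to observe the corrected origin; per the description above, the reconstructed wall-clock time should now track `Date.now()` to within a few microseconds:

```ts
const skewMs = Date.now() - (performance.timeOrigin + performance.now());
console.log(`timeOrigin skew vs Date.now(): ${skewMs.toFixed(3)} ms`);
```
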
Nathan Whitaker
d4f1bd3dac
fix(install): cache jsr deps from all workspace config files (#26779)
Fixes #26772.

I wasn't aware that the `imports()` method only returned the workspace
root imports.
2024-11-08 12:45:30 -08:00
Divy Srivastava
b482a50299
feat(ext/http): abort event when request is cancelled (#26781)
```js
Deno.serve(async (req) => {
  const { promise, resolve } = Promise.withResolvers<void>();

  req.signal.addEventListener("abort", () => {
    resolve();
  });

  await promise;

  return new Response("Ok");
});
```
2024-11-08 18:46:11 +05:30
Divy Srivastava
637b1d5508
fix(ext/cache): don't panic when creating cache (#26780) 2024-11-08 12:27:29 +05:30
Nathan Whitaker
bf82c6697a
chore: make commandWithCwdIsAsync test less flaky (#26770) 2024-11-07 15:02:33 -08:00
Divy Srivastava
b9262130fe
feat(ext/http): abort signal when request is cancelled (#26761)
Closes https://github.com/denoland/deno/issues/21653
2024-11-07 17:12:13 +05:30
Nathan Whitaker
742744d498
chore: serve node headers from a test server to fix flaky node-gyp test (#26749)
Fixes https://github.com/denoland/deno/issues/24749

Runs a server that just returns the header tarball and checksum, and
sets the `NODEJS_ORG_MIRROR` env var so that `node-gyp` uses it instead
of `nodejs.org`
2024-11-06 19:52:46 -08:00
Leo Kettmeir
1cab4f07a3
refactor: use concrete error type for remaining ops (#26746) 2024-11-06 16:57:57 -08:00
Kaveh
db53ec230d
refactor(ext/net): Use hickory dns instead of unmaintained trust-dns (#26741)
This PR replaces the unmaintained and rebranded `trust-dns` to `hickory`
for resolver in `deno_net`.
2024-11-06 15:49:32 -08:00
Satya Rohith
b3a3d84ce2
fix(node:zlib): gzip & gzipSync should accept ArrayBuffer (#26762)
Closes https://github.com/denoland/deno/issues/26638
2024-11-06 15:12:24 +01:00
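
A sketch of the input type this enables (a plain `ArrayBuffer` rather than a `Buffer` or typed array):

```ts
import { gunzipSync, gzipSync } from "node:zlib";

const buf = new Uint8Array([104, 101, 108, 108, 111]).buffer; // "hello" as an ArrayBuffer
const compressed = gzipSync(buf);
console.log(gunzipSync(compressed).toString()); // "hello"
```
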
snek
700f54a13c
fix(ext/node): better inspector support (#26471)
implement local inspector

future changes:
- wire up InspectorServer to enable open/close/url
- wire up connectToMainThread

Fixes https://github.com/denoland/deno/issues/25004
2024-11-06 14:08:26 +00:00
Bartek Iwańczuk
64e887083a
fix(fmt): don't use self-closing tags in HTML (#26754)
Closes https://github.com/denoland/deno/issues/26748
2024-11-06 13:56:03 +01:00
Nayeem Rahman
5088b25f23
feat(lsp): auto-import completions from byonm dependencies (#26680) 2024-11-06 06:26:46 +00:00
denobot
ef7432c03f
chore: forward v2.0.5 release commit to main (#26755)
This is the release commit being forwarded back to main for 2.0.5

Co-authored-by: bartlomieju <bartlomieju@users.noreply.github.com>
2024-11-06 02:27:14 +01:00
Divy Srivastava
770ef14600
chore: upgrade publish workflow to ubuntu 24 (#26731) 2024-11-05 13:10:45 +05:30
Divy Srivastava
4861108592
fix: panic_hook hangs without procfs (#26732)
Fixes https://github.com/denoland/deno/issues/26701

Ref
69e491353f
2024-11-05 13:10:23 +05:30
Mohammad Sulaiman
89f0b796bd
chore: deprecate run itests (#26444) 2024-11-05 06:39:05 +00:00
Nathan Whitaker
f9a05068d6
fix(install): handle invalid function error, and fallback to junctions regardless of the error (#26730)
Fixes #26116.

Handle the new error and treat it as lacking permission to make
symlinks, but also, to make this more robust, always fall back to
junctions no matter what the actual error is. Instead, warn if the error
isn't one we've handled, but go on to attempt creating the junction.
2024-11-05 04:16:53 +00:00
Divy Srivastava
383cb85a73
fix: op_run_microtasks crash (#26718)
Upgrade deno_core to 0.318.0

Fixes https://github.com/denoland/deno_core/issues/951
Fixes https://github.com/denoland/deno/issues/26468
2024-11-05 09:13:54 +05:30
Nathan Whitaker
706b1dfcea
fix(add): better error message when adding package that only has pre-release versions (#26724)
Fixes https://github.com/denoland/deno/issues/26597

A small refactor as well to reduce some code duplication
2024-11-05 02:45:00 +00:00
Nathan Whitaker
44eca0505c
chore: fix serve_watch_all test (#26725)
It's been failing a ton lately; it looks like the test is just
incorrectly using TS syntax in a JS file:
https://github.com/denoland/deno/actions/runs/11672972415/job/32502710624?pr=26724#step:43:2791

I'm not really sure how this ever passed.
2024-11-05 01:09:17 +00:00
Bartek Iwańczuk
25ed90baae
ci: use self-hosted mac arm runner for building on tags (#26727) 2024-11-05 00:56:09 +01:00
Bartek Iwańczuk
051552172c
fix(workspace): support wildcard packages (#26568)
This commit adds support for wildcard packages in `workspace`
configuration option in `deno.json`. This is now supported:
```
{
  "workspace": [
    "./packages/*"
  ]
}
```

Closes https://github.com/denoland/deno/issues/25783
2024-11-05 00:42:18 +01:00
Nathan Whitaker
84aee0be9a
fix(ext/node): add findSourceMap to the default export of node:module (#26720)
Next.js 15.0.2 tries to use this and errors out
2024-11-04 21:08:29 +00:00
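
A minimal check of the fix (access through the default export, which is the path Next.js takes):

```ts
import module from "node:module";

console.log(typeof module.findSourceMap); // "function" after this fix
```
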
David Sherret
bb3ca84e6d
fix(fmt): do not panic for jsx ignore container followed by jsx text (#26723) 2024-11-04 21:40:05 +01:00
Nayeem Rahman
d67765b0b4
fix(lsp): scope attribution for lazily loaded assets (#26699) 2024-11-04 20:01:31 +00:00
Scott Twiname
d55e30f418
fix(types): missing import permission on PermissionOptionsObject (#26627) 2024-11-04 18:20:34 +00:00
Bartek Iwańczuk
9a39a98b57
fix(fmt): ignore file directive for YAML files (#26717)
Closes https://github.com/denoland/deno/issues/26712

Support `# deno-fmt-ignore-file` directive for YAML files.

Also added tests for single line ignores.
2024-11-04 17:57:29 +00:00
Leo Kettmeir
fe9f0ee593
refactor(runtime/permissions): use concrete error types (#26464) 2024-11-04 09:17:21 -08:00
Kenta Moriuchi
fb1d33a711
chore: update dlint to v0.68.0 for internal (#26711) 2024-11-04 12:17:11 -05:00
Nayeem Rahman
d95f06f20b
perf(lsp): don't walk coverage directory (#26715) 2024-11-04 16:36:21 +00:00
Nathan Whitaker
2c8a0e7917
fix(add): only add npm deps to package.json if it's at least as close as deno.json (#26683)
Fixes https://github.com/denoland/deno/issues/26653
2024-11-01 19:10:35 -07:00
David Sherret
826e42a5b5
fix: improved support for cjs and cts modules (#26558)
* cts support
* better cjs/cts type checking
* deno compile cjs/cts support
* More efficient detect cjs (going towards stabilization)
* Determination of whether .js, .ts, .jsx, or .tsx is cjs or esm is only
done after loading
* Support `import x = require(...);` (see the sketch after this entry)

Co-authored-by: Bartek Iwańczuk <biwanczuk@gmail.com>
2024-11-01 12:27:00 -04:00
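
A sketch of the `import x = require(...)` form noted in the list above, as it might look in a `.cts` module (the file name and required specifier are illustrative):

```ts
// example.cts — CommonJS TypeScript using the now-supported import-equals syntax.
import path = require("node:path");

export = function joined(): string {
  return path.join("deno", "cts");
};
```
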
Divy Srivastava
4774eab64d
chore: upgrade to rust 1.82 and LLVM 19 (#26615)
Upgrade to rust 1.82 and LLVM 19. Removes one webusb test because
`requestAdapter` is not working on the new ubuntu 24 runners.
2024-11-01 16:13:02 +05:30
Nathan Whitaker
04ae1a5517
fix(cli): set npm_config_user_agent when running npm packages or tasks (#26639)
Fixes #25342.

Still not sure on the exact user agent to set (should it include
`node`?).

After this PR, here's the state of running some `create-*` packages
(just ones I could think of off the top of my head):

| package                  | prints/runs/suggests deno install | notes |
| ---------------- | ------------- | ------ |
| `create-next-app` |  | falls back to npm, needs a PR ([code](c32e280209/packages/create-next-app/helpers/get-pkg-manager.ts (L3))) |
| `sv create` |  | uses `package-manager-detector`, needs a PR ([code](https://github.com/antfu-collective/package-manager-detector/tree/main)) |
| `create-qwik` |  | runs `deno install` but suggests `deno start` which doesn't work (should be `deno task start` or `deno run start`) |
| `create-astro` |  | runs `deno install` but suggests `npm run dev` later in output, probably needs a PR |
| `nuxi init` |  | deno not an option in dialog, needs a PR ([code](f04e2e8944/src/commands/init.ts (L96-L102))) |
| `create-react-app` |  | uses npm |
| `ng new` (`@angular/cli`) |  | uses npm |
| `create-vite` |  | suggests working deno commands 🎉 |
| `create-solid` |  | suggests npm commands, needs PR |

It's possible that fixing `package-manager-detector` or other packages
might make some of these just work, but I haven't looked too carefully
at each.
2024-10-31 22:19:19 -07:00
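
A hedged sketch of where the value becomes visible (inside a `deno task` or an npm package run by Deno; requires env access, and the exact user agent string isn't reproduced here):

```ts
// Either access path works inside the spawned task/package process.
import process from "node:process";

console.log(Deno.env.get("npm_config_user_agent"));
console.log(process.env.npm_config_user_agent);
```
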
Nathan Whitaker
6c6bbeb974
fix(node): Implement os.userInfo properly, add missing toPrimitive (#24702)
Fixes the implementation of `os.userInfo`, and adds a missing
`toPrimitive` for `tmpdir`. This allows us to enable the corresponding
node_compat test.
2024-10-31 22:18:33 -07:00
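
A sketch exercising both parts of the fix (per the commit, `os.tmpdir` gains the `Symbol.toPrimitive` hook Node defines, so the uncalled function stringifies to the directory path):

```ts
import os from "node:os";

console.log(os.userInfo().username); // now returns the real user record
console.log(`${os.tmpdir}`);         // Symbol.toPrimitive coerces to the tmpdir path
```
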
Nathan Whitaker
6d44952d4d
fix(ext/node): resolve exports even if parent module filename isn't present (#26553)
Fixes https://github.com/denoland/deno/issues/26505

I'm not exactly sure how this case comes about (I tried to write tests
for it but couldn't manage to reproduce it), but what happens is the
parent filename ends up null, and we bail out of resolving the specifier
in package exports.

I've checked, and in node the parent filename is also null (so that's
not a bug on our part), but node continues to resolve even in that case.
So this PR should match node's behavior more closely than we currently
do.
2024-10-31 10:02:31 -07:00
Nathan Whitaker
951103fc8d
fix(ext/node): convert errors from fs.readFile/fs.readFileSync to node format (#26632)
Fixes the original issue reported in #26404. storybook runs into other
errors after this PR (the new errors will be fixed in other PRs).

Some code used by a dependency of storybook does a [string comparison on
the error
message](ce30b2be34/node-src/lib/getConfiguration.ts (L88-L92))
thrown here to check for a file not found error.
2024-10-31 10:02:17 -07:00
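
A sketch of the kind of check the node-formatted errors make possible (similar in spirit to the string comparison linked above; the file name is a placeholder):

```ts
import fs from "node:fs";

fs.readFile("does-not-exist.json", (err, _data) => {
  if (err) {
    // Node-style error: a code plus an "ENOENT: no such file or directory, open '...'" message.
    console.log(err.code, err.message);
  }
});
```
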
David Sherret
90edca21a2
fix: surface package.json location on dep parse failure (#26665)
Related: https://github.com/denoland/deno/issues/26653
2024-10-31 15:35:17 +00:00
Taku Amano
50ea707b58
fix(coverage): exclude comment lines from coverage reports (#25939) 2024-10-31 23:20:26 +09:00
Pig Fang
56f25af2c7
fix(fmt): fix several HTML and components issues (#26654)
Fix #26245 
Close #26324 
Fix #26508 
Fix #26540 
Fix #26562

---------

Co-authored-by: Bartek Iwańczuk <biwanczuk@gmail.com>
2024-10-31 14:50:58 +01:00
Nayeem Rahman
2f0c25d33f
fix(lsp): include unstable features from editor settings (#26655) 2024-10-31 10:52:43 +00:00
Divy Srivastava
8bfd134da6
fix: clamp smi in fast calls by default (#26506)
Fixes https://github.com/denoland/deno/issues/26480

Ref
d2945fb65b
2024-10-31 10:10:07 +05:30
David Sherret
a8846eb70f
fix: remove permission check in op_require_node_module_paths (#26645) 2024-10-30 20:43:22 +00:00
Nathan Whitaker
a346071dd2
fix(ext/node): return this from http.Server.ref/unref() (#26647)
Fixes https://github.com/denoland/deno/issues/26642
2024-10-30 20:13:32 +00:00
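
A sketch of the chaining this restores (port 0 picks a free port; purely illustrative):

```ts
import http from "node:http";

const server = http.createServer((_req, res) => res.end("ok"));
server.listen(0, () => {
  server.unref().ref(); // both now return the server instead of undefined
  server.close();
});
```
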
HasanAlrimawi
5f5ac40631
fix(serve): support serve hmr (#26078)
This PR addresses issue #25600 

Changes:
Updated `fn has_hmr` to check the `serve` subcommand and return its hmr
value if found, in order to run the worker in serve mode with the
hmr_runner. Thus the hmr event can be dispatched when the served file
changes.
2024-10-30 10:32:18 -07:00
McSneaky
1431ffa9f8
docs(console): Update docstrings for install and uninstall (#26623)
When running `deno -h`, the `install` and `uninstall` subcommands still
had descriptions left over from the Deno 1 days :)
2024-10-30 16:32:04 +01:00
denobot
a1473d82c5
chore: forward v2.0.4 release commit to main (#26636)
This is the release commit being forwarded back to main for 2.0.4

Co-authored-by: bartlomieju <bartlomieju@users.noreply.github.com>
2024-10-30 13:46:31 +01:00
Marvin Hagemeister
c8d229dbf0
fix(install): percent encodings in interactive progress bar (#26600)
Fixes percent encodings showing up when installing scoped packages via
`deno add` or `deno install`. The issue is caused by us trying to map
the package name back from the resolved http url. This has never worked
with private registries. The proper solution would be to pass the
original specifier in here, but that's a bit of a bigger refactor.

So for now the quickest workaround is to replace `%2f` back to `/`.

Fixes https://github.com/denoland/deno/issues/26576
2024-10-29 23:35:42 +00:00
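
The quick workaround described above, in isolation (the package name is illustrative):

```ts
const fromUrl = "@std%2fyaml"; // scope separator came back percent-encoded
console.log(fromUrl.replace(/%2f/gi, "/")); // "@std/yaml"
```
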
HasanAlrimawi
3b28446000
fix: support watch flag to enable watching other files than the main module on serve subcommand (#26622)
Closes #26618
2024-10-29 22:55:41 +01:00
Yoshiya Hinosawa
a69224ea5b
Revert "fix(ext/node): fix dns.lookup result ordering (#26264)" (#26621)
This reverts commit d59599fc18.

Closes #26588
2024-10-29 18:41:16 +01:00
Volker Schlecht
51978a7654
fix(ext/napi): export dynamic symbols list for {Free,Open}BSD (#26605)
The two BSD ports are reusing the Linux code here.
2024-10-29 19:41:49 +05:30
Volker Schlecht
efb5e912e5
fix(ext/node): compatibility with {Free,Open}BSD (#26604)
Ports for both BSDs contain patches to the same effect.
See
https://github.com/freebsd/freebsd-ports/blob/main/www/deno/files/patch-ext_node_ops_fs.rs
and
8644910cae/lang/deno/patches/patch-ext_node_ops_fs_rs
2024-10-29 19:40:32 +05:30
Yoshiya Hinosawa
aa2a354190
refactor(init): inline routing in deno init --serve template (#26595) 2024-10-29 14:37:21 +09:00
Bartek Iwańczuk
46e5ed1a64
Revert "fix(ext/node): use primordials in ext/node/polyfills/https.ts (#26323)" (#26613)
…s` (#26323)"

This reverts commit afb33b3c25.

Reverting because it caused a regression -
https://github.com/denoland/deno/issues/26612.

Closes https://github.com/denoland/deno/issues/26612.
2024-10-29 00:41:02 +00:00
Bartek Iwańczuk
484f8ca9c3
fix: provide hints in terminal errors for Node.js globals (#26610)
Add info/hint for terminal errors related to Node.js globals:
- __filename
- __dirname
- Buffer
- global
- setImmediate
- clearImmediate

Closes https://github.com/denoland/deno/issues/17494
2024-10-29 00:55:51 +01:00
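
A hedged sketch of the explicit alternatives the new hints presumably point toward (the hint wording itself isn't reproduced here; `import.meta.filename`/`dirname` are the Deno-side stand-ins):

```ts
import { Buffer } from "node:buffer";
import process from "node:process";
import { clearImmediate, setImmediate } from "node:timers";

const __filename = import.meta.filename; // CJS-style globals via import.meta
const __dirname = import.meta.dirname;
const global = globalThis;               // `global` maps to globalThis

console.log(Buffer.from("hi").length, process.pid, __filename, __dirname, global === globalThis);
clearImmediate(setImmediate(() => {}));
```
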
David Sherret
0e641632c3
fix(check): expose more globals from @types/node (#26603)
Extracted out of https://github.com/denoland/deno/pull/26558

Closes https://github.com/denoland/deno/issues/26578
2024-10-28 17:43:41 -04:00
snek
4e38fbd0a3
fix: report exceptions from nextTick (#26579)
Fixes: https://github.com/denoland/deno/issues/24713
Fixes: https://github.com/denoland/deno/issues/25855
2024-10-28 18:16:43 +01:00
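
A sketch of the behavior change (assuming an `uncaughtException` handler is available to observe it; without one, the error should surface and terminate the process rather than vanish):

```ts
import process from "node:process";

process.on("uncaughtException", (err) => {
  console.log("reported:", err.message); // previously the throw was swallowed
});

process.nextTick(() => {
  throw new Error("boom from nextTick");
});
```
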
David Sherret
f61af864df
fix(compile): regression handling redirects (#26586)
Closes https://github.com/denoland/deno/issues/26583
2024-10-28 09:31:58 -04:00
2066 changed files with 16995 additions and 5991 deletions

View file

@ -65,10 +65,14 @@
"tests/wpt/runner/expectation.json",
"tests/wpt/runner/manifest.json",
"tests/wpt/suite",
"third_party"
"third_party",
"tests/specs/run/shebang_with_json_imports_tsc",
"tests/specs/run/shebang_with_json_imports_swc",
"tests/specs/run/ext_flag_takes_precedence_over_extension",
"tests/specs/run/error_syntax_empty_trailing_line/error_syntax_empty_trailing_line.mjs"
],
"plugins": [
"https://plugins.dprint.dev/typescript-0.93.0.wasm",
"https://plugins.dprint.dev/typescript-0.93.2.wasm",
"https://plugins.dprint.dev/json-0.19.4.wasm",
"https://plugins.dprint.dev/markdown-0.17.8.wasm",
"https://plugins.dprint.dev/toml-0.6.3.wasm",

View file

@ -10,7 +10,7 @@ concurrency:
jobs:
build:
name: cargo publish
runs-on: ubuntu-20.04-xl
runs-on: ubuntu-24.04-xl
timeout-minutes: 90
env:

View file

@ -5,15 +5,16 @@ import { stringify } from "jsr:@std/yaml@^0.221/stringify";
// Bump this number when you want to purge the cache.
// Note: the tools/release/01_bump_crate_versions.ts script will update this version
// automatically via regex, so ensure that this line maintains this format.
const cacheVersion = 22;
const cacheVersion = 25;
const ubuntuX86Runner = "ubuntu-22.04";
const ubuntuX86XlRunner = "ubuntu-22.04-xl";
const ubuntuX86Runner = "ubuntu-24.04";
const ubuntuX86XlRunner = "ubuntu-24.04-xl";
const ubuntuARMRunner = "ubicloud-standard-16-arm";
const windowsX86Runner = "windows-2022";
const windowsX86XlRunner = "windows-2022-xl";
const macosX86Runner = "macos-13";
const macosArmRunner = "macos-14";
const selfHostedMacosArmRunner = "self-hosted";
const Runners = {
linuxX86: {
@ -40,7 +41,8 @@ const Runners = {
macosArm: {
os: "macos",
arch: "aarch64",
runner: macosArmRunner,
runner:
`\${{ github.repository == 'denoland/deno' && startsWith(github.ref, 'refs/tags/') && '${selfHostedMacosArmRunner}' || '${macosArmRunner}' }}`,
},
windowsX86: {
os: "windows",
@ -59,7 +61,7 @@ const prCacheKeyPrefix =
`${cacheVersion}-cargo-target-\${{ matrix.os }}-\${{ matrix.arch }}-\${{ matrix.profile }}-\${{ matrix.job }}-`;
// Note that you may need to add more version to the `apt-get remove` line below if you change this
const llvmVersion = 18;
const llvmVersion = 19;
const installPkgsCommand =
`sudo apt-get install --no-install-recommends clang-${llvmVersion} lld-${llvmVersion} clang-tools-${llvmVersion} clang-format-${llvmVersion} clang-tidy-${llvmVersion}`;
const sysRootStep = {
@ -71,7 +73,7 @@ export DEBIAN_FRONTEND=noninteractive
sudo apt-get -qq remove --purge -y man-db > /dev/null 2> /dev/null
# Remove older clang before we install
sudo apt-get -qq remove \
'clang-12*' 'clang-13*' 'clang-14*' 'clang-15*' 'clang-16*' 'llvm-12*' 'llvm-13*' 'llvm-14*' 'llvm-15*' 'llvm-16*' 'lld-12*' 'lld-13*' 'lld-14*' 'lld-15*' 'lld-16*' > /dev/null 2> /dev/null
'clang-12*' 'clang-13*' 'clang-14*' 'clang-15*' 'clang-16*' 'clang-17*' 'clang-18*' 'llvm-12*' 'llvm-13*' 'llvm-14*' 'llvm-15*' 'llvm-16*' 'lld-12*' 'lld-13*' 'lld-14*' 'lld-15*' 'lld-16*' 'lld-17*' 'lld-18*' > /dev/null 2> /dev/null
# Install clang-XXX, lld-XXX, and debootstrap.
echo "deb http://apt.llvm.org/jammy/ llvm-toolchain-jammy-${llvmVersion} main" |
@ -86,7 +88,7 @@ ${installPkgsCommand} || echo 'Failed. Trying again.' && sudo apt-get clean && s
(yes '' | sudo update-alternatives --force --all) > /dev/null 2> /dev/null || true
echo "Decompressing sysroot..."
wget -q https://github.com/denoland/deno_sysroot_build/releases/download/sysroot-20240528/sysroot-\`uname -m\`.tar.xz -O /tmp/sysroot.tar.xz
wget -q https://github.com/denoland/deno_sysroot_build/releases/download/sysroot-20241030/sysroot-\`uname -m\`.tar.xz -O /tmp/sysroot.tar.xz
cd /
xzcat /tmp/sysroot.tar.xz | sudo tar -x
sudo mount --rbind /dev /sysroot/dev

View file

@ -62,18 +62,18 @@ jobs:
profile: debug
- os: macos
arch: x86_64
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-22.04'' || ''macos-13'' }}'
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-24.04'' || ''macos-13'' }}'
job: test
profile: release
skip: '${{ !contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'') }}'
- os: macos
arch: aarch64
runner: macos-14
runner: '${{ github.repository == ''denoland/deno'' && startsWith(github.ref, ''refs/tags/'') && ''self-hosted'' || ''macos-14'' }}'
job: test
profile: debug
- os: macos
arch: aarch64
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-22.04'' || ''macos-14'' }}'
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-24.04'' || github.repository == ''denoland/deno'' && startsWith(github.ref, ''refs/tags/'') && ''self-hosted'' || ''macos-14'' }}'
job: test
profile: release
skip: '${{ !contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'') }}'
@ -84,33 +84,33 @@ jobs:
profile: debug
- os: windows
arch: x86_64
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-22.04'' || github.repository == ''denoland/deno'' && ''windows-2022-xl'' || ''windows-2022'' }}'
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'')) && ''ubuntu-24.04'' || github.repository == ''denoland/deno'' && ''windows-2022-xl'' || ''windows-2022'' }}'
job: test
profile: release
skip: '${{ !contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'') }}'
- os: linux
arch: x86_64
runner: '${{ github.repository == ''denoland/deno'' && ''ubuntu-22.04-xl'' || ''ubuntu-22.04'' }}'
runner: '${{ github.repository == ''denoland/deno'' && ''ubuntu-24.04-xl'' || ''ubuntu-24.04'' }}'
job: test
profile: release
use_sysroot: true
wpt: '${{ !startsWith(github.ref, ''refs/tags/'') }}'
- os: linux
arch: x86_64
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'' && !contains(github.event.pull_request.labels.*.name, ''ci-bench''))) && ''ubuntu-22.04'' || github.repository == ''denoland/deno'' && ''ubuntu-22.04-xl'' || ''ubuntu-22.04'' }}'
runner: '${{ (!contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'' && !contains(github.event.pull_request.labels.*.name, ''ci-bench''))) && ''ubuntu-24.04'' || github.repository == ''denoland/deno'' && ''ubuntu-24.04-xl'' || ''ubuntu-24.04'' }}'
job: bench
profile: release
use_sysroot: true
skip: '${{ !contains(github.event.pull_request.labels.*.name, ''ci-full'') && (github.event_name == ''pull_request'' && !contains(github.event.pull_request.labels.*.name, ''ci-bench'')) }}'
- os: linux
arch: x86_64
runner: ubuntu-22.04
runner: ubuntu-24.04
job: test
profile: debug
use_sysroot: true
- os: linux
arch: x86_64
runner: ubuntu-22.04
runner: ubuntu-24.04
job: lint
profile: debug
- os: linux
@ -252,22 +252,22 @@ jobs:
# to complete.
sudo apt-get -qq remove --purge -y man-db > /dev/null 2> /dev/null
# Remove older clang before we install
sudo apt-get -qq remove 'clang-12*' 'clang-13*' 'clang-14*' 'clang-15*' 'clang-16*' 'llvm-12*' 'llvm-13*' 'llvm-14*' 'llvm-15*' 'llvm-16*' 'lld-12*' 'lld-13*' 'lld-14*' 'lld-15*' 'lld-16*' > /dev/null 2> /dev/null
sudo apt-get -qq remove 'clang-12*' 'clang-13*' 'clang-14*' 'clang-15*' 'clang-16*' 'clang-17*' 'clang-18*' 'llvm-12*' 'llvm-13*' 'llvm-14*' 'llvm-15*' 'llvm-16*' 'lld-12*' 'lld-13*' 'lld-14*' 'lld-15*' 'lld-16*' 'lld-17*' 'lld-18*' > /dev/null 2> /dev/null
# Install clang-XXX, lld-XXX, and debootstrap.
echo "deb http://apt.llvm.org/jammy/ llvm-toolchain-jammy-18 main" |
sudo dd of=/etc/apt/sources.list.d/llvm-toolchain-jammy-18.list
echo "deb http://apt.llvm.org/jammy/ llvm-toolchain-jammy-19 main" |
sudo dd of=/etc/apt/sources.list.d/llvm-toolchain-jammy-19.list
curl https://apt.llvm.org/llvm-snapshot.gpg.key |
gpg --dearmor |
sudo dd of=/etc/apt/trusted.gpg.d/llvm-snapshot.gpg
sudo apt-get update
# this was unreliable sometimes, so try again if it fails
sudo apt-get install --no-install-recommends clang-18 lld-18 clang-tools-18 clang-format-18 clang-tidy-18 || echo 'Failed. Trying again.' && sudo apt-get clean && sudo apt-get update && sudo apt-get install --no-install-recommends clang-18 lld-18 clang-tools-18 clang-format-18 clang-tidy-18
sudo apt-get install --no-install-recommends clang-19 lld-19 clang-tools-19 clang-format-19 clang-tidy-19 || echo 'Failed. Trying again.' && sudo apt-get clean && sudo apt-get update && sudo apt-get install --no-install-recommends clang-19 lld-19 clang-tools-19 clang-format-19 clang-tidy-19
# Fix alternatives
(yes '' | sudo update-alternatives --force --all) > /dev/null 2> /dev/null || true
echo "Decompressing sysroot..."
wget -q https://github.com/denoland/deno_sysroot_build/releases/download/sysroot-20240528/sysroot-`uname -m`.tar.xz -O /tmp/sysroot.tar.xz
wget -q https://github.com/denoland/deno_sysroot_build/releases/download/sysroot-20241030/sysroot-`uname -m`.tar.xz -O /tmp/sysroot.tar.xz
cd /
xzcat /tmp/sysroot.tar.xz | sudo tar -x
sudo mount --rbind /dev /sysroot/dev
@ -299,8 +299,8 @@ jobs:
CARGO_PROFILE_RELEASE_LTO=false
RUSTFLAGS<<__1
-C linker-plugin-lto=true
-C linker=clang-18
-C link-arg=-fuse-ld=lld-18
-C linker=clang-19
-C link-arg=-fuse-ld=lld-19
-C link-arg=-ldl
-C link-arg=-Wl,--allow-shlib-undefined
-C link-arg=-Wl,--thinlto-cache-dir=$(pwd)/target/release/lto-cache
@ -310,8 +310,8 @@ jobs:
__1
RUSTDOCFLAGS<<__1
-C linker-plugin-lto=true
-C linker=clang-18
-C link-arg=-fuse-ld=lld-18
-C linker=clang-19
-C link-arg=-fuse-ld=lld-19
-C link-arg=-ldl
-C link-arg=-Wl,--allow-shlib-undefined
-C link-arg=-Wl,--thinlto-cache-dir=$(pwd)/target/release/lto-cache
@ -319,7 +319,7 @@ jobs:
--cfg tokio_unstable
$RUSTFLAGS
__1
CC=/usr/bin/clang-18
CC=/usr/bin/clang-19
CFLAGS=-flto=thin $CFLAGS
" > $GITHUB_ENV
- name: Remove macOS cURL --ipv4 flag
@ -361,8 +361,8 @@ jobs:
path: |-
~/.cargo/registry/index
~/.cargo/registry/cache
key: '22-cargo-home-${{ matrix.os }}-${{ matrix.arch }}-${{ hashFiles(''Cargo.lock'') }}'
restore-keys: '22-cargo-home-${{ matrix.os }}-${{ matrix.arch }}'
key: '25-cargo-home-${{ matrix.os }}-${{ matrix.arch }}-${{ hashFiles(''Cargo.lock'') }}'
restore-keys: '25-cargo-home-${{ matrix.os }}-${{ matrix.arch }}'
if: '!(matrix.skip)'
- name: Restore cache build output (PR)
uses: actions/cache/restore@v4
@ -375,7 +375,7 @@ jobs:
!./target/*/*.zip
!./target/*/*.tar.gz
key: never_saved
restore-keys: '22-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-'
restore-keys: '25-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-'
- name: Apply and update mtime cache
if: '!(matrix.skip) && (!startsWith(github.ref, ''refs/tags/''))'
uses: ./.github/mtime_cache
@ -685,10 +685,10 @@ jobs:
!./target/*/*.zip
!./target/*/*.sha256sum
!./target/*/*.tar.gz
key: '22-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-${{ github.sha }}'
key: '25-cargo-target-${{ matrix.os }}-${{ matrix.arch }}-${{ matrix.profile }}-${{ matrix.job }}-${{ github.sha }}'
publish-canary:
name: publish canary
runs-on: ubuntu-22.04
runs-on: ubuntu-24.04
needs:
- build
if: github.repository == 'denoland/deno' && github.ref == 'refs/heads/main'

View file

@ -7,7 +7,7 @@ on:
jobs:
update-dl-version:
name: update dl.deno.land version
runs-on: ubuntu-22.04
runs-on: ubuntu-24.04
if: github.repository == 'denoland/deno'
steps:
- name: Authenticate with Google Cloud

View file

@ -16,7 +16,7 @@ on:
jobs:
build:
name: start release
runs-on: ubuntu-22.04
runs-on: ubuntu-24.04
timeout-minutes: 30
env:

View file

@ -16,7 +16,7 @@ on:
jobs:
build:
name: version bump
runs-on: ubuntu-22.04
runs-on: ubuntu-24.04
timeout-minutes: 90
env:

View file

@ -20,7 +20,7 @@ jobs:
fail-fast: false
matrix:
deno-version: [v1.x, canary]
os: [ubuntu-22.04-xl]
os: [ubuntu-24.04-xl]
steps:
- name: Clone repository

Cargo.lock (generated)
View file

@ -765,6 +765,8 @@ dependencies = [
"fastwebsockets",
"file_test_runner",
"flaky_test",
"hickory-client",
"hickory-server",
"http 1.1.0",
"http-body-util",
"hyper 1.4.1",
@ -778,8 +780,6 @@ dependencies = [
"serde",
"test_server",
"tokio",
"trust-dns-client",
"trust-dns-server",
"url",
"uuid",
"zeromq",
@ -1154,7 +1154,7 @@ dependencies = [
[[package]]
name = "deno"
version = "2.0.3"
version = "2.0.6"
dependencies = [
"anstream",
"async-trait",
@ -1279,9 +1279,9 @@ dependencies = [
[[package]]
name = "deno_ast"
version = "0.42.2"
version = "0.43.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b2b9d03b1bbeeecdac54367f075d572131736d06c5be3bc49037855bc5ab1bbb"
checksum = "48d00b724e06d2081a141ec1155756a0b465d413d8e2a7515221f61d482eb2ee"
dependencies = [
"base64 0.21.7",
"deno_media_type",
@ -1323,7 +1323,7 @@ dependencies = [
[[package]]
name = "deno_bench_util"
version = "0.168.0"
version = "0.171.0"
dependencies = [
"bencher",
"deno_core",
@ -1332,7 +1332,7 @@ dependencies = [
[[package]]
name = "deno_broadcast_channel"
version = "0.168.0"
version = "0.171.0"
dependencies = [
"async-trait",
"deno_core",
@ -1343,7 +1343,7 @@ dependencies = [
[[package]]
name = "deno_cache"
version = "0.106.0"
version = "0.109.0"
dependencies = [
"async-trait",
"deno_core",
@ -1356,9 +1356,9 @@ dependencies = [
[[package]]
name = "deno_cache_dir"
version = "0.13.1"
version = "0.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "693ca429aebf945de5fef30df232044f9f80be4cc5a5e7c8d767226c43880f5a"
checksum = "08c1f52170cd7715f8006da54cde1444863a0d6fbd9c11d037a737db2dec8e22"
dependencies = [
"base32",
"deno_media_type",
@ -1376,7 +1376,7 @@ dependencies = [
[[package]]
name = "deno_canvas"
version = "0.43.0"
version = "0.46.0"
dependencies = [
"deno_core",
"deno_webgpu",
@ -1387,9 +1387,9 @@ dependencies = [
[[package]]
name = "deno_config"
version = "0.37.2"
version = "0.38.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5900bfb37538d83b19ba0b157cdc785770e38422ee4632411e3bd3d90ac0f537"
checksum = "966825073480a6ac7e01977a3879d13edc8d6ea2d65ea164b37156a5fb206e9a"
dependencies = [
"anyhow",
"deno_package_json",
@ -1411,16 +1411,16 @@ dependencies = [
[[package]]
name = "deno_console"
version = "0.174.0"
version = "0.177.0"
dependencies = [
"deno_core",
]
[[package]]
name = "deno_core"
version = "0.314.2"
version = "0.318.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "83138917579676069b423c3eb9be3c1e579f60dc022d85f6ded4c792456255ff"
checksum = "10cae2393219ff9278123f7b24799cdfab37c7d6561b69ca06ced115cac92111"
dependencies = [
"anyhow",
"bincode",
@ -1456,7 +1456,7 @@ checksum = "a13951ea98c0a4c372f162d669193b4c9d991512de9f2381dd161027f34b26b1"
[[package]]
name = "deno_cron"
version = "0.54.0"
version = "0.57.0"
dependencies = [
"anyhow",
"async-trait",
@ -1469,7 +1469,7 @@ dependencies = [
[[package]]
name = "deno_crypto"
version = "0.188.0"
version = "0.191.0"
dependencies = [
"aes",
"aes-gcm",
@ -1506,9 +1506,9 @@ dependencies = [
[[package]]
name = "deno_doc"
version = "0.154.0"
version = "0.156.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "17e204e45b0d79750880114e37b34abe19ad0710d8435a8da8f23a528fe98de4"
checksum = "2585b98d6ad76dae30bf2d7b6d71b8363cae041158b8780d14a2f4fe17590a61"
dependencies = [
"anyhow",
"cfg-if",
@ -1531,7 +1531,7 @@ dependencies = [
[[package]]
name = "deno_fetch"
version = "0.198.0"
version = "0.201.0"
dependencies = [
"base64 0.21.7",
"bytes",
@ -1564,7 +1564,7 @@ dependencies = [
[[package]]
name = "deno_ffi"
version = "0.161.0"
version = "0.164.0"
dependencies = [
"deno_core",
"deno_permissions",
@ -1584,7 +1584,7 @@ dependencies = [
[[package]]
name = "deno_fs"
version = "0.84.0"
version = "0.87.0"
dependencies = [
"async-trait",
"base32",
@ -1606,9 +1606,9 @@ dependencies = [
[[package]]
name = "deno_graph"
version = "0.83.4"
version = "0.84.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5bd20bc0780071989c622cbfd5d4fb2e4fd05a247ccd7f791f13c8d2c3792228"
checksum = "cd4f4a14aa069087be41c2998077b0453f0191747898f96e6343f700abfc2c18"
dependencies = [
"anyhow",
"async-trait",
@ -1635,7 +1635,7 @@ dependencies = [
[[package]]
name = "deno_http"
version = "0.172.0"
version = "0.175.0"
dependencies = [
"async-compression",
"async-trait",
@ -1674,7 +1674,7 @@ dependencies = [
[[package]]
name = "deno_io"
version = "0.84.0"
version = "0.87.0"
dependencies = [
"async-trait",
"deno_core",
@ -1695,7 +1695,7 @@ dependencies = [
[[package]]
name = "deno_kv"
version = "0.82.0"
version = "0.85.0"
dependencies = [
"anyhow",
"async-trait",
@ -1726,9 +1726,9 @@ dependencies = [
[[package]]
name = "deno_lint"
version = "0.67.0"
version = "0.68.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "871b60e32bfb6c110cbb9b0688dbf048f81e5d347fe4ce5a42239263de9dd938"
checksum = "bb994e6d1b18223df0a756c7948143b35682941d615edffef60d5b38822f38ac"
dependencies = [
"anyhow",
"deno_ast",
@ -1756,9 +1756,9 @@ dependencies = [
[[package]]
name = "deno_media_type"
version = "0.1.4"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a8978229b82552bf8457a0125aa20863f023619cfc21ebb007b1e571d68fd85b"
checksum = "7fcf552fbdedbe81c89705349d7d2485c7051382b000dfddbdbf7fc25931cf83"
dependencies = [
"data-url",
"serde",
@ -1767,7 +1767,7 @@ dependencies = [
[[package]]
name = "deno_napi"
version = "0.105.0"
version = "0.108.0"
dependencies = [
"deno_core",
"deno_permissions",
@ -1795,24 +1795,24 @@ dependencies = [
[[package]]
name = "deno_net"
version = "0.166.0"
version = "0.169.0"
dependencies = [
"deno_core",
"deno_permissions",
"deno_tls",
"hickory-proto",
"hickory-resolver",
"pin-project",
"rustls-tokio-stream",
"serde",
"socket2",
"thiserror",
"tokio",
"trust-dns-proto",
"trust-dns-resolver",
]
[[package]]
name = "deno_node"
version = "0.111.0"
version = "0.114.0"
dependencies = [
"aead-gcm-stream",
"aes",
@ -1921,9 +1921,9 @@ dependencies = [
[[package]]
name = "deno_ops"
version = "0.190.1"
version = "0.194.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26f46d4e4f52f26c882b74a9b58810ea75252b807cf0966166ec333077cdfd85"
checksum = "f760b492bd638c1dc3e992d11672c259fbe9a233162099a8347591c9e22d0391"
dependencies = [
"proc-macro-rules",
"proc-macro2",
@ -1961,7 +1961,7 @@ dependencies = [
[[package]]
name = "deno_permissions"
version = "0.34.0"
version = "0.37.0"
dependencies = [
"deno_core",
"deno_path_util",
@ -1972,13 +1972,14 @@ dependencies = [
"once_cell",
"percent-encoding",
"serde",
"thiserror",
"which 4.4.2",
"winapi",
]
[[package]]
name = "deno_resolver"
version = "0.6.0"
version = "0.9.0"
dependencies = [
"anyhow",
"base32",
@ -1994,7 +1995,7 @@ dependencies = [
[[package]]
name = "deno_runtime"
version = "0.183.0"
version = "0.186.0"
dependencies = [
"color-print",
"deno_ast",
@ -2112,7 +2113,7 @@ dependencies = [
[[package]]
name = "deno_tls"
version = "0.161.0"
version = "0.164.0"
dependencies = [
"deno_core",
"deno_native_certs",
@ -2161,7 +2162,7 @@ dependencies = [
[[package]]
name = "deno_url"
version = "0.174.0"
version = "0.177.0"
dependencies = [
"deno_bench_util",
"deno_console",
@ -2173,7 +2174,7 @@ dependencies = [
[[package]]
name = "deno_web"
version = "0.205.0"
version = "0.208.0"
dependencies = [
"async-trait",
"base64-simd 0.8.0",
@ -2195,7 +2196,7 @@ dependencies = [
[[package]]
name = "deno_webgpu"
version = "0.141.0"
version = "0.144.0"
dependencies = [
"deno_core",
"raw-window-handle",
@ -2208,7 +2209,7 @@ dependencies = [
[[package]]
name = "deno_webidl"
version = "0.174.0"
version = "0.177.0"
dependencies = [
"deno_bench_util",
"deno_core",
@ -2216,7 +2217,7 @@ dependencies = [
[[package]]
name = "deno_websocket"
version = "0.179.0"
version = "0.182.0"
dependencies = [
"bytes",
"deno_core",
@ -2238,7 +2239,7 @@ dependencies = [
[[package]]
name = "deno_webstorage"
version = "0.169.0"
version = "0.172.0"
dependencies = [
"deno_core",
"deno_web",
@ -2608,9 +2609,9 @@ dependencies = [
[[package]]
name = "dprint-plugin-typescript"
version = "0.93.0"
version = "0.93.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e9308d98b923b7c0335c2ee1560199e3f2321b1be82803107b4ba4ed5dac46cc"
checksum = "3ff29fd136541e59d51946f0d2d353fefc886776f61a799ebfb5838b06cef13b"
dependencies = [
"anyhow",
"deno_ast",
@ -2638,15 +2639,6 @@ dependencies = [
"text_lines",
]
[[package]]
name = "drain"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d105028bd2b5dfcb33318fd79a445001ead36004dd8dffef1bdd7e493d8bc1e"
dependencies = [
"tokio",
]
[[package]]
name = "dsa"
version = "0.6.3"
@ -3544,6 +3536,92 @@ version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dfa686283ad6dd069f105e5ab091b04c62850d3e4cf5d67debad1933f55023df"
[[package]]
name = "hickory-client"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bab9683b08d8f8957a857b0236455d80e1886eaa8c6178af556aa7871fb61b55"
dependencies = [
"cfg-if",
"data-encoding",
"futures-channel",
"futures-util",
"hickory-proto",
"once_cell",
"radix_trie",
"rand",
"thiserror",
"tokio",
"tracing",
]
[[package]]
name = "hickory-proto"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "07698b8420e2f0d6447a436ba999ec85d8fbf2a398bbd737b82cac4a2e96e512"
dependencies = [
"async-trait",
"cfg-if",
"data-encoding",
"enum-as-inner",
"futures-channel",
"futures-io",
"futures-util",
"idna 0.4.0",
"ipnet",
"once_cell",
"rand",
"serde",
"thiserror",
"tinyvec",
"tokio",
"tracing",
"url",
]
[[package]]
name = "hickory-resolver"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "28757f23aa75c98f254cf0405e6d8c25b831b32921b050a66692427679b1f243"
dependencies = [
"cfg-if",
"futures-util",
"hickory-proto",
"ipconfig",
"lru-cache",
"once_cell",
"parking_lot",
"rand",
"resolv-conf",
"serde",
"smallvec",
"thiserror",
"tokio",
"tracing",
]
[[package]]
name = "hickory-server"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9be0e43c556b9b3fdb6c7c71a9a32153a2275d02419e3de809e520bfcfe40c37"
dependencies = [
"async-trait",
"bytes",
"cfg-if",
"enum-as-inner",
"futures-util",
"hickory-proto",
"serde",
"thiserror",
"time",
"tokio",
"tokio-util",
"tracing",
]
[[package]]
name = "hkdf"
version = "0.12.4"
@ -4194,9 +4272,9 @@ dependencies = [
[[package]]
name = "libsui"
version = "0.4.0"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "205eca4e7beaad637dcd38fe41292065894ee7f498077cf3c135d5f7252b9f27"
checksum = "89795977654ad6250d6c0915411b622bac22f9efb4f852af94b2e00964cab832"
dependencies = [
"editpe",
"libc",
@ -4311,9 +4389,9 @@ dependencies = [
[[package]]
name = "markup_fmt"
version = "0.14.0"
version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f15d7b24ae4ea9b87279bc0696462a4fb6c2168847f2cc162a2da05fe1a0f61"
checksum = "ebae65c91eab3d42231232bf48107f351e5a8d511454927218c53aeb68bbdb6f"
dependencies = [
"aho-corasick",
"css_dataset",
@ -4483,7 +4561,7 @@ dependencies = [
[[package]]
name = "napi_sym"
version = "0.104.0"
version = "0.107.0"
dependencies = [
"quote",
"serde",
@ -4538,7 +4616,7 @@ dependencies = [
[[package]]
name = "node_resolver"
version = "0.13.0"
version = "0.16.0"
dependencies = [
"anyhow",
"async-trait",
@ -6146,15 +6224,6 @@ dependencies = [
"syn 2.0.72",
]
[[package]]
name = "serde_spanned"
version = "0.6.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "79e674e01f999af37c49f70a6ede167a8a60b2503e56c5599532a65baa5969a0"
dependencies = [
"serde",
]
[[package]]
name = "serde_urlencoded"
version = "0.7.1"
@ -6169,9 +6238,9 @@ dependencies = [
[[package]]
name = "serde_v8"
version = "0.223.1"
version = "0.227.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9cf3d859dda87ee96423c01244f10af864fa6d6a9fcdc2b77e0595078ea0ea11"
checksum = "0a8294c2223c53bed343be8b80564ece4dc0d03b643b06fa86c4ccc0e064eda0"
dependencies = [
"num-bigint",
"serde",
@ -7121,6 +7190,7 @@ dependencies = [
"console_static_text",
"deno_unsync",
"denokv_proto",
"faster-hex",
"fastwebsockets",
"flate2",
"futures",
@ -7368,40 +7438,6 @@ dependencies = [
"serde",
]
[[package]]
name = "toml"
version = "0.7.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dd79e69d3b627db300ff956027cc6c3798cef26d22526befdfcd12feeb6d2257"
dependencies = [
"serde",
"serde_spanned",
"toml_datetime",
"toml_edit",
]
[[package]]
name = "toml_datetime"
version = "0.6.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4badfd56924ae69bcc9039335b2e017639ce3f9b001c393c1b2d1ef846ce2cbf"
dependencies = [
"serde",
]
[[package]]
name = "toml_edit"
version = "0.19.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b5bb770da30e5cbfde35a2d7b9b8a2c4b8ef89548a7a6aeab5c9a576e3e7421"
dependencies = [
"indexmap",
"serde",
"serde_spanned",
"toml_datetime",
"winnow 0.5.40",
]
[[package]]
name = "tower"
version = "0.4.13"
@ -7491,95 +7527,6 @@ dependencies = [
"stable_deref_trait",
]
[[package]]
name = "trust-dns-client"
version = "0.23.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "14135e72c7e6d4c9b6902d4437881a8598f0145dbb2e3f86f92dbad845b61e63"
dependencies = [
"cfg-if",
"data-encoding",
"futures-channel",
"futures-util",
"once_cell",
"radix_trie",
"rand",
"thiserror",
"tokio",
"tracing",
"trust-dns-proto",
]
[[package]]
name = "trust-dns-proto"
version = "0.23.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3119112651c157f4488931a01e586aa459736e9d6046d3bd9105ffb69352d374"
dependencies = [
"async-trait",
"cfg-if",
"data-encoding",
"enum-as-inner",
"futures-channel",
"futures-io",
"futures-util",
"idna 0.4.0",
"ipnet",
"once_cell",
"rand",
"serde",
"smallvec",
"thiserror",
"tinyvec",
"tokio",
"tracing",
"url",
]
[[package]]
name = "trust-dns-resolver"
version = "0.23.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "10a3e6c3aff1718b3c73e395d1f35202ba2ffa847c6a62eea0db8fb4cfe30be6"
dependencies = [
"cfg-if",
"futures-util",
"ipconfig",
"lru-cache",
"once_cell",
"parking_lot",
"rand",
"resolv-conf",
"serde",
"smallvec",
"thiserror",
"tokio",
"tracing",
"trust-dns-proto",
]
[[package]]
name = "trust-dns-server"
version = "0.23.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c540f73c2b2ec2f6c54eabd0900e7aafb747a820224b742f556e8faabb461bc7"
dependencies = [
"async-trait",
"bytes",
"cfg-if",
"drain",
"enum-as-inner",
"futures-executor",
"futures-util",
"serde",
"thiserror",
"time",
"tokio",
"toml 0.7.8",
"tracing",
"trust-dns-proto",
]
[[package]]
name = "try-lock"
version = "0.2.5"
@ -8329,15 +8276,6 @@ version = "0.52.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32b752e52a2da0ddfbdbcc6fceadfeede4c939ed16d13e648833a61dfb611ed8"
[[package]]
name = "winnow"
version = "0.5.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f593a95398737aeed53e489c785df13f3618e41dbcd6718c6addbf1395aa6876"
dependencies = [
"memchr",
]
[[package]]
name = "winnow"
version = "0.6.15"
@ -8373,7 +8311,7 @@ version = "0.1.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b68db261ef59e9e52806f688020631e987592bd83619edccda9c47d42cde4f6c"
dependencies = [
"toml 0.5.11",
"toml",
]
[[package]]
@ -8450,7 +8388,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a6a39b6b5ba0d02c910d05d7fbc366a4befb8901ea107dcde9c1c97acb8a366"
dependencies = [
"rowan",
"winnow 0.6.15",
"winnow",
]
[[package]]
@ -8547,9 +8485,9 @@ dependencies = [
[[package]]
name = "zeromq"
version = "0.4.0"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fb0560d00172817b7f7c2265060783519c475702ae290b154115ca75e976d4d0"
checksum = "6a4528179201f6eecf211961a7d3276faa61554c82651ecc66387f68fc3004bd"
dependencies = [
"async-trait",
"asynchronous-codec",

View file

@ -45,19 +45,19 @@ license = "MIT"
repository = "https://github.com/denoland/deno"
[workspace.dependencies]
deno_ast = { version = "=0.42.2", features = ["transpiling"] }
deno_core = { version = "0.314.2" }
deno_ast = { version = "=0.43.3", features = ["transpiling"] }
deno_core = { version = "0.318.0" }
deno_bench_util = { version = "0.168.0", path = "./bench_util" }
deno_bench_util = { version = "0.171.0", path = "./bench_util" }
deno_lockfile = "=0.23.1"
deno_media_type = { version = "0.1.4", features = ["module_specifier"] }
deno_media_type = { version = "0.2.0", features = ["module_specifier"] }
deno_npm = "=0.25.4"
deno_path_util = "=0.2.1"
deno_permissions = { version = "0.34.0", path = "./runtime/permissions" }
deno_runtime = { version = "0.183.0", path = "./runtime" }
deno_permissions = { version = "0.37.0", path = "./runtime/permissions" }
deno_runtime = { version = "0.186.0", path = "./runtime" }
deno_semver = "=0.5.16"
deno_terminal = "0.2.0"
napi_sym = { version = "0.104.0", path = "./ext/napi/sym" }
napi_sym = { version = "0.107.0", path = "./ext/napi/sym" }
test_util = { package = "test_server", path = "./tests/util/server" }
denokv_proto = "0.8.1"
@ -66,32 +66,32 @@ denokv_remote = "0.8.1"
denokv_sqlite = { default-features = false, version = "0.8.2" }
# exts
deno_broadcast_channel = { version = "0.168.0", path = "./ext/broadcast_channel" }
deno_cache = { version = "0.106.0", path = "./ext/cache" }
deno_canvas = { version = "0.43.0", path = "./ext/canvas" }
deno_console = { version = "0.174.0", path = "./ext/console" }
deno_cron = { version = "0.54.0", path = "./ext/cron" }
deno_crypto = { version = "0.188.0", path = "./ext/crypto" }
deno_fetch = { version = "0.198.0", path = "./ext/fetch" }
deno_ffi = { version = "0.161.0", path = "./ext/ffi" }
deno_fs = { version = "0.84.0", path = "./ext/fs" }
deno_http = { version = "0.172.0", path = "./ext/http" }
deno_io = { version = "0.84.0", path = "./ext/io" }
deno_kv = { version = "0.82.0", path = "./ext/kv" }
deno_napi = { version = "0.105.0", path = "./ext/napi" }
deno_net = { version = "0.166.0", path = "./ext/net" }
deno_node = { version = "0.111.0", path = "./ext/node" }
deno_tls = { version = "0.161.0", path = "./ext/tls" }
deno_url = { version = "0.174.0", path = "./ext/url" }
deno_web = { version = "0.205.0", path = "./ext/web" }
deno_webgpu = { version = "0.141.0", path = "./ext/webgpu" }
deno_webidl = { version = "0.174.0", path = "./ext/webidl" }
deno_websocket = { version = "0.179.0", path = "./ext/websocket" }
deno_webstorage = { version = "0.169.0", path = "./ext/webstorage" }
deno_broadcast_channel = { version = "0.171.0", path = "./ext/broadcast_channel" }
deno_cache = { version = "0.109.0", path = "./ext/cache" }
deno_canvas = { version = "0.46.0", path = "./ext/canvas" }
deno_console = { version = "0.177.0", path = "./ext/console" }
deno_cron = { version = "0.57.0", path = "./ext/cron" }
deno_crypto = { version = "0.191.0", path = "./ext/crypto" }
deno_fetch = { version = "0.201.0", path = "./ext/fetch" }
deno_ffi = { version = "0.164.0", path = "./ext/ffi" }
deno_fs = { version = "0.87.0", path = "./ext/fs" }
deno_http = { version = "0.175.0", path = "./ext/http" }
deno_io = { version = "0.87.0", path = "./ext/io" }
deno_kv = { version = "0.85.0", path = "./ext/kv" }
deno_napi = { version = "0.108.0", path = "./ext/napi" }
deno_net = { version = "0.169.0", path = "./ext/net" }
deno_node = { version = "0.114.0", path = "./ext/node" }
deno_tls = { version = "0.164.0", path = "./ext/tls" }
deno_url = { version = "0.177.0", path = "./ext/url" }
deno_web = { version = "0.208.0", path = "./ext/web" }
deno_webgpu = { version = "0.144.0", path = "./ext/webgpu" }
deno_webidl = { version = "0.177.0", path = "./ext/webidl" }
deno_websocket = { version = "0.182.0", path = "./ext/websocket" }
deno_webstorage = { version = "0.172.0", path = "./ext/webstorage" }
# resolvers
deno_resolver = { version = "0.6.0", path = "./resolvers/deno" }
node_resolver = { version = "0.13.0", path = "./resolvers/node" }
deno_resolver = { version = "0.9.0", path = "./resolvers/deno" }
node_resolver = { version = "0.16.0", path = "./resolvers/node" }
aes = "=0.8.3"
anyhow = "1.0.57"
@ -111,7 +111,7 @@ console_static_text = "=0.8.1"
dashmap = "5.5.3"
data-encoding = "2.3.3"
data-url = "=0.3.0"
deno_cache_dir = "=0.13.1"
deno_cache_dir = "=0.13.2"
deno_package_json = { version = "0.1.2", default-features = false }
dlopen2 = "0.6.1"
ecb = "=0.1.2"
@ -204,7 +204,7 @@ webpki-root-certs = "0.26.5"
webpki-roots = "0.26"
which = "4.2.5"
yoke = { version = "0.7.4", features = ["derive"] }
zeromq = { version = "=0.4.0", default-features = false, features = ["tcp-transport", "tokio-runtime"] }
zeromq = { version = "=0.4.1", default-features = false, features = ["tcp-transport", "tokio-runtime"] }
zstd = "=0.12.4"
# crypto

View file

@ -6,6 +6,86 @@ https://github.com/denoland/deno/releases
We also have one-line install commands at:
https://github.com/denoland/deno_install
### 2.0.6 / 2024.11.10
- feat(ext/http): abort event when request is cancelled (#26781)
- feat(ext/http): abort signal when request is cancelled (#26761)
- feat(lsp): auto-import completions from byonm dependencies (#26680)
- fix(ext/cache): don't panic when creating cache (#26780)
- fix(ext/node): better inspector support (#26471)
- fix(fmt): don't use self-closing tags in HTML (#26754)
- fix(install): cache jsr deps from all workspace config files (#26779)
- fix(node:zlib): gzip & gzipSync should accept ArrayBuffer (#26762)
- fix: performance.timeOrigin (#26787)
### 2.0.5 / 2024.11.05
- fix(add): better error message when adding package that only has pre-release
versions (#26724)
- fix(add): only add npm deps to package.json if it's at least as close as
deno.json (#26683)
- fix(cli): set `npm_config_user_agent` when running npm packages or tasks
(#26639)
- fix(coverage): exclude comment lines from coverage reports (#25939)
- fix(ext/node): add `findSourceMap` to the default export of `node:module`
(#26720)
- fix(ext/node): convert errors from `fs.readFile/fs.readFileSync` to node
format (#26632)
- fix(ext/node): resolve exports even if parent module filename isn't present
(#26553)
- fix(ext/node): return `this` from `http.Server.ref/unref()` (#26647)
- fix(fmt): do not panic for jsx ignore container followed by jsx text (#26723)
- fix(fmt): fix several HTML and components issues (#26654)
- fix(fmt): ignore file directive for YAML files (#26717)
- fix(install): handle invalid function error, and fallback to junctions
regardless of the error (#26730)
- fix(lsp): include unstable features from editor settings (#26655)
- fix(lsp): scope attribution for lazily loaded assets (#26699)
- fix(node): Implement `os.userInfo` properly, add missing `toPrimitive`
(#24702)
- fix(serve): support serve hmr (#26078)
- fix(types): missing `import` permission on `PermissionOptionsObject` (#26627)
- fix(workspace): support wildcard packages (#26568)
- fix: clamp smi in fast calls by default (#26506)
- fix: improved support for cjs and cts modules (#26558)
- fix: op_run_microtasks crash (#26718)
- fix: panic_hook hangs without procfs (#26732)
- fix: remove permission check in op_require_node_module_paths (#26645)
- fix: surface package.json location on dep parse failure (#26665)
- perf(lsp): don't walk coverage directory (#26715)
### 2.0.4 / 2024.10.29
- Revert "fix(ext/node): fix dns.lookup result ordering (#26264)" (#26621)
- Revert "fix(ext/node): use primordials in `ext/node/polyfills/https.ts`
(#26323)" (#26613)
- feat(lsp): "typescript.preferences.preferTypeOnlyAutoImports" setting (#26546)
- fix(check): expose more globals from @types/node (#26603)
- fix(check): ignore resolving `jsxImportSource` when jsx is not used in graph
(#26548)
- fix(cli): Make --watcher CLEAR_SCREEN clear scrollback buffer as well as
visible screen (#25997)
- fix(compile): regression handling redirects (#26586)
- fix(ext/napi): export dynamic symbols list for {Free,Open}BSD (#26605)
- fix(ext/node): add path to `fs.stat` and `fs.statSync` error (#26037)
- fix(ext/node): compatibility with {Free,Open}BSD (#26604)
- fix(ext/node): use primordials in
ext\node\polyfills\internal\crypto\_randomInt.ts (#26534)
- fix(install): cache json exports of JSR packages (#26552)
- fix(install): regression - do not panic when config file contains \r\n
newlines (#26547)
- fix(lsp): make missing import action fix infallible (#26539)
- fix(npm): match npm bearer token generation (#26544)
- fix(upgrade): stop running `deno lsp` processes on windows before attempting
to replace executable (#26542)
- fix(watch): don't panic on invalid file specifiers (#26577)
- fix: do not panic when failing to write to http cache (#26591)
- fix: provide hints in terminal errors for Node.js globals (#26610)
- fix: report exceptions from nextTick (#26579)
- fix: support watch flag to enable watching other files than the main module on
serve subcommand (#26622)
- perf: pass transpiled module to deno_core as known string (#26555)
### 2.0.3 / 2024.10.25
- feat(lsp): interactive inlay hints (#26382)

View file

@ -2,7 +2,7 @@
[package]
name = "deno_bench_util"
version = "0.168.0"
version = "0.171.0"
authors.workspace = true
edition.workspace = true
license.workspace = true

View file

@ -2,7 +2,7 @@
[package]
name = "deno"
version = "2.0.3"
version = "2.0.6"
authors.workspace = true
default-run = "deno"
edition.workspace = true
@ -70,11 +70,11 @@ winres.workspace = true
[dependencies]
deno_ast = { workspace = true, features = ["bundler", "cjs", "codegen", "proposal", "react", "sourcemap", "transforms", "typescript", "view", "visit"] }
deno_cache_dir = { workspace = true }
deno_config = { version = "=0.37.2", features = ["workspace", "sync"] }
deno_config = { version = "=0.38.2", features = ["workspace", "sync"] }
deno_core = { workspace = true, features = ["include_js_files_for_snapshotting"] }
deno_doc = { version = "0.154.0", default-features = false, features = ["rust", "html", "syntect"] }
deno_graph = { version = "=0.83.4" }
deno_lint = { version = "=0.67.0", features = ["docs"] }
deno_doc = { version = "0.156.0", default-features = false, features = ["rust", "html", "syntect"] }
deno_graph = { version = "=0.84.1" }
deno_lint = { version = "=0.68.0", features = ["docs"] }
deno_lockfile.workspace = true
deno_npm.workspace = true
deno_package_json.workspace = true
@ -84,7 +84,7 @@ deno_runtime = { workspace = true, features = ["include_js_files_for_snapshottin
deno_semver.workspace = true
deno_task_shell = "=0.18.1"
deno_terminal.workspace = true
libsui = "0.4.0"
libsui = "0.5.0"
node_resolver.workspace = true
anstream = "0.6.14"
@ -107,7 +107,7 @@ dotenvy = "0.15.7"
dprint-plugin-json = "=0.19.4"
dprint-plugin-jupyter = "=0.1.5"
dprint-plugin-markdown = "=0.17.8"
dprint-plugin-typescript = "=0.93.0"
dprint-plugin-typescript = "=0.93.2"
env_logger = "=0.10.0"
fancy-regex = "=0.10.0"
faster-hex.workspace = true
@ -129,7 +129,7 @@ libz-sys.workspace = true
log = { workspace = true, features = ["serde"] }
lsp-types.workspace = true
malva = "=0.11.0"
markup_fmt = "=0.14.0"
markup_fmt = "=0.15.0"
memmem.workspace = true
monch.workspace = true
notify.workspace = true


@ -1179,8 +1179,8 @@ static DENO_HELP: &str = cstr!(
<y>Dependency management:</>
<g>add</> Add dependencies
<p(245)>deno add jsr:@std/assert | deno add npm:express</>
<g>install</> Install script as an executable
<g>uninstall</> Uninstall a script previously installed with deno install
<g>install</> Installs dependencies either in the local project or globally to a bin directory
<g>uninstall</> Uninstalls a dependency or an executable script in the installation root's bin directory
<g>remove</> Remove dependencies from the configuration file
<y>Tooling:</>
@ -3388,8 +3388,7 @@ fn permission_args(app: Command, requires: Option<&'static str>) -> Command {
.value_name("IP_OR_HOSTNAME")
.help("Allow network access. Optionally specify allowed IP addresses and host names, with ports as necessary")
.value_parser(flags_net::validator)
.hide(true)
;
.hide(true);
if let Some(requires) = requires {
arg = arg.requires(requires)
}


@ -51,7 +51,7 @@ pub fn parse(paths: Vec<String>) -> clap::error::Result<Vec<String>> {
}
} else {
NetDescriptor::parse(&host_and_port).map_err(|e| {
clap::Error::raw(clap::error::ErrorKind::InvalidValue, format!("{e:?}"))
clap::Error::raw(clap::error::ErrorKind::InvalidValue, e.to_string())
})?;
out.push(host_and_port)
}


@ -46,6 +46,7 @@ pub use flags::*;
pub use lockfile::CliLockfile;
pub use lockfile::CliLockfileReadFromPathOptions;
pub use package_json::NpmInstallDepsProvider;
pub use package_json::PackageJsonDepValueParseWithLocationError;
use deno_ast::ModuleSpecifier;
use deno_core::anyhow::bail;
@ -200,6 +201,8 @@ pub fn ts_config_to_transpile_and_emit_options(
precompile_jsx_dynamic_props: None,
transform_jsx,
var_decl_imports: false,
// todo(dsherret): support verbatim_module_syntax here properly
verbatim_module_syntax: false,
},
deno_ast::EmitOptions {
inline_sources: options.inline_sources,
@ -1452,6 +1455,12 @@ impl CliOptions {
watch: Some(WatchFlagsWithPaths { hmr, .. }),
..
}) = &self.flags.subcommand
{
*hmr
} else if let DenoSubcommand::Serve(ServeFlags {
watch: Some(WatchFlagsWithPaths { hmr, .. }),
..
}) = &self.flags.subcommand
{
*hmr
} else {
@ -1595,6 +1604,15 @@ impl CliOptions {
}
pub fn use_byonm(&self) -> bool {
if matches!(
self.sub_command(),
DenoSubcommand::Install(_)
| DenoSubcommand::Add(_)
| DenoSubcommand::Remove(_)
) {
// For `deno install/add/remove` we want to force the managed resolver so it can set up `node_modules/` directory.
return false;
}
if self.node_modules_dir().ok().flatten().is_none()
&& self.maybe_node_modules_folder.is_some()
&& self
@ -1672,6 +1690,10 @@ impl CliOptions {
if let DenoSubcommand::Run(RunFlags {
watch: Some(WatchFlagsWithPaths { paths, .. }),
..
})
| DenoSubcommand::Serve(ServeFlags {
watch: Some(WatchFlagsWithPaths { paths, .. }),
..
}) = &self.flags.subcommand
{
full_paths.extend(paths.iter().map(|path| self.initial_cwd.join(path)));


@ -5,10 +5,12 @@ use std::sync::Arc;
use deno_config::workspace::Workspace;
use deno_core::serde_json;
use deno_core::url::Url;
use deno_package_json::PackageJsonDepValue;
use deno_package_json::PackageJsonDepValueParseError;
use deno_semver::npm::NpmPackageReqReference;
use deno_semver::package::PackageReq;
use thiserror::Error;
#[derive(Debug)]
pub struct InstallNpmRemotePkg {
@ -23,11 +25,20 @@ pub struct InstallNpmWorkspacePkg {
pub target_dir: PathBuf,
}
#[derive(Debug, Error, Clone)]
#[error("Failed to install '{}'\n at {}", alias, location)]
pub struct PackageJsonDepValueParseWithLocationError {
pub location: Url,
pub alias: String,
#[source]
pub source: PackageJsonDepValueParseError,
}
#[derive(Debug, Default)]
pub struct NpmInstallDepsProvider {
remote_pkgs: Vec<InstallNpmRemotePkg>,
workspace_pkgs: Vec<InstallNpmWorkspacePkg>,
pkg_json_dep_errors: Vec<PackageJsonDepValueParseError>,
pkg_json_dep_errors: Vec<PackageJsonDepValueParseWithLocationError>,
}
impl NpmInstallDepsProvider {
@ -89,7 +100,13 @@ impl NpmInstallDepsProvider {
let dep = match dep {
Ok(dep) => dep,
Err(err) => {
pkg_json_dep_errors.push(err);
pkg_json_dep_errors.push(
PackageJsonDepValueParseWithLocationError {
location: pkg_json.specifier(),
alias,
source: err,
},
);
continue;
}
};
@ -150,7 +167,9 @@ impl NpmInstallDepsProvider {
&self.workspace_pkgs
}
pub fn pkg_json_dep_errors(&self) -> &[PackageJsonDepValueParseError] {
pub fn pkg_json_dep_errors(
&self,
) -> &[PackageJsonDepValueParseWithLocationError] {
&self.pkg_json_dep_errors
}
}


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
let [total, count] = typeof Deno !== "undefined"
? Deno.args


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
let [total, count] = typeof Deno !== "undefined"
? Deno.args


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
const queueMicrotask = globalThis.queueMicrotask || process.nextTick;
let [total, count] = typeof Deno !== "undefined"


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
let [total, count] = typeof Deno !== "undefined"
? Deno.args


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
const queueMicrotask = globalThis.queueMicrotask || process.nextTick;
let [total, count] = typeof Deno !== "undefined"


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
const queueMicrotask = globalThis.queueMicrotask || process.nextTick;
let [total, count] = typeof Deno !== "undefined"


@ -1,6 +1,5 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
// deno-lint-ignore-file no-console
// deno-lint-ignore-file no-console no-process-globals
const queueMicrotask = globalThis.queueMicrotask || process.nextTick;
let [total, count] = typeof Deno !== "undefined"


@ -57,7 +57,7 @@ impl rusqlite::types::FromSql for CacheDBHash {
}
/// What should the cache do on failure?
#[derive(Default)]
#[derive(Debug, Default)]
pub enum CacheFailure {
/// Return errors if failure mode otherwise unspecified.
#[default]
@ -69,6 +69,7 @@ pub enum CacheFailure {
}
/// Configuration SQL and other parameters for a [`CacheDB`].
#[derive(Debug)]
pub struct CacheDBConfiguration {
/// SQL to run for a new database.
pub table_initializer: &'static str,
@ -98,6 +99,7 @@ impl CacheDBConfiguration {
}
}
#[derive(Debug)]
enum ConnectionState {
Connected(Connection),
Blackhole,
@ -106,7 +108,7 @@ enum ConnectionState {
/// A cache database that eagerly initializes itself off-thread, preventing initialization operations
/// from blocking the main thread.
#[derive(Clone)]
#[derive(Debug, Clone)]
pub struct CacheDB {
// TODO(mmastrac): We can probably simplify our thread-safe implementation here
conn: Arc<Mutex<OnceCell<ConnectionState>>>,

cli/cache/emit.rs

@ -10,6 +10,7 @@ use deno_core::unsync::sync::AtomicFlag;
use super::DiskCache;
/// The cache that stores previously emitted files.
#[derive(Debug)]
pub struct EmitCache {
disk_cache: DiskCache,
emit_failed_flag: AtomicFlag,
@ -91,6 +92,7 @@ impl EmitCache {
const LAST_LINE_PREFIX: &str = "\n// denoCacheMetadata=";
#[derive(Debug)]
struct EmitFileSerializer {
cli_version: &'static str,
}

cli/cache/mod.rs

@ -8,14 +8,9 @@ use crate::file_fetcher::FetchOptions;
use crate::file_fetcher::FetchPermissionsOptionRef;
use crate::file_fetcher::FileFetcher;
use crate::file_fetcher::FileOrRedirect;
use crate::npm::CliNpmResolver;
use crate::resolver::CliNodeResolver;
use crate::util::fs::atomic_write_file_with_retries;
use crate::util::fs::atomic_write_file_with_retries_and_fs;
use crate::util::fs::AtomicWriteFileFsAdapter;
use crate::util::path::specifier_has_extension;
use crate::util::text_encoding::arc_str_to_bytes;
use crate::util::text_encoding::from_utf8_lossy_owned;
use deno_ast::MediaType;
use deno_core::futures;
@ -25,7 +20,9 @@ use deno_graph::source::CacheInfo;
use deno_graph::source::LoadFuture;
use deno_graph::source::LoadResponse;
use deno_graph::source::Loader;
use deno_runtime::deno_fs;
use deno_runtime::deno_permissions::PermissionsContainer;
use node_resolver::InNpmPackageChecker;
use std::collections::HashMap;
use std::path::Path;
use std::path::PathBuf;
@ -60,7 +57,6 @@ pub use fast_check::FastCheckCache;
pub use incremental::IncrementalCache;
pub use module_info::ModuleInfoCache;
pub use node::NodeAnalysisCache;
pub use parsed_source::EsmOrCjsChecker;
pub use parsed_source::LazyGraphSourceParser;
pub use parsed_source::ParsedSourceCache;
@ -181,46 +177,40 @@ pub struct FetchCacherOptions {
pub permissions: PermissionsContainer,
/// If we're publishing for `deno publish`.
pub is_deno_publish: bool,
pub unstable_detect_cjs: bool,
}
/// A "wrapper" for the FileFetcher and DiskCache for the Deno CLI that provides
/// a concise interface to the DENO_DIR when building module graphs.
pub struct FetchCacher {
pub file_header_overrides: HashMap<ModuleSpecifier, HashMap<String, String>>,
esm_or_cjs_checker: Arc<EsmOrCjsChecker>,
file_fetcher: Arc<FileFetcher>,
fs: Arc<dyn deno_fs::FileSystem>,
global_http_cache: Arc<GlobalHttpCache>,
node_resolver: Arc<CliNodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
module_info_cache: Arc<ModuleInfoCache>,
permissions: PermissionsContainer,
is_deno_publish: bool,
unstable_detect_cjs: bool,
cache_info_enabled: bool,
}
impl FetchCacher {
pub fn new(
esm_or_cjs_checker: Arc<EsmOrCjsChecker>,
file_fetcher: Arc<FileFetcher>,
fs: Arc<dyn deno_fs::FileSystem>,
global_http_cache: Arc<GlobalHttpCache>,
node_resolver: Arc<CliNodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
module_info_cache: Arc<ModuleInfoCache>,
options: FetchCacherOptions,
) -> Self {
Self {
file_fetcher,
esm_or_cjs_checker,
fs,
global_http_cache,
node_resolver,
npm_resolver,
in_npm_pkg_checker,
module_info_cache,
file_header_overrides: options.file_header_overrides,
permissions: options.permissions,
is_deno_publish: options.is_deno_publish,
unstable_detect_cjs: options.unstable_detect_cjs,
cache_info_enabled: false,
}
}
@ -271,70 +261,23 @@ impl Loader for FetchCacher {
) -> LoadFuture {
use deno_graph::source::CacheSetting as LoaderCacheSetting;
if specifier.scheme() == "file" {
if specifier.path().contains("/node_modules/") {
// The specifier might be in a completely different symlinked tree than
// what the node_modules url is in (ex. `/my-project-1/node_modules`
// symlinked to `/my-project-2/node_modules`), so first we checked if the path
// is in a node_modules dir to avoid needlessly canonicalizing, then now compare
// against the canonicalized specifier.
let specifier =
crate::node::resolve_specifier_into_node_modules(specifier);
if self.npm_resolver.in_npm_package(&specifier) {
return Box::pin(futures::future::ready(Ok(Some(
LoadResponse::External { specifier },
))));
}
}
// make local CJS modules external to the graph
if specifier_has_extension(specifier, "cjs") {
if specifier.scheme() == "file"
&& specifier.path().contains("/node_modules/")
{
// The specifier might be in a completely different symlinked tree than
// what the node_modules url is in (ex. `/my-project-1/node_modules`
// symlinked to `/my-project-2/node_modules`), so first we checked if the path
// is in a node_modules dir to avoid needlessly canonicalizing, then now compare
// against the canonicalized specifier.
let specifier = crate::node::resolve_specifier_into_node_modules(
specifier,
self.fs.as_ref(),
);
if self.in_npm_pkg_checker.in_npm_package(&specifier) {
return Box::pin(futures::future::ready(Ok(Some(
LoadResponse::External {
specifier: specifier.clone(),
},
LoadResponse::External { specifier },
))));
}
if self.unstable_detect_cjs && specifier_has_extension(specifier, "js") {
if let Ok(Some(pkg_json)) =
self.node_resolver.get_closest_package_json(specifier)
{
if pkg_json.typ == "commonjs" {
if let Ok(path) = specifier.to_file_path() {
if let Ok(bytes) = std::fs::read(&path) {
let text: Arc<str> = from_utf8_lossy_owned(bytes).into();
let is_es_module = match self.esm_or_cjs_checker.is_esm(
specifier,
text.clone(),
MediaType::JavaScript,
) {
Ok(value) => value,
Err(err) => {
return Box::pin(futures::future::ready(Err(err.into())));
}
};
if !is_es_module {
self.node_resolver.mark_cjs_resolution(specifier.clone());
return Box::pin(futures::future::ready(Ok(Some(
LoadResponse::External {
specifier: specifier.clone(),
},
))));
} else {
return Box::pin(futures::future::ready(Ok(Some(
LoadResponse::Module {
specifier: specifier.clone(),
content: arc_str_to_bytes(text),
maybe_headers: None,
},
))));
}
}
}
}
}
}
}
if self.is_deno_publish


@ -44,18 +44,32 @@ pub static MODULE_INFO_CACHE_DB: CacheDBConfiguration = CacheDBConfiguration {
/// A cache of `deno_graph::ModuleInfo` objects. Using this leads to a considerable
/// performance improvement because when it exists we can skip parsing a module for
/// deno_graph.
#[derive(Debug)]
pub struct ModuleInfoCache {
conn: CacheDB,
parsed_source_cache: Arc<ParsedSourceCache>,
}
impl ModuleInfoCache {
#[cfg(test)]
pub fn new_in_memory(version: &'static str) -> Self {
Self::new(CacheDB::in_memory(&MODULE_INFO_CACHE_DB, version))
pub fn new_in_memory(
version: &'static str,
parsed_source_cache: Arc<ParsedSourceCache>,
) -> Self {
Self::new(
CacheDB::in_memory(&MODULE_INFO_CACHE_DB, version),
parsed_source_cache,
)
}
pub fn new(conn: CacheDB) -> Self {
Self { conn }
pub fn new(
conn: CacheDB,
parsed_source_cache: Arc<ParsedSourceCache>,
) -> Self {
Self {
conn,
parsed_source_cache,
}
}
/// Useful for testing: re-create this cache DB with a different current version.
@ -63,6 +77,7 @@ impl ModuleInfoCache {
pub(crate) fn recreate_with_version(self, version: &'static str) -> Self {
Self {
conn: self.conn.recreate_with_version(version),
parsed_source_cache: self.parsed_source_cache,
}
}
@ -113,13 +128,10 @@ impl ModuleInfoCache {
Ok(())
}
pub fn as_module_analyzer<'a>(
&'a self,
parsed_source_cache: &'a Arc<ParsedSourceCache>,
) -> ModuleInfoCacheModuleAnalyzer<'a> {
pub fn as_module_analyzer(&self) -> ModuleInfoCacheModuleAnalyzer {
ModuleInfoCacheModuleAnalyzer {
module_info_cache: self,
parsed_source_cache,
parsed_source_cache: &self.parsed_source_cache,
}
}
}
@ -129,6 +141,84 @@ pub struct ModuleInfoCacheModuleAnalyzer<'a> {
parsed_source_cache: &'a Arc<ParsedSourceCache>,
}
impl<'a> ModuleInfoCacheModuleAnalyzer<'a> {
fn load_cached_module_info(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
source_hash: CacheDBHash,
) -> Option<ModuleInfo> {
match self.module_info_cache.get_module_info(
specifier,
media_type,
source_hash,
) {
Ok(Some(info)) => Some(info),
Ok(None) => None,
Err(err) => {
log::debug!(
"Error loading module cache info for {}. {:#}",
specifier,
err
);
None
}
}
}
fn save_module_info_to_cache(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
source_hash: CacheDBHash,
module_info: &ModuleInfo,
) {
if let Err(err) = self.module_info_cache.set_module_info(
specifier,
media_type,
source_hash,
module_info,
) {
log::debug!(
"Error saving module cache info for {}. {:#}",
specifier,
err
);
}
}
pub fn analyze_sync(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
source: &Arc<str>,
) -> Result<ModuleInfo, deno_ast::ParseDiagnostic> {
// attempt to load from the cache
let source_hash = CacheDBHash::from_source(source);
if let Some(info) =
self.load_cached_module_info(specifier, media_type, source_hash)
{
return Ok(info);
}
// otherwise, get the module info from the parsed source cache
let parser = self.parsed_source_cache.as_capturing_parser();
let analyzer = ParserModuleAnalyzer::new(&parser);
let module_info =
analyzer.analyze_sync(specifier, source.clone(), media_type)?;
// then attempt to cache it
self.save_module_info_to_cache(
specifier,
media_type,
source_hash,
&module_info,
);
Ok(module_info)
}
}
#[async_trait::async_trait(?Send)]
impl<'a> deno_graph::ModuleAnalyzer for ModuleInfoCacheModuleAnalyzer<'a> {
async fn analyze(
@ -139,20 +229,10 @@ impl<'a> deno_graph::ModuleAnalyzer for ModuleInfoCacheModuleAnalyzer<'a> {
) -> Result<ModuleInfo, deno_ast::ParseDiagnostic> {
// attempt to load from the cache
let source_hash = CacheDBHash::from_source(&source);
match self.module_info_cache.get_module_info(
specifier,
media_type,
source_hash,
) {
Ok(Some(info)) => return Ok(info),
Ok(None) => {}
Err(err) => {
log::debug!(
"Error loading module cache info for {}. {:#}",
specifier,
err
);
}
if let Some(info) =
self.load_cached_module_info(specifier, media_type, source_hash)
{
return Ok(info);
}
// otherwise, get the module info from the parsed source cache
@ -169,18 +249,12 @@ impl<'a> deno_graph::ModuleAnalyzer for ModuleInfoCacheModuleAnalyzer<'a> {
.unwrap()?;
// then attempt to cache it
if let Err(err) = self.module_info_cache.set_module_info(
self.save_module_info_to_cache(
specifier,
media_type,
source_hash,
&module_info,
) {
log::debug!(
"Error saving module cache info for {}. {:#}",
specifier,
err
);
}
);
Ok(module_info)
}
@ -202,7 +276,7 @@ fn serialize_media_type(media_type: MediaType) -> i64 {
Tsx => 11,
Json => 12,
Wasm => 13,
TsBuildInfo => 14,
Css => 14,
SourceMap => 15,
Unknown => 16,
}
@ -217,7 +291,7 @@ mod test {
#[test]
pub fn module_info_cache_general_use() {
let cache = ModuleInfoCache::new_in_memory("1.0.0");
let cache = ModuleInfoCache::new_in_memory("1.0.0", Default::default());
let specifier1 =
ModuleSpecifier::parse("https://localhost/mod.ts").unwrap();
let specifier2 =


@ -5,12 +5,11 @@ use std::sync::Arc;
use deno_ast::MediaType;
use deno_ast::ModuleSpecifier;
use deno_ast::ParseDiagnostic;
use deno_ast::ParsedSource;
use deno_core::parking_lot::Mutex;
use deno_graph::CapturingModuleParser;
use deno_graph::DefaultModuleParser;
use deno_graph::ModuleParser;
use deno_graph::CapturingEsParser;
use deno_graph::DefaultEsParser;
use deno_graph::EsParser;
use deno_graph::ParseOptions;
use deno_graph::ParsedSourceStore;
@ -47,7 +46,7 @@ impl<'a> LazyGraphSourceParser<'a> {
}
}
#[derive(Default)]
#[derive(Debug, Default)]
pub struct ParsedSourceCache {
sources: Mutex<HashMap<ModuleSpecifier, ParsedSource>>,
}
@ -58,12 +57,11 @@ impl ParsedSourceCache {
module: &deno_graph::JsModule,
) -> Result<ParsedSource, deno_ast::ParseDiagnostic> {
let parser = self.as_capturing_parser();
// this will conditionally parse because it's using a CapturingModuleParser
parser.parse_module(ParseOptions {
// this will conditionally parse because it's using a CapturingEsParser
parser.parse_program(ParseOptions {
specifier: &module.specifier,
source: module.source.clone(),
media_type: module.media_type,
// don't bother enabling because this method is currently only used for vendoring
scope_analysis: false,
})
}
@ -87,10 +85,9 @@ impl ParsedSourceCache {
specifier,
source,
media_type,
// don't bother enabling because this method is currently only used for emitting
scope_analysis: false,
};
DefaultModuleParser.parse_module(options)
DefaultEsParser.parse_program(options)
}
/// Frees the parsed source from memory.
@ -100,8 +97,8 @@ impl ParsedSourceCache {
/// Creates a parser that will reuse a ParsedSource from the store
/// if it exists, or else parse.
pub fn as_capturing_parser(&self) -> CapturingModuleParser {
CapturingModuleParser::new(None, self)
pub fn as_capturing_parser(&self) -> CapturingEsParser {
CapturingEsParser::new(None, self)
}
}
@ -150,42 +147,3 @@ impl deno_graph::ParsedSourceStore for ParsedSourceCache {
}
}
}
pub struct EsmOrCjsChecker {
parsed_source_cache: Arc<ParsedSourceCache>,
}
impl EsmOrCjsChecker {
pub fn new(parsed_source_cache: Arc<ParsedSourceCache>) -> Self {
Self {
parsed_source_cache,
}
}
pub fn is_esm(
&self,
specifier: &ModuleSpecifier,
source: Arc<str>,
media_type: MediaType,
) -> Result<bool, ParseDiagnostic> {
// todo(dsherret): add a file cache here to avoid parsing with swc on each run
let source = match self.parsed_source_cache.get_parsed_source(specifier) {
Some(source) => source.clone(),
None => {
let source = deno_ast::parse_program(deno_ast::ParseParams {
specifier: specifier.clone(),
text: source,
media_type,
capture_tokens: true, // capture because it's used for cjs export analysis
scope_analysis: false,
maybe_syntax: None,
})?;
self
.parsed_source_cache
.set_parsed_source(specifier.clone(), source.clone());
source
}
};
Ok(source.is_module())
}
}


@ -3,11 +3,14 @@
use crate::cache::EmitCache;
use crate::cache::FastInsecureHasher;
use crate::cache::ParsedSourceCache;
use crate::resolver::CjsTracker;
use deno_ast::ModuleKind;
use deno_ast::SourceMapOption;
use deno_ast::SourceRange;
use deno_ast::SourceRanged;
use deno_ast::SourceRangedForSpanned;
use deno_ast::TranspileModuleOptions;
use deno_ast::TranspileResult;
use deno_core::error::AnyError;
use deno_core::futures::stream::FuturesUnordered;
@ -19,7 +22,9 @@ use deno_graph::Module;
use deno_graph::ModuleGraph;
use std::sync::Arc;
#[derive(Debug)]
pub struct Emitter {
cjs_tracker: Arc<CjsTracker>,
emit_cache: Arc<EmitCache>,
parsed_source_cache: Arc<ParsedSourceCache>,
transpile_and_emit_options:
@ -30,6 +35,7 @@ pub struct Emitter {
impl Emitter {
pub fn new(
cjs_tracker: Arc<CjsTracker>,
emit_cache: Arc<EmitCache>,
parsed_source_cache: Arc<ParsedSourceCache>,
transpile_options: deno_ast::TranspileOptions,
@ -42,6 +48,7 @@ impl Emitter {
hasher.finish()
};
Self {
cjs_tracker,
emit_cache,
parsed_source_cache,
transpile_and_emit_options: Arc::new((transpile_options, emit_options)),
@ -59,21 +66,19 @@ impl Emitter {
continue;
};
// todo(https://github.com/denoland/deno_media_type/pull/12): use is_emittable()
let is_emittable = matches!(
module.media_type,
MediaType::TypeScript
| MediaType::Mts
| MediaType::Cts
| MediaType::Jsx
| MediaType::Tsx
);
if is_emittable {
if module.media_type.is_emittable() {
futures.push(
self
.emit_parsed_source(
&module.specifier,
module.media_type,
ModuleKind::from_is_cjs(
self.cjs_tracker.is_cjs_with_known_is_script(
&module.specifier,
module.media_type,
module.is_script,
)?,
),
&module.source,
)
.boxed_local(),
@ -92,9 +97,10 @@ impl Emitter {
pub fn maybe_cached_emit(
&self,
specifier: &ModuleSpecifier,
module_kind: deno_ast::ModuleKind,
source: &str,
) -> Option<String> {
let source_hash = self.get_source_hash(source);
let source_hash = self.get_source_hash(module_kind, source);
self.emit_cache.get_emit_code(specifier, source_hash)
}
@ -102,11 +108,12 @@ impl Emitter {
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
module_kind: deno_ast::ModuleKind,
source: &Arc<str>,
) -> Result<String, AnyError> {
// Note: keep this in sync with the sync version below
let helper = EmitParsedSourceHelper(self);
match helper.pre_emit_parsed_source(specifier, source) {
match helper.pre_emit_parsed_source(specifier, module_kind, source) {
PreEmitResult::Cached(emitted_text) => Ok(emitted_text),
PreEmitResult::NotCached { source_hash } => {
let parsed_source_cache = self.parsed_source_cache.clone();
@ -119,8 +126,9 @@ impl Emitter {
EmitParsedSourceHelper::transpile(
&parsed_source_cache,
&specifier,
source.clone(),
media_type,
module_kind,
source.clone(),
&transpile_and_emit_options.0,
&transpile_and_emit_options.1,
)
@ -142,18 +150,20 @@ impl Emitter {
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
module_kind: deno_ast::ModuleKind,
source: &Arc<str>,
) -> Result<String, AnyError> {
// Note: keep this in sync with the async version above
let helper = EmitParsedSourceHelper(self);
match helper.pre_emit_parsed_source(specifier, source) {
match helper.pre_emit_parsed_source(specifier, module_kind, source) {
PreEmitResult::Cached(emitted_text) => Ok(emitted_text),
PreEmitResult::NotCached { source_hash } => {
let transpiled_source = EmitParsedSourceHelper::transpile(
&self.parsed_source_cache,
specifier,
source.clone(),
media_type,
module_kind,
source.clone(),
&self.transpile_and_emit_options.0,
&self.transpile_and_emit_options.1,
)?;
@ -171,6 +181,7 @@ impl Emitter {
pub async fn load_and_emit_for_hmr(
&self,
specifier: &ModuleSpecifier,
module_kind: deno_ast::ModuleKind,
) -> Result<String, AnyError> {
let media_type = MediaType::from_specifier(specifier);
let source_code = tokio::fs::read_to_string(
@ -193,9 +204,14 @@ impl Emitter {
let mut options = self.transpile_and_emit_options.1.clone();
options.source_map = SourceMapOption::None;
let transpiled_source = parsed_source
.transpile(&self.transpile_and_emit_options.0, &options)?
.into_source()
.into_string()?;
.transpile(
&self.transpile_and_emit_options.0,
&deno_ast::TranspileModuleOptions {
module_kind: Some(module_kind),
},
&options,
)?
.into_source();
Ok(transpiled_source.text)
}
MediaType::JavaScript
@ -206,7 +222,7 @@ impl Emitter {
| MediaType::Dcts
| MediaType::Json
| MediaType::Wasm
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::SourceMap
| MediaType::Unknown => {
// clear this specifier from the parsed source cache as it's now out of date
@ -219,10 +235,11 @@ impl Emitter {
/// A hashing function that takes the source code and uses the global emit
/// options then generates a string hash which can be stored to
/// determine if the cached emit is valid or not.
fn get_source_hash(&self, source_text: &str) -> u64 {
fn get_source_hash(&self, module_kind: ModuleKind, source_text: &str) -> u64 {
FastInsecureHasher::new_without_deno_version() // stored in the transpile_and_emit_options_hash
.write_str(source_text)
.write_u64(self.transpile_and_emit_options_hash)
.write_hashable(module_kind)
.finish()
}
}
@ -239,9 +256,10 @@ impl<'a> EmitParsedSourceHelper<'a> {
pub fn pre_emit_parsed_source(
&self,
specifier: &ModuleSpecifier,
module_kind: deno_ast::ModuleKind,
source: &Arc<str>,
) -> PreEmitResult {
let source_hash = self.0.get_source_hash(source);
let source_hash = self.0.get_source_hash(module_kind, source);
if let Some(emit_code) =
self.0.emit_cache.get_emit_code(specifier, source_hash)
@ -255,8 +273,9 @@ impl<'a> EmitParsedSourceHelper<'a> {
pub fn transpile(
parsed_source_cache: &ParsedSourceCache,
specifier: &ModuleSpecifier,
source: Arc<str>,
media_type: MediaType,
module_kind: deno_ast::ModuleKind,
source: Arc<str>,
transpile_options: &deno_ast::TranspileOptions,
emit_options: &deno_ast::EmitOptions,
) -> Result<String, AnyError> {
@ -265,8 +284,13 @@ impl<'a> EmitParsedSourceHelper<'a> {
let parsed_source = parsed_source_cache
.remove_or_parse_module(specifier, source, media_type)?;
ensure_no_import_assertion(&parsed_source)?;
let transpile_result =
parsed_source.transpile(transpile_options, emit_options)?;
let transpile_result = parsed_source.transpile(
transpile_options,
&TranspileModuleOptions {
module_kind: Some(module_kind),
},
emit_options,
)?;
let transpiled_source = match transpile_result {
TranspileResult::Owned(source) => source,
TranspileResult::Cloned(source) => {
@ -275,8 +299,7 @@ impl<'a> EmitParsedSourceHelper<'a> {
}
};
debug_assert!(transpiled_source.source_map.is_none());
let text = String::from_utf8(transpiled_source.source)?;
Ok(text)
Ok(transpiled_source.text)
}
pub fn post_emit_parsed_source(
@ -321,7 +344,7 @@ fn ensure_no_import_assertion(
deno_core::anyhow::anyhow!("{}", msg)
}
let Some(module) = parsed_source.program_ref().as_module() else {
let deno_ast::ProgramRef::Module(module) = parsed_source.program_ref() else {
return Ok(());
};


@ -88,6 +88,10 @@ fn get_resolution_error_class(err: &ResolutionError) -> &'static str {
}
}
fn get_try_from_int_error_class(_: &std::num::TryFromIntError) -> &'static str {
"TypeError"
}
pub fn get_error_class_name(e: &AnyError) -> &'static str {
deno_runtime::errors::get_error_class_name(e)
.or_else(|| {
@ -106,5 +110,9 @@ pub fn get_error_class_name(e: &AnyError) -> &'static str {
e.downcast_ref::<ResolutionError>()
.map(get_resolution_error_class)
})
.or_else(|| {
e.downcast_ref::<std::num::TryFromIntError>()
.map(get_try_from_int_error_class)
})
.unwrap_or("Error")
}


@ -11,10 +11,10 @@ use crate::args::StorageKeyResolver;
use crate::args::TsConfigType;
use crate::cache::Caches;
use crate::cache::CodeCache;
use crate::cache::DenoCacheEnvFsAdapter;
use crate::cache::DenoDir;
use crate::cache::DenoDirProvider;
use crate::cache::EmitCache;
use crate::cache::EsmOrCjsChecker;
use crate::cache::GlobalHttpCache;
use crate::cache::HttpCache;
use crate::cache::LocalHttpCache;
@ -33,12 +33,16 @@ use crate::module_loader::ModuleLoadPreparer;
use crate::node::CliCjsCodeAnalyzer;
use crate::node::CliNodeCodeTranslator;
use crate::npm::create_cli_npm_resolver;
use crate::npm::create_in_npm_pkg_checker;
use crate::npm::CliByonmNpmResolverCreateOptions;
use crate::npm::CliManagedInNpmPkgCheckerCreateOptions;
use crate::npm::CliManagedNpmResolverCreateOptions;
use crate::npm::CliNpmResolver;
use crate::npm::CliNpmResolverCreateOptions;
use crate::npm::CliNpmResolverManagedCreateOptions;
use crate::npm::CliNpmResolverManagedSnapshotOption;
use crate::resolver::CjsResolutionStore;
use crate::npm::CreateInNpmPkgCheckerOptions;
use crate::resolver::CjsTracker;
use crate::resolver::CjsTrackerOptions;
use crate::resolver::CliDenoResolverFs;
use crate::resolver::CliGraphResolver;
use crate::resolver::CliGraphResolverOptions;
@ -51,6 +55,7 @@ use crate::tools::check::TypeChecker;
use crate::tools::coverage::CoverageCollector;
use crate::tools::lint::LintRuleProvider;
use crate::tools::run::hmr::HmrRunner;
use crate::tsc::TypeCheckingCjsTracker;
use crate::util::file_watcher::WatcherCommunicator;
use crate::util::fs::canonicalize_path_maybe_not_exists;
use crate::util::progress_bar::ProgressBar;
@ -59,6 +64,7 @@ use crate::worker::CliMainWorkerFactory;
use crate::worker::CliMainWorkerOptions;
use std::path::PathBuf;
use deno_cache_dir::npm::NpmCacheDir;
use deno_config::workspace::PackageJsonDepResolution;
use deno_config::workspace::WorkspaceResolver;
use deno_core::error::AnyError;
@ -68,6 +74,7 @@ use deno_core::FeatureChecker;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::DenoFsNodeResolverEnv;
use deno_runtime::deno_node::NodeResolver;
use deno_runtime::deno_node::PackageJsonResolver;
use deno_runtime::deno_permissions::Permissions;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_tls::rustls::RootCertStore;
@ -77,6 +84,7 @@ use deno_runtime::inspector_server::InspectorServer;
use deno_runtime::permissions::RuntimePermissionDescriptorParser;
use log::warn;
use node_resolver::analyze::NodeCodeTranslator;
use node_resolver::InNpmPackageChecker;
use once_cell::sync::OnceCell;
use std::future::Future;
use std::sync::Arc;
@ -164,39 +172,41 @@ impl<T> Deferred<T> {
#[derive(Default)]
struct CliFactoryServices {
cli_options: Deferred<Arc<CliOptions>>,
blob_store: Deferred<Arc<BlobStore>>,
caches: Deferred<Arc<Caches>>,
cjs_tracker: Deferred<Arc<CjsTracker>>,
cli_node_resolver: Deferred<Arc<CliNodeResolver>>,
cli_options: Deferred<Arc<CliOptions>>,
code_cache: Deferred<Arc<CodeCache>>,
emit_cache: Deferred<Arc<EmitCache>>,
emitter: Deferred<Arc<Emitter>>,
feature_checker: Deferred<Arc<FeatureChecker>>,
file_fetcher: Deferred<Arc<FileFetcher>>,
fs: Deferred<Arc<dyn deno_fs::FileSystem>>,
global_http_cache: Deferred<Arc<GlobalHttpCache>>,
http_cache: Deferred<Arc<dyn HttpCache>>,
http_client_provider: Deferred<Arc<HttpClientProvider>>,
emit_cache: Deferred<Arc<EmitCache>>,
emitter: Deferred<Arc<Emitter>>,
esm_or_cjs_checker: Deferred<Arc<EsmOrCjsChecker>>,
fs: Deferred<Arc<dyn deno_fs::FileSystem>>,
in_npm_pkg_checker: Deferred<Arc<dyn InNpmPackageChecker>>,
main_graph_container: Deferred<Arc<MainModuleGraphContainer>>,
maybe_inspector_server: Deferred<Option<Arc<InspectorServer>>>,
root_cert_store_provider: Deferred<Arc<dyn RootCertStoreProvider>>,
blob_store: Deferred<Arc<BlobStore>>,
module_info_cache: Deferred<Arc<ModuleInfoCache>>,
parsed_source_cache: Deferred<Arc<ParsedSourceCache>>,
resolver: Deferred<Arc<CliGraphResolver>>,
maybe_file_watcher_reporter: Deferred<Option<FileWatcherReporter>>,
maybe_inspector_server: Deferred<Option<Arc<InspectorServer>>>,
module_graph_builder: Deferred<Arc<ModuleGraphBuilder>>,
module_graph_creator: Deferred<Arc<ModuleGraphCreator>>,
module_info_cache: Deferred<Arc<ModuleInfoCache>>,
module_load_preparer: Deferred<Arc<ModuleLoadPreparer>>,
node_code_translator: Deferred<Arc<CliNodeCodeTranslator>>,
node_resolver: Deferred<Arc<NodeResolver>>,
npm_cache_dir: Deferred<Arc<NpmCacheDir>>,
npm_resolver: Deferred<Arc<dyn CliNpmResolver>>,
parsed_source_cache: Deferred<Arc<ParsedSourceCache>>,
permission_desc_parser: Deferred<Arc<RuntimePermissionDescriptorParser>>,
pkg_json_resolver: Deferred<Arc<PackageJsonResolver>>,
resolver: Deferred<Arc<CliGraphResolver>>,
root_cert_store_provider: Deferred<Arc<dyn RootCertStoreProvider>>,
root_permissions_container: Deferred<PermissionsContainer>,
sloppy_imports_resolver: Deferred<Option<Arc<CliSloppyImportsResolver>>>,
text_only_progress_bar: Deferred<ProgressBar>,
type_checker: Deferred<Arc<TypeChecker>>,
cjs_resolutions: Deferred<Arc<CjsResolutionStore>>,
cli_node_resolver: Deferred<Arc<CliNodeResolver>>,
feature_checker: Deferred<Arc<FeatureChecker>>,
code_cache: Deferred<Arc<CodeCache>>,
workspace_resolver: Deferred<Arc<WorkspaceResolver>>,
}
@ -300,12 +310,6 @@ impl CliFactory {
.get_or_init(|| ProgressBar::new(ProgressBarStyle::TextOnly))
}
pub fn esm_or_cjs_checker(&self) -> &Arc<EsmOrCjsChecker> {
self.services.esm_or_cjs_checker.get_or_init(|| {
Arc::new(EsmOrCjsChecker::new(self.parsed_source_cache().clone()))
})
}
pub fn global_http_cache(&self) -> Result<&Arc<GlobalHttpCache>, AnyError> {
self.services.global_http_cache.get_or_try_init(|| {
Ok(Arc::new(GlobalHttpCache::new(
@ -359,56 +363,112 @@ impl CliFactory {
self.services.fs.get_or_init(|| Arc::new(deno_fs::RealFs))
}
pub fn in_npm_pkg_checker(
&self,
) -> Result<&Arc<dyn InNpmPackageChecker>, AnyError> {
self.services.in_npm_pkg_checker.get_or_try_init(|| {
let cli_options = self.cli_options()?;
let options = if cli_options.use_byonm() {
CreateInNpmPkgCheckerOptions::Byonm
} else {
CreateInNpmPkgCheckerOptions::Managed(
CliManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: self.npm_cache_dir()?.root_dir_url(),
maybe_node_modules_path: cli_options
.node_modules_dir_path()
.map(|p| p.as_path()),
},
)
};
Ok(create_in_npm_pkg_checker(options))
})
}
pub fn npm_cache_dir(&self) -> Result<&Arc<NpmCacheDir>, AnyError> {
self.services.npm_cache_dir.get_or_try_init(|| {
let fs = self.fs();
let global_path = self.deno_dir()?.npm_folder_path();
let cli_options = self.cli_options()?;
Ok(Arc::new(NpmCacheDir::new(
&DenoCacheEnvFsAdapter(fs.as_ref()),
global_path,
cli_options.npmrc().get_all_known_registries_urls(),
)))
})
}
pub async fn npm_resolver(
&self,
) -> Result<&Arc<dyn CliNpmResolver>, AnyError> {
self
.services
.npm_resolver
.get_or_try_init_async(async {
let fs = self.fs();
let cli_options = self.cli_options()?;
// For `deno install` we want to force the managed resolver so it can set up `node_modules/` directory.
create_cli_npm_resolver(if cli_options.use_byonm() && !matches!(cli_options.sub_command(), DenoSubcommand::Install(_) | DenoSubcommand::Add(_) | DenoSubcommand::Remove(_)) {
CliNpmResolverCreateOptions::Byonm(CliByonmNpmResolverCreateOptions {
fs: CliDenoResolverFs(fs.clone()),
root_node_modules_dir: Some(match cli_options.node_modules_dir_path() {
Some(node_modules_path) => node_modules_path.to_path_buf(),
// path needs to be canonicalized for node resolution
// (node_modules_dir_path above is already canonicalized)
None => canonicalize_path_maybe_not_exists(cli_options.initial_cwd())?
.join("node_modules"),
}),
})
} else {
CliNpmResolverCreateOptions::Managed(CliNpmResolverManagedCreateOptions {
snapshot: match cli_options.resolve_npm_resolution_snapshot()? {
Some(snapshot) => {
CliNpmResolverManagedSnapshotOption::Specified(Some(snapshot))
}
None => match cli_options.maybe_lockfile() {
Some(lockfile) => {
CliNpmResolverManagedSnapshotOption::ResolveFromLockfile(
lockfile.clone(),
)
}
None => CliNpmResolverManagedSnapshotOption::Specified(None),
.get_or_try_init_async(
async {
let fs = self.fs();
let cli_options = self.cli_options()?;
create_cli_npm_resolver(if cli_options.use_byonm() {
CliNpmResolverCreateOptions::Byonm(
CliByonmNpmResolverCreateOptions {
fs: CliDenoResolverFs(fs.clone()),
pkg_json_resolver: self.pkg_json_resolver().clone(),
root_node_modules_dir: Some(
match cli_options.node_modules_dir_path() {
Some(node_modules_path) => node_modules_path.to_path_buf(),
// path needs to be canonicalized for node resolution
// (node_modules_dir_path above is already canonicalized)
None => canonicalize_path_maybe_not_exists(
cli_options.initial_cwd(),
)?
.join("node_modules"),
},
),
},
},
maybe_lockfile: cli_options.maybe_lockfile().cloned(),
fs: fs.clone(),
http_client_provider: self.http_client_provider().clone(),
npm_global_cache_dir: self.deno_dir()?.npm_folder_path(),
cache_setting: cli_options.cache_setting(),
text_only_progress_bar: self.text_only_progress_bar().clone(),
maybe_node_modules_path: cli_options.node_modules_dir_path().cloned(),
npm_install_deps_provider: Arc::new(NpmInstallDepsProvider::from_workspace(cli_options.workspace())),
npm_system_info: cli_options.npm_system_info(),
npmrc: cli_options.npmrc().clone(),
lifecycle_scripts: cli_options.lifecycle_scripts_config(),
)
} else {
CliNpmResolverCreateOptions::Managed(
CliManagedNpmResolverCreateOptions {
snapshot: match cli_options.resolve_npm_resolution_snapshot()? {
Some(snapshot) => {
CliNpmResolverManagedSnapshotOption::Specified(Some(
snapshot,
))
}
None => match cli_options.maybe_lockfile() {
Some(lockfile) => {
CliNpmResolverManagedSnapshotOption::ResolveFromLockfile(
lockfile.clone(),
)
}
None => {
CliNpmResolverManagedSnapshotOption::Specified(None)
}
},
},
maybe_lockfile: cli_options.maybe_lockfile().cloned(),
fs: fs.clone(),
http_client_provider: self.http_client_provider().clone(),
npm_cache_dir: self.npm_cache_dir()?.clone(),
cache_setting: cli_options.cache_setting(),
text_only_progress_bar: self.text_only_progress_bar().clone(),
maybe_node_modules_path: cli_options
.node_modules_dir_path()
.cloned(),
npm_install_deps_provider: Arc::new(
NpmInstallDepsProvider::from_workspace(
cli_options.workspace(),
),
),
npm_system_info: cli_options.npm_system_info(),
npmrc: cli_options.npmrc().clone(),
lifecycle_scripts: cli_options.lifecycle_scripts_config(),
},
)
})
}).await
}.boxed_local())
.await
}
.boxed_local(),
)
.await
}
@ -513,6 +573,7 @@ impl CliFactory {
self.services.module_info_cache.get_or_try_init(|| {
Ok(Arc::new(ModuleInfoCache::new(
self.caches()?.dep_analysis_db(),
self.parsed_source_cache().clone(),
)))
})
}
@ -541,6 +602,7 @@ impl CliFactory {
ts_config_result.ts_config,
)?;
Ok(Arc::new(Emitter::new(
self.cjs_tracker()?.clone(),
self.emit_cache()?.clone(),
self.parsed_source_cache().clone(),
transpile_options,
@ -564,7 +626,9 @@ impl CliFactory {
async {
Ok(Arc::new(NodeResolver::new(
DenoFsNodeResolverEnv::new(self.fs().clone()),
self.in_npm_pkg_checker()?.clone(),
self.npm_resolver().await?.clone().into_npm_resolver(),
self.pkg_json_resolver().clone(),
)))
}
.boxed_local(),
@ -582,24 +646,35 @@ impl CliFactory {
let caches = self.caches()?;
let node_analysis_cache =
NodeAnalysisCache::new(caches.node_analysis_db());
let node_resolver = self.cli_node_resolver().await?.clone();
let node_resolver = self.node_resolver().await?.clone();
let cjs_esm_analyzer = CliCjsCodeAnalyzer::new(
node_analysis_cache,
self.cjs_tracker()?.clone(),
self.fs().clone(),
node_resolver,
Some(self.parsed_source_cache().clone()),
self.cli_options()?.is_npm_main(),
);
Ok(Arc::new(NodeCodeTranslator::new(
cjs_esm_analyzer,
DenoFsNodeResolverEnv::new(self.fs().clone()),
self.node_resolver().await?.clone(),
self.in_npm_pkg_checker()?.clone(),
node_resolver,
self.npm_resolver().await?.clone().into_npm_resolver(),
self.pkg_json_resolver().clone(),
)))
})
.await
}
pub fn pkg_json_resolver(&self) -> &Arc<PackageJsonResolver> {
self.services.pkg_json_resolver.get_or_init(|| {
Arc::new(PackageJsonResolver::new(DenoFsNodeResolverEnv::new(
self.fs().clone(),
)))
})
}
pub async fn type_checker(&self) -> Result<&Arc<TypeChecker>, AnyError> {
self
.services
@ -608,6 +683,10 @@ impl CliFactory {
let cli_options = self.cli_options()?;
Ok(Arc::new(TypeChecker::new(
self.caches()?.clone(),
Arc::new(TypeCheckingCjsTracker::new(
self.cjs_tracker()?.clone(),
self.module_info_cache()?.clone(),
)),
cli_options.clone(),
self.module_graph_builder().await?.clone(),
self.node_resolver().await?.clone(),
@ -626,19 +705,18 @@ impl CliFactory {
.get_or_try_init_async(async {
let cli_options = self.cli_options()?;
Ok(Arc::new(ModuleGraphBuilder::new(
cli_options.clone(),
self.caches()?.clone(),
self.esm_or_cjs_checker().clone(),
cli_options.clone(),
self.file_fetcher()?.clone(),
self.fs().clone(),
self.resolver().await?.clone(),
self.cli_node_resolver().await?.clone(),
self.npm_resolver().await?.clone(),
self.module_info_cache()?.clone(),
self.parsed_source_cache().clone(),
self.global_http_cache()?.clone(),
self.in_npm_pkg_checker()?.clone(),
cli_options.maybe_lockfile().cloned(),
self.maybe_file_watcher_reporter().clone(),
self.file_fetcher()?.clone(),
self.global_http_cache()?.clone(),
self.module_info_cache()?.clone(),
self.npm_resolver().await?.clone(),
self.parsed_source_cache().clone(),
self.resolver().await?.clone(),
self.root_permissions_container()?.clone(),
)))
})
@ -710,8 +788,17 @@ impl CliFactory {
.await
}
pub fn cjs_resolutions(&self) -> &Arc<CjsResolutionStore> {
self.services.cjs_resolutions.get_or_init(Default::default)
pub fn cjs_tracker(&self) -> Result<&Arc<CjsTracker>, AnyError> {
self.services.cjs_tracker.get_or_try_init(|| {
let options = self.cli_options()?;
Ok(Arc::new(CjsTracker::new(
self.in_npm_pkg_checker()?.clone(),
self.pkg_json_resolver().clone(),
CjsTrackerOptions {
unstable_detect_cjs: options.unstable_detect_cjs(),
},
)))
})
}
pub async fn cli_node_resolver(
@ -722,8 +809,9 @@ impl CliFactory {
.cli_node_resolver
.get_or_try_init_async(async {
Ok(Arc::new(CliNodeResolver::new(
self.cjs_resolutions().clone(),
self.cjs_tracker()?.clone(),
self.fs().clone(),
self.in_npm_pkg_checker()?.clone(),
self.node_resolver().await?.clone(),
self.npm_resolver().await?.clone(),
)))
@ -761,6 +849,7 @@ impl CliFactory {
) -> Result<DenoCompileBinaryWriter, AnyError> {
let cli_options = self.cli_options()?;
Ok(DenoCompileBinaryWriter::new(
self.cjs_tracker()?,
self.deno_dir()?,
self.emitter()?,
self.file_fetcher()?,
@ -791,53 +880,60 @@ impl CliFactory {
&self,
) -> Result<CliMainWorkerFactory, AnyError> {
let cli_options = self.cli_options()?;
let fs = self.fs();
let node_resolver = self.node_resolver().await?;
let npm_resolver = self.npm_resolver().await?;
let fs = self.fs();
let cli_node_resolver = self.cli_node_resolver().await?;
let cli_npm_resolver = self.npm_resolver().await?.clone();
let in_npm_pkg_checker = self.in_npm_pkg_checker()?;
let maybe_file_watcher_communicator = if cli_options.has_hmr() {
Some(self.watcher_communicator.clone().unwrap())
} else {
None
};
let node_code_translator = self.node_code_translator().await?;
let cjs_tracker = self.cjs_tracker()?.clone();
let pkg_json_resolver = self.pkg_json_resolver().clone();
Ok(CliMainWorkerFactory::new(
self.blob_store().clone(),
self.cjs_resolutions().clone(),
if cli_options.code_cache_enabled() {
Some(self.code_cache()?.clone())
} else {
None
},
self.feature_checker()?.clone(),
self.fs().clone(),
fs.clone(),
maybe_file_watcher_communicator,
self.maybe_inspector_server()?.clone(),
cli_options.maybe_lockfile().cloned(),
Box::new(CliModuleLoaderFactory::new(
cli_options,
cjs_tracker,
if cli_options.code_cache_enabled() {
Some(self.code_cache()?.clone())
} else {
None
},
self.emitter()?.clone(),
fs.clone(),
in_npm_pkg_checker.clone(),
self.main_module_graph_container().await?.clone(),
self.module_load_preparer().await?.clone(),
node_code_translator.clone(),
cli_node_resolver.clone(),
cli_npm_resolver.clone(),
NpmModuleLoader::new(
self.cjs_resolutions().clone(),
self.node_code_translator().await?.clone(),
self.cjs_tracker()?.clone(),
fs.clone(),
cli_node_resolver.clone(),
node_code_translator.clone(),
),
self.parsed_source_cache().clone(),
self.resolver().await?.clone(),
)),
node_resolver.clone(),
npm_resolver.clone(),
pkg_json_resolver,
self.root_cert_store_provider().clone(),
self.root_permissions_container()?.clone(),
StorageKeyResolver::from_options(cli_options),
@ -853,8 +949,10 @@ impl CliFactory {
let create_hmr_runner = if cli_options.has_hmr() {
let watcher_communicator = self.watcher_communicator.clone().unwrap();
let emitter = self.emitter()?.clone();
let cjs_tracker = self.cjs_tracker()?.clone();
let fn_: crate::worker::CreateHmrRunnerCb = Box::new(move |session| {
Box::new(HmrRunner::new(
cjs_tracker.clone(),
emitter.clone(),
session,
watcher_communicator.clone(),
@ -891,7 +989,6 @@ impl CliFactory {
inspect_wait: cli_options.inspect_wait().is_some(),
strace_ops: cli_options.strace_ops().clone(),
is_inspecting: cli_options.is_inspecting(),
is_npm_main: cli_options.is_npm_main(),
location: cli_options.location_flag().clone(),
// if the user ran a binary command, we'll need to set process.argv[0]
// to be the name of the binary command instead of deno
@ -909,7 +1006,6 @@ impl CliFactory {
node_ipc: cli_options.node_ipc_fd(),
serve_port: cli_options.serve_port(),
serve_host: cli_options.serve_host(),
unstable_detect_cjs: cli_options.unstable_detect_cjs(),
})
}
}


@ -6,7 +6,6 @@ use crate::args::CliLockfile;
use crate::args::CliOptions;
use crate::args::DENO_DISABLE_PEDANTIC_NODE_WARNINGS;
use crate::cache;
use crate::cache::EsmOrCjsChecker;
use crate::cache::GlobalHttpCache;
use crate::cache::ModuleInfoCache;
use crate::cache::ParsedSourceCache;
@ -15,7 +14,6 @@ use crate::errors::get_error_class_name;
use crate::file_fetcher::FileFetcher;
use crate::npm::CliNpmResolver;
use crate::resolver::CliGraphResolver;
use crate::resolver::CliNodeResolver;
use crate::resolver::CliSloppyImportsResolver;
use crate::resolver::SloppyImportsCachedFs;
use crate::tools::check;
@ -50,6 +48,7 @@ use deno_runtime::deno_permissions::PermissionsContainer;
use deno_semver::jsr::JsrDepPackageReq;
use deno_semver::package::PackageNv;
use import_map::ImportMapError;
use node_resolver::InNpmPackageChecker;
use std::collections::HashSet;
use std::error::Error;
use std::ops::Deref;
@ -379,54 +378,51 @@ pub struct BuildFastCheckGraphOptions<'a> {
}
pub struct ModuleGraphBuilder {
options: Arc<CliOptions>,
caches: Arc<cache::Caches>,
esm_or_cjs_checker: Arc<EsmOrCjsChecker>,
cli_options: Arc<CliOptions>,
file_fetcher: Arc<FileFetcher>,
fs: Arc<dyn FileSystem>,
resolver: Arc<CliGraphResolver>,
node_resolver: Arc<CliNodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
module_info_cache: Arc<ModuleInfoCache>,
parsed_source_cache: Arc<ParsedSourceCache>,
global_http_cache: Arc<GlobalHttpCache>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
lockfile: Option<Arc<CliLockfile>>,
maybe_file_watcher_reporter: Option<FileWatcherReporter>,
file_fetcher: Arc<FileFetcher>,
global_http_cache: Arc<GlobalHttpCache>,
module_info_cache: Arc<ModuleInfoCache>,
npm_resolver: Arc<dyn CliNpmResolver>,
parsed_source_cache: Arc<ParsedSourceCache>,
resolver: Arc<CliGraphResolver>,
root_permissions_container: PermissionsContainer,
}
impl ModuleGraphBuilder {
#[allow(clippy::too_many_arguments)]
pub fn new(
options: Arc<CliOptions>,
caches: Arc<cache::Caches>,
esm_or_cjs_checker: Arc<EsmOrCjsChecker>,
cli_options: Arc<CliOptions>,
file_fetcher: Arc<FileFetcher>,
fs: Arc<dyn FileSystem>,
resolver: Arc<CliGraphResolver>,
node_resolver: Arc<CliNodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
module_info_cache: Arc<ModuleInfoCache>,
parsed_source_cache: Arc<ParsedSourceCache>,
global_http_cache: Arc<GlobalHttpCache>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
lockfile: Option<Arc<CliLockfile>>,
maybe_file_watcher_reporter: Option<FileWatcherReporter>,
file_fetcher: Arc<FileFetcher>,
global_http_cache: Arc<GlobalHttpCache>,
module_info_cache: Arc<ModuleInfoCache>,
npm_resolver: Arc<dyn CliNpmResolver>,
parsed_source_cache: Arc<ParsedSourceCache>,
resolver: Arc<CliGraphResolver>,
root_permissions_container: PermissionsContainer,
) -> Self {
Self {
options,
caches,
esm_or_cjs_checker,
cli_options,
file_fetcher,
fs,
resolver,
node_resolver,
npm_resolver,
module_info_cache,
parsed_source_cache,
global_http_cache,
in_npm_pkg_checker,
lockfile,
maybe_file_watcher_reporter,
file_fetcher,
global_http_cache,
module_info_cache,
npm_resolver,
parsed_source_cache,
resolver,
root_permissions_container,
}
}
@ -512,13 +508,11 @@ impl ModuleGraphBuilder {
}
let maybe_imports = if options.graph_kind.include_types() {
self.options.to_compiler_option_types()?
self.cli_options.to_compiler_option_types()?
} else {
Vec::new()
};
let analyzer = self
.module_info_cache
.as_module_analyzer(&self.parsed_source_cache);
let analyzer = self.module_info_cache.as_module_analyzer();
let mut loader = match options.loader {
Some(loader) => MutLoaderRef::Borrowed(loader),
None => MutLoaderRef::Owned(self.create_graph_loader()),
@ -566,7 +560,7 @@ impl ModuleGraphBuilder {
// ensure an "npm install" is done if the user has explicitly
// opted into using a node_modules directory
if self
.options
.cli_options
.node_modules_dir()?
.map(|m| m.uses_node_modules_dir())
.unwrap_or(false)
@ -677,10 +671,10 @@ impl ModuleGraphBuilder {
graph.build_fast_check_type_graph(
deno_graph::BuildFastCheckTypeGraphOptions {
jsr_url_provider: &CliJsrUrlProvider,
es_parser: Some(&parser),
fast_check_cache: fast_check_cache.as_ref().map(|c| c as _),
fast_check_dts: false,
module_parser: Some(&parser),
jsr_url_provider: &CliJsrUrlProvider,
resolver: Some(graph_resolver),
npm_resolver: Some(&graph_npm_resolver),
workspace_fast_check: options.workspace_fast_check,
@ -699,20 +693,18 @@ impl ModuleGraphBuilder {
permissions: PermissionsContainer,
) -> cache::FetchCacher {
cache::FetchCacher::new(
self.esm_or_cjs_checker.clone(),
self.file_fetcher.clone(),
self.fs.clone(),
self.global_http_cache.clone(),
self.node_resolver.clone(),
self.npm_resolver.clone(),
self.in_npm_pkg_checker.clone(),
self.module_info_cache.clone(),
cache::FetchCacherOptions {
file_header_overrides: self.options.resolve_file_header_overrides(),
file_header_overrides: self.cli_options.resolve_file_header_overrides(),
permissions,
is_deno_publish: matches!(
self.options.sub_command(),
self.cli_options.sub_command(),
crate::args::DenoSubcommand::Publish { .. }
),
unstable_detect_cjs: self.options.unstable_detect_cjs(),
},
)
}
@ -737,12 +729,12 @@ impl ModuleGraphBuilder {
&self.fs,
roots,
GraphValidOptions {
kind: if self.options.type_check_mode().is_true() {
kind: if self.cli_options.type_check_mode().is_true() {
GraphKind::All
} else {
GraphKind::CodeOnly
},
check_js: self.options.check_js(),
check_js: self.cli_options.check_js(),
exit_integrity_errors: true,
},
)


@ -10,9 +10,12 @@ use super::tsc;
use super::urls::url_to_uri;
use crate::args::jsr_url;
use crate::lsp::logging::lsp_warn;
use crate::lsp::search::PackageSearchApi;
use crate::tools::lint::CliLinter;
use crate::util::path::relative_specifier;
use deno_config::workspace::MappedResolution;
use deno_graph::source::ResolutionMode;
use deno_lint::diagnostic::LintDiagnosticRange;
use deno_ast::SourceRange;
@ -36,7 +39,6 @@ use deno_semver::package::PackageReq;
use deno_semver::package::PackageReqReference;
use deno_semver::Version;
use import_map::ImportMap;
use node_resolver::NpmResolver;
use once_cell::sync::Lazy;
use regex::Regex;
use std::borrow::Cow;
@ -229,6 +231,7 @@ pub struct TsResponseImportMapper<'a> {
documents: &'a Documents,
maybe_import_map: Option<&'a ImportMap>,
resolver: &'a LspResolver,
tsc_specifier_map: &'a tsc::TscSpecifierMap,
file_referrer: ModuleSpecifier,
}
@ -237,12 +240,14 @@ impl<'a> TsResponseImportMapper<'a> {
documents: &'a Documents,
maybe_import_map: Option<&'a ImportMap>,
resolver: &'a LspResolver,
tsc_specifier_map: &'a tsc::TscSpecifierMap,
file_referrer: &ModuleSpecifier,
) -> Self {
Self {
documents,
maybe_import_map,
resolver,
tsc_specifier_map,
file_referrer: file_referrer.clone(),
}
}
@ -336,7 +341,12 @@ impl<'a> TsResponseImportMapper<'a> {
.resolver
.maybe_managed_npm_resolver(Some(&self.file_referrer))
{
if npm_resolver.in_npm_package(specifier) {
let in_npm_pkg = self
.resolver
.maybe_node_resolver(Some(&self.file_referrer))
.map(|n| n.in_npm_package(specifier))
.unwrap_or(false);
if in_npm_pkg {
if let Ok(Some(pkg_id)) =
npm_resolver.resolve_pkg_id_from_specifier(specifier)
{
@ -383,6 +393,11 @@ impl<'a> TsResponseImportMapper<'a> {
}
}
}
} else if let Some(dep_name) = self
.resolver
.file_url_to_package_json_dep(specifier, Some(&self.file_referrer))
{
return Some(dep_name);
}
// check if the import map has this specifier
@ -453,19 +468,36 @@ impl<'a> TsResponseImportMapper<'a> {
specifier: &str,
referrer: &ModuleSpecifier,
) -> Option<String> {
if let Ok(specifier) = referrer.join(specifier) {
if let Some(specifier) = self.check_specifier(&specifier, referrer) {
return Some(specifier);
}
}
let specifier = specifier.strip_suffix(".js").unwrap_or(specifier);
for ext in SUPPORTED_EXTENSIONS {
let specifier_with_ext = format!("{specifier}{ext}");
if self
.documents
.contains_import(&specifier_with_ext, referrer)
let specifier_stem = specifier.strip_suffix(".js").unwrap_or(specifier);
let specifiers = std::iter::once(Cow::Borrowed(specifier)).chain(
SUPPORTED_EXTENSIONS
.iter()
.map(|ext| Cow::Owned(format!("{specifier_stem}{ext}"))),
);
for specifier in specifiers {
if let Some(specifier) = self
.resolver
.as_graph_resolver(Some(&self.file_referrer))
.resolve(
&specifier,
&deno_graph::Range {
specifier: referrer.clone(),
start: deno_graph::Position::zeroed(),
end: deno_graph::Position::zeroed(),
},
ResolutionMode::Types,
)
.ok()
.and_then(|s| self.tsc_specifier_map.normalize(s.as_str()).ok())
.filter(|s| self.documents.exists(s, Some(&self.file_referrer)))
{
return Some(specifier_with_ext);
if let Some(specifier) = self
.check_specifier(&specifier, referrer)
.or_else(|| relative_specifier(referrer, &specifier))
.filter(|s| !s.contains("/node_modules/"))
{
return Some(specifier);
}
}
}
None
@ -555,8 +587,9 @@ fn try_reverse_map_package_json_exports(
pub fn fix_ts_import_changes(
referrer: &ModuleSpecifier,
changes: &[tsc::FileTextChanges],
import_mapper: &TsResponseImportMapper,
language_server: &language_server::Inner,
) -> Result<Vec<tsc::FileTextChanges>, AnyError> {
let import_mapper = language_server.get_ts_response_import_mapper(referrer);
let mut r = Vec::new();
for change in changes {
let mut text_changes = Vec::new();
@ -601,7 +634,7 @@ pub fn fix_ts_import_changes(
fn fix_ts_import_action<'a>(
referrer: &ModuleSpecifier,
action: &'a tsc::CodeFixAction,
import_mapper: &TsResponseImportMapper,
language_server: &language_server::Inner,
) -> Option<Cow<'a, tsc::CodeFixAction>> {
if !matches!(
action.fix_name.as_str(),
@ -617,6 +650,7 @@ fn fix_ts_import_action<'a>(
let Some(specifier) = specifier else {
return Some(Cow::Borrowed(action));
};
let import_mapper = language_server.get_ts_response_import_mapper(referrer);
if let Some(new_specifier) =
import_mapper.check_unresolved_specifier(specifier, referrer)
{
@ -714,8 +748,14 @@ pub fn ts_changes_to_edit(
) -> Result<Option<lsp::WorkspaceEdit>, AnyError> {
let mut text_document_edits = Vec::new();
for change in changes {
let text_document_edit = change.to_text_document_edit(language_server)?;
text_document_edits.push(text_document_edit);
let edit = match change.to_text_document_edit(language_server) {
Ok(e) => e,
Err(err) => {
lsp_warn!("Couldn't covert text document edit: {:#}", err);
continue;
}
};
text_document_edits.push(edit);
}
Ok(Some(lsp::WorkspaceEdit {
changes: None,
@ -724,7 +764,7 @@ pub fn ts_changes_to_edit(
}))
}
#[derive(Debug, Deserialize)]
#[derive(Debug, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct CodeActionData {
pub specifier: ModuleSpecifier,
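
The `Serialize` derive added in this hunk lets the typed `CodeActionData` struct above be embedded in a code action's `data` field (see the `json!(CodeActionData { .. })` construction later in this file) and read back when the action is resolved. Below is a minimal standalone sketch of that round trip, not part of the diff, using a plain `String` in place of `ModuleSpecifier` so it compiles on its own:

```rust
use serde::{Deserialize, Serialize};

// Simplified stand-in for the struct in the hunk above; the String
// specifier is an assumption made to keep the sketch self-contained.
#[derive(Debug, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
struct CodeActionData {
    specifier: String,
    fix_id: String,
}

fn main() -> Result<(), serde_json::Error> {
    let data = CodeActionData {
        specifier: "file:///project/mod.ts".to_string(),
        fix_id: "example-fix-id".to_string(), // arbitrary id, for illustration only
    };
    // Serialized into the action's `data` field...
    let value = serde_json::to_value(&data)?;
    assert_eq!(value["fixId"], "example-fix-id");
    // ...and recovered when the code action is later resolved.
    let round_trip: CodeActionData = serde_json::from_value(value)?;
    assert_eq!(round_trip.fix_id, data.fix_id);
    Ok(())
}
```
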
@ -994,11 +1034,8 @@ impl CodeActionCollection {
"The action returned from TypeScript is unsupported.",
));
}
let Some(action) = fix_ts_import_action(
specifier,
action,
&language_server.get_ts_response_import_mapper(specifier),
) else {
let Some(action) = fix_ts_import_action(specifier, action, language_server)
else {
return Ok(());
};
let edit = ts_changes_to_edit(&action.changes, language_server)?;
@ -1047,10 +1084,12 @@ impl CodeActionCollection {
specifier: &ModuleSpecifier,
diagnostic: &lsp::Diagnostic,
) {
let data = Some(json!({
"specifier": specifier,
"fixId": action.fix_id,
}));
let data = action.fix_id.as_ref().map(|fix_id| {
json!(CodeActionData {
specifier: specifier.clone(),
fix_id: fix_id.clone(),
})
});
let title = if let Some(description) = &action.fix_all_description {
description.clone()
} else {
@ -1199,14 +1238,11 @@ impl CodeActionCollection {
}),
);
match parsed_source.program_ref() {
deno_ast::swc::ast::Program::Module(module) => module
.body
.iter()
.find(|i| i.range().contains(&specifier_range))
.map(|i| text_info.line_and_column_index(i.range().start)),
deno_ast::swc::ast::Program::Script(_) => None,
}
parsed_source
.program_ref()
.body()
.find(|i| i.range().contains(&specifier_range))
.map(|i| text_info.line_and_column_index(i.range().start))
}
async fn deno_types_for_npm_action(


@ -421,7 +421,7 @@ pub fn collect_test(
) -> Result<Vec<lsp::CodeLens>, AnyError> {
let mut collector =
DenoTestCollector::new(specifier.clone(), parsed_source.clone());
parsed_source.module().visit_with(&mut collector);
parsed_source.program().visit_with(&mut collector);
Ok(collector.take())
}
@ -581,7 +581,7 @@ mod tests {
.unwrap();
let mut collector =
DenoTestCollector::new(specifier, parsed_module.clone());
parsed_module.module().visit_with(&mut collector);
parsed_module.program().visit_with(&mut collector);
assert_eq!(
collector.take(),
vec![


@ -41,6 +41,7 @@ use deno_runtime::deno_node::PackageJson;
use indexmap::IndexSet;
use lsp_types::ClientCapabilities;
use std::collections::BTreeMap;
use std::collections::BTreeSet;
use std::collections::HashMap;
use std::ops::Deref;
use std::ops::DerefMut;
@ -984,7 +985,7 @@ impl Config {
| MediaType::Tsx => Some(&workspace_settings.typescript),
MediaType::Json
| MediaType::Wasm
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::SourceMap
| MediaType::Unknown => None,
}
@ -1190,6 +1191,7 @@ pub struct ConfigData {
pub resolver: Arc<WorkspaceResolver>,
pub sloppy_imports_resolver: Option<Arc<CliSloppyImportsResolver>>,
pub import_map_from_settings: Option<ModuleSpecifier>,
pub unstable: BTreeSet<String>,
watched_files: HashMap<ModuleSpecifier, ConfigWatchedFileType>,
}
@ -1587,9 +1589,16 @@ impl ConfigData {
.join("\n")
);
}
let unstable = member_dir
.workspace
.unstable_features()
.iter()
.chain(settings.unstable.as_deref())
.cloned()
.collect::<BTreeSet<_>>();
let unstable_sloppy_imports = std::env::var("DENO_UNSTABLE_SLOPPY_IMPORTS")
.is_ok()
|| member_dir.workspace.has_unstable("sloppy-imports");
|| unstable.contains("sloppy-imports");
let sloppy_imports_resolver = unstable_sloppy_imports.then(|| {
Arc::new(CliSloppyImportsResolver::new(
SloppyImportsCachedFs::new_without_stat_cache(Arc::new(
@ -1630,6 +1639,7 @@ impl ConfigData {
lockfile,
npmrc,
import_map_from_settings,
unstable,
watched_files,
}
}


@ -272,7 +272,7 @@ fn get_maybe_test_module_fut(
parsed_source.specifier().clone(),
parsed_source.text_info_lazy().clone(),
);
parsed_source.module().visit_with(&mut collector);
parsed_source.program().visit_with(&mut collector);
Arc::new(collector.take())
})
.map(Result::ok)
@ -332,12 +332,8 @@ impl Document {
.filter(|s| cache.is_valid_file_referrer(s))
.cloned()
.or(file_referrer);
let media_type = resolve_media_type(
&specifier,
maybe_headers.as_ref(),
maybe_language_id,
&resolver,
);
let media_type =
resolve_media_type(&specifier, maybe_headers.as_ref(), maybe_language_id);
let (maybe_parsed_source, maybe_module) =
if media_type_is_diagnosable(media_type) {
parse_and_analyze_module(
@ -399,7 +395,6 @@ impl Document {
&self.specifier,
self.maybe_headers.as_ref(),
self.maybe_language_id,
&resolver,
);
let dependencies;
let maybe_types_dependency;
@ -764,14 +759,7 @@ fn resolve_media_type(
specifier: &ModuleSpecifier,
maybe_headers: Option<&HashMap<String, String>>,
maybe_language_id: Option<LanguageId>,
resolver: &LspResolver,
) -> MediaType {
if resolver.in_node_modules(specifier) {
if let Some(media_type) = resolver.node_media_type(specifier) {
return media_type;
}
}
if let Some(language_id) = maybe_language_id {
return MediaType::from_specifier_and_content_type(
specifier,
@ -1071,34 +1059,6 @@ impl Documents {
self.cache.is_valid_file_referrer(specifier)
}
/// Return `true` if the provided specifier can be resolved to a document,
/// otherwise `false`.
pub fn contains_import(
&self,
specifier: &str,
referrer: &ModuleSpecifier,
) -> bool {
let file_referrer = self.get_file_referrer(referrer);
let maybe_specifier = self
.resolver
.as_graph_resolver(file_referrer.as_deref())
.resolve(
specifier,
&deno_graph::Range {
specifier: referrer.clone(),
start: deno_graph::Position::zeroed(),
end: deno_graph::Position::zeroed(),
},
ResolutionMode::Types,
)
.ok();
if let Some(import_specifier) = maybe_specifier {
self.exists(&import_specifier, file_referrer.as_deref())
} else {
false
}
}
pub fn resolve_document_specifier(
&self,
specifier: &ModuleSpecifier,
@ -1561,7 +1521,7 @@ fn parse_source(
text: Arc<str>,
media_type: MediaType,
) -> ParsedSourceResult {
deno_ast::parse_module(deno_ast::ParseParams {
deno_ast::parse_program(deno_ast::ParseParams {
specifier,
text,
media_type,


@ -863,7 +863,10 @@ impl Inner {
// We ignore these directories by default because there is a
// high likelihood they aren't relevant. Someone can opt-into
// them by specifying one of them as an enabled path.
if matches!(dir_name.as_str(), "vendor" | "node_modules" | ".git") {
if matches!(
dir_name.as_str(),
"vendor" | "coverage" | "node_modules" | ".git"
) {
continue;
}
// ignore cargo target directories for anyone using Deno with Rust
@ -904,7 +907,7 @@ impl Inner {
| MediaType::Tsx => {}
MediaType::Wasm
| MediaType::SourceMap
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::Unknown => {
if path.extension().and_then(|s| s.to_str()) != Some("jsonc") {
continue;
@ -1384,14 +1387,10 @@ impl Inner {
.clone();
fmt_options.use_tabs = Some(!params.options.insert_spaces);
fmt_options.indent_width = Some(params.options.tab_size as u8);
let maybe_workspace = self
.config
.tree
.data_for_specifier(&specifier)
.map(|d| &d.member_dir.workspace);
let config_data = self.config.tree.data_for_specifier(&specifier);
let unstable_options = UnstableFmtOptions {
component: maybe_workspace
.map(|w| w.has_unstable("fmt-component"))
component: config_data
.map(|d| d.unstable.contains("fmt-component"))
.unwrap_or(false),
};
let document = document.clone();
@ -1838,7 +1837,7 @@ impl Inner {
fix_ts_import_changes(
&code_action_data.specifier,
&combined_code_actions.changes,
&self.get_ts_response_import_mapper(&code_action_data.specifier),
self,
)
.map_err(|err| {
error!("Unable to remap changes: {:#}", err);
@ -1891,7 +1890,7 @@ impl Inner {
refactor_edit_info.edits = fix_ts_import_changes(
&action_data.specifier,
&refactor_edit_info.edits,
&self.get_ts_response_import_mapper(&action_data.specifier),
self,
)
.map_err(|err| {
error!("Unable to remap changes: {:#}", err);
@ -1922,7 +1921,8 @@ impl Inner {
// todo(dsherret): this should probably just take the resolver itself
// as the import map is an implementation detail
.and_then(|d| d.resolver.maybe_import_map()),
self.resolver.as_ref(),
&self.resolver,
&self.ts_server.specifier_map,
file_referrer,
)
}
@ -2285,7 +2285,11 @@ impl Inner {
.into(),
scope.cloned(),
)
.await;
.await
.unwrap_or_else(|err| {
error!("Unable to get completion info from TypeScript: {:#}", err);
None
});
if let Some(completions) = maybe_completion_info {
response = Some(
@ -3948,7 +3952,9 @@ mod tests {
fn test_walk_workspace() {
let temp_dir = TempDir::new();
temp_dir.create_dir_all("root1/vendor/");
temp_dir.create_dir_all("root1/coverage/");
temp_dir.write("root1/vendor/mod.ts", ""); // no, vendor
temp_dir.write("root1/coverage/mod.ts", ""); // no, coverage
temp_dir.create_dir_all("root1/node_modules/");
temp_dir.write("root1/node_modules/mod.ts", ""); // no, node_modules


@ -2,6 +2,8 @@
use dashmap::DashMap;
use deno_ast::MediaType;
use deno_ast::ParsedSource;
use deno_cache_dir::npm::NpmCacheDir;
use deno_cache_dir::HttpCache;
use deno_config::workspace::PackageJsonDepResolution;
use deno_config::workspace::WorkspaceResolver;
@ -14,15 +16,15 @@ use deno_path_util::url_to_file_path;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::NodeResolver;
use deno_runtime::deno_node::PackageJson;
use deno_runtime::deno_node::PackageJsonResolver;
use deno_semver::jsr::JsrPackageReqReference;
use deno_semver::npm::NpmPackageReqReference;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use indexmap::IndexMap;
use node_resolver::errors::ClosestPkgJsonError;
use node_resolver::NodeResolution;
use node_resolver::InNpmPackageChecker;
use node_resolver::NodeResolutionMode;
use node_resolver::NpmResolver;
use std::borrow::Cow;
use std::collections::BTreeMap;
use std::collections::BTreeSet;
@ -36,6 +38,7 @@ use crate::args::create_default_npmrc;
use crate::args::CacheSetting;
use crate::args::CliLockfile;
use crate::args::NpmInstallDepsProvider;
use crate::cache::DenoCacheEnvFsAdapter;
use crate::graph_util::CliJsrUrlProvider;
use crate::http_util::HttpClientProvider;
use crate::lsp::config::Config;
@ -43,40 +46,50 @@ use crate::lsp::config::ConfigData;
use crate::lsp::logging::lsp_warn;
use crate::npm::create_cli_npm_resolver_for_lsp;
use crate::npm::CliByonmNpmResolverCreateOptions;
use crate::npm::CliManagedInNpmPkgCheckerCreateOptions;
use crate::npm::CliManagedNpmResolverCreateOptions;
use crate::npm::CliNpmResolver;
use crate::npm::CliNpmResolverCreateOptions;
use crate::npm::CliNpmResolverManagedCreateOptions;
use crate::npm::CliNpmResolverManagedSnapshotOption;
use crate::npm::CreateInNpmPkgCheckerOptions;
use crate::npm::ManagedCliNpmResolver;
use crate::resolver::CjsResolutionStore;
use crate::resolver::CjsTracker;
use crate::resolver::CjsTrackerOptions;
use crate::resolver::CliDenoResolverFs;
use crate::resolver::CliGraphResolver;
use crate::resolver::CliGraphResolverOptions;
use crate::resolver::CliNodeResolver;
use crate::resolver::WorkerCliNpmGraphResolver;
use crate::tsc::into_specifier_and_media_type;
use crate::util::progress_bar::ProgressBar;
use crate::util::progress_bar::ProgressBarStyle;
#[derive(Debug, Clone)]
struct LspScopeResolver {
cjs_tracker: Option<Arc<LspCjsTracker>>,
graph_resolver: Arc<CliGraphResolver>,
jsr_resolver: Option<Arc<JsrCacheResolver>>,
npm_resolver: Option<Arc<dyn CliNpmResolver>>,
node_resolver: Option<Arc<CliNodeResolver>>,
pkg_json_resolver: Option<Arc<PackageJsonResolver>>,
redirect_resolver: Option<Arc<RedirectResolver>>,
graph_imports: Arc<IndexMap<ModuleSpecifier, GraphImport>>,
package_json_deps_by_resolution: Arc<IndexMap<ModuleSpecifier, String>>,
config_data: Option<Arc<ConfigData>>,
}
impl Default for LspScopeResolver {
fn default() -> Self {
Self {
cjs_tracker: None,
graph_resolver: create_graph_resolver(None, None, None),
jsr_resolver: None,
npm_resolver: None,
node_resolver: None,
pkg_json_resolver: None,
redirect_resolver: None,
graph_imports: Default::default(),
package_json_deps_by_resolution: Default::default(),
config_data: None,
}
}
@ -90,14 +103,35 @@ impl LspScopeResolver {
) -> Self {
let mut npm_resolver = None;
let mut node_resolver = None;
let mut lsp_cjs_tracker = None;
let fs = Arc::new(deno_fs::RealFs);
let pkg_json_resolver = Arc::new(PackageJsonResolver::new(
deno_runtime::deno_node::DenoFsNodeResolverEnv::new(fs.clone()),
));
if let Some(http_client) = http_client_provider {
npm_resolver = create_npm_resolver(
config_data.map(|d| d.as_ref()),
cache,
http_client,
&pkg_json_resolver,
)
.await;
node_resolver = create_node_resolver(npm_resolver.as_ref());
if let Some(npm_resolver) = &npm_resolver {
let in_npm_pkg_checker = create_in_npm_pkg_checker(npm_resolver);
let cjs_tracker = create_cjs_tracker(
in_npm_pkg_checker.clone(),
pkg_json_resolver.clone(),
);
lsp_cjs_tracker =
Some(Arc::new(LspCjsTracker::new(cjs_tracker.clone())));
node_resolver = Some(create_node_resolver(
cjs_tracker,
fs.clone(),
in_npm_pkg_checker,
npm_resolver,
pkg_json_resolver.clone(),
));
}
}
let graph_resolver = create_graph_resolver(
config_data.map(|d| d.as_ref()),
@ -133,13 +167,43 @@ impl LspScopeResolver {
)
})
.unwrap_or_default();
let package_json_deps_by_resolution = (|| {
let node_resolver = node_resolver.as_ref()?;
let package_json = config_data?.maybe_pkg_json()?;
let referrer = package_json.specifier();
let dependencies = package_json.dependencies.as_ref()?;
let result = dependencies
.iter()
.flat_map(|(name, _)| {
let req_ref =
NpmPackageReqReference::from_str(&format!("npm:{name}")).ok()?;
let specifier = into_specifier_and_media_type(Some(
node_resolver
.resolve_req_reference(
&req_ref,
&referrer,
NodeResolutionMode::Types,
)
.ok()?,
))
.0;
Some((specifier, name.clone()))
})
.collect();
Some(result)
})();
let package_json_deps_by_resolution =
Arc::new(package_json_deps_by_resolution.unwrap_or_default());
Self {
cjs_tracker: lsp_cjs_tracker,
graph_resolver,
jsr_resolver,
npm_resolver,
node_resolver,
pkg_json_resolver: Some(pkg_json_resolver),
redirect_resolver,
graph_imports,
package_json_deps_by_resolution,
config_data: config_data.cloned(),
}
}
@ -147,19 +211,44 @@ impl LspScopeResolver {
fn snapshot(&self) -> Arc<Self> {
let npm_resolver =
self.npm_resolver.as_ref().map(|r| r.clone_snapshotted());
let node_resolver = create_node_resolver(npm_resolver.as_ref());
let fs = Arc::new(deno_fs::RealFs);
let pkg_json_resolver = Arc::new(PackageJsonResolver::new(
deno_runtime::deno_node::DenoFsNodeResolverEnv::new(fs.clone()),
));
let mut node_resolver = None;
let mut lsp_cjs_tracker = None;
if let Some(npm_resolver) = &npm_resolver {
let in_npm_pkg_checker = create_in_npm_pkg_checker(npm_resolver);
let cjs_tracker = create_cjs_tracker(
in_npm_pkg_checker.clone(),
pkg_json_resolver.clone(),
);
lsp_cjs_tracker = Some(Arc::new(LspCjsTracker::new(cjs_tracker.clone())));
node_resolver = Some(create_node_resolver(
cjs_tracker,
fs,
in_npm_pkg_checker,
npm_resolver,
pkg_json_resolver.clone(),
));
}
let graph_resolver = create_graph_resolver(
self.config_data.as_deref(),
npm_resolver.as_ref(),
node_resolver.as_ref(),
);
Arc::new(Self {
cjs_tracker: lsp_cjs_tracker,
graph_resolver,
jsr_resolver: self.jsr_resolver.clone(),
npm_resolver,
node_resolver,
redirect_resolver: self.redirect_resolver.clone(),
pkg_json_resolver: Some(pkg_json_resolver),
graph_imports: self.graph_imports.clone(),
package_json_deps_by_resolution: self
.package_json_deps_by_resolution
.clone(),
config_data: self.config_data.clone(),
})
}
@ -261,6 +350,22 @@ impl LspResolver {
resolver.graph_resolver.create_graph_npm_resolver()
}
pub fn maybe_cjs_tracker(
&self,
file_referrer: Option<&ModuleSpecifier>,
) -> Option<&Arc<LspCjsTracker>> {
let resolver = self.get_scope_resolver(file_referrer);
resolver.cjs_tracker.as_ref()
}
pub fn maybe_node_resolver(
&self,
file_referrer: Option<&ModuleSpecifier>,
) -> Option<&Arc<CliNodeResolver>> {
let resolver = self.get_scope_resolver(file_referrer);
resolver.node_resolver.as_ref()
}
pub fn maybe_managed_npm_resolver(
&self,
file_referrer: Option<&ModuleSpecifier>,
@ -328,13 +433,25 @@ impl LspResolver {
) -> Option<(ModuleSpecifier, MediaType)> {
let resolver = self.get_scope_resolver(file_referrer);
let node_resolver = resolver.node_resolver.as_ref()?;
Some(NodeResolution::into_specifier_and_media_type(Some(
Some(into_specifier_and_media_type(Some(
node_resolver
.resolve_req_reference(req_ref, referrer, NodeResolutionMode::Types)
.ok()?,
)))
}
pub fn file_url_to_package_json_dep(
&self,
specifier: &ModuleSpecifier,
file_referrer: Option<&ModuleSpecifier>,
) -> Option<String> {
let resolver = self.get_scope_resolver(file_referrer);
resolver
.package_json_deps_by_resolution
.get(specifier)
.cloned()
}
pub fn in_node_modules(&self, specifier: &ModuleSpecifier) -> bool {
fn has_node_modules_dir(specifier: &ModuleSpecifier) -> bool {
// consider any /node_modules/ directory as being in the node_modules
@ -346,14 +463,10 @@ impl LspResolver {
.contains("/node_modules/")
}
let global_npm_resolver = self
.get_scope_resolver(Some(specifier))
.npm_resolver
.as_ref()
.and_then(|npm_resolver| npm_resolver.as_managed())
.filter(|r| r.root_node_modules_path().is_none());
if let Some(npm_resolver) = &global_npm_resolver {
if npm_resolver.in_npm_package(specifier) {
if let Some(node_resolver) =
&self.get_scope_resolver(Some(specifier)).node_resolver
{
if node_resolver.in_npm_package(specifier) {
return true;
}
}
@ -361,18 +474,6 @@ impl LspResolver {
has_node_modules_dir(specifier)
}
pub fn node_media_type(
&self,
specifier: &ModuleSpecifier,
) -> Option<MediaType> {
let resolver = self.get_scope_resolver(Some(specifier));
let node_resolver = resolver.node_resolver.as_ref()?;
let resolution = node_resolver
.url_to_node_resolution(specifier.clone())
.ok()?;
Some(NodeResolution::into_specifier_and_media_type(Some(resolution)).1)
}
pub fn is_bare_package_json_dep(
&self,
specifier_text: &str,
@ -398,10 +499,10 @@ impl LspResolver {
referrer: &ModuleSpecifier,
) -> Result<Option<Arc<PackageJson>>, ClosestPkgJsonError> {
let resolver = self.get_scope_resolver(Some(referrer));
let Some(node_resolver) = resolver.node_resolver.as_ref() else {
let Some(pkg_json_resolver) = resolver.pkg_json_resolver.as_ref() else {
return Ok(None);
};
node_resolver.get_closest_package_json(referrer)
pkg_json_resolver.get_closest_package_json(referrer)
}
pub fn resolve_redirects(
@ -457,11 +558,13 @@ async fn create_npm_resolver(
config_data: Option<&ConfigData>,
cache: &LspCache,
http_client_provider: &Arc<HttpClientProvider>,
pkg_json_resolver: &Arc<PackageJsonResolver>,
) -> Option<Arc<dyn CliNpmResolver>> {
let enable_byonm = config_data.map(|d| d.byonm).unwrap_or(false);
let options = if enable_byonm {
CliNpmResolverCreateOptions::Byonm(CliByonmNpmResolverCreateOptions {
fs: CliDenoResolverFs(Arc::new(deno_fs::RealFs)),
pkg_json_resolver: pkg_json_resolver.clone(),
root_node_modules_dir: config_data.and_then(|config_data| {
config_data.node_modules_dir.clone().or_else(|| {
url_to_file_path(&config_data.scope)
@ -471,7 +574,15 @@ async fn create_npm_resolver(
}),
})
} else {
CliNpmResolverCreateOptions::Managed(CliNpmResolverManagedCreateOptions {
let npmrc = config_data
.and_then(|d| d.npmrc.clone())
.unwrap_or_else(create_default_npmrc);
let npm_cache_dir = Arc::new(NpmCacheDir::new(
&DenoCacheEnvFsAdapter(&deno_fs::RealFs),
cache.deno_dir().npm_folder_path(),
npmrc.get_all_known_registries_urls(),
));
CliNpmResolverCreateOptions::Managed(CliManagedNpmResolverCreateOptions {
http_client_provider: http_client_provider.clone(),
snapshot: match config_data.and_then(|d| d.lockfile.as_ref()) {
Some(lockfile) => {
@ -485,7 +596,7 @@ async fn create_npm_resolver(
// updating it. Only the cache request should update the lockfile.
maybe_lockfile: None,
fs: Arc::new(deno_fs::RealFs),
npm_global_cache_dir: cache.deno_dir().npm_folder_path(),
npm_cache_dir,
// Use an "only" cache setting in order to make the
// user do an explicit "cache" command and prevent
// the cache from being filled with lots of packages while
@ -496,9 +607,7 @@ async fn create_npm_resolver(
.and_then(|d| d.node_modules_dir.clone()),
// only used for top level install, so we can ignore this
npm_install_deps_provider: Arc::new(NpmInstallDepsProvider::empty()),
npmrc: config_data
.and_then(|d| d.npmrc.clone())
.unwrap_or_else(create_default_npmrc),
npmrc,
npm_system_info: NpmSystemInfo::default(),
lifecycle_scripts: Default::default(),
})
@ -506,28 +615,59 @@ async fn create_npm_resolver(
Some(create_cli_npm_resolver_for_lsp(options).await)
}
fn create_cjs_tracker(
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
pkg_json_resolver: Arc<PackageJsonResolver>,
) -> Arc<CjsTracker> {
Arc::new(CjsTracker::new(
in_npm_pkg_checker,
pkg_json_resolver,
CjsTrackerOptions {
// todo(dsherret): support in the lsp by stabilizing the feature
// so that we don't have to pipe the config in here
unstable_detect_cjs: false,
},
))
}
fn create_in_npm_pkg_checker(
npm_resolver: &Arc<dyn CliNpmResolver>,
) -> Arc<dyn InNpmPackageChecker> {
crate::npm::create_in_npm_pkg_checker(match npm_resolver.as_inner() {
crate::npm::InnerCliNpmResolverRef::Byonm(_) => {
CreateInNpmPkgCheckerOptions::Byonm
}
crate::npm::InnerCliNpmResolverRef::Managed(m) => {
CreateInNpmPkgCheckerOptions::Managed(
CliManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: m.global_cache_root_url(),
maybe_node_modules_path: m.maybe_node_modules_path(),
},
)
}
})
}
fn create_node_resolver(
npm_resolver: Option<&Arc<dyn CliNpmResolver>>,
) -> Option<Arc<CliNodeResolver>> {
use once_cell::sync::Lazy;
// it's not ideal to share this across all scopes and to
// never clear it, but it's fine for the time being
static CJS_RESOLUTIONS: Lazy<Arc<CjsResolutionStore>> =
Lazy::new(Default::default);
let npm_resolver = npm_resolver?;
let fs = Arc::new(deno_fs::RealFs);
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
npm_resolver: &Arc<dyn CliNpmResolver>,
pkg_json_resolver: Arc<PackageJsonResolver>,
) -> Arc<CliNodeResolver> {
let node_resolver_inner = Arc::new(NodeResolver::new(
deno_runtime::deno_node::DenoFsNodeResolverEnv::new(fs.clone()),
in_npm_pkg_checker.clone(),
npm_resolver.clone().into_npm_resolver(),
pkg_json_resolver.clone(),
));
Some(Arc::new(CliNodeResolver::new(
CJS_RESOLUTIONS.clone(),
Arc::new(CliNodeResolver::new(
cjs_tracker.clone(),
fs,
in_npm_pkg_checker,
node_resolver_inner,
npm_resolver.clone(),
)))
))
}
fn create_graph_resolver(
@ -555,8 +695,8 @@ fn create_graph_resolver(
workspace.to_maybe_jsx_import_source_config().ok().flatten()
}),
maybe_vendor_dir: config_data.and_then(|d| d.vendor_dir.as_ref()),
bare_node_builtins_enabled: workspace
.is_some_and(|workspace| workspace.has_unstable("bare-node-builtins")),
bare_node_builtins_enabled: config_data
.is_some_and(|d| d.unstable.contains("bare-node-builtins")),
sloppy_imports_resolver: config_data
.and_then(|d| d.sloppy_imports_resolver.clone()),
}))
@ -702,6 +842,45 @@ impl RedirectResolver {
}
}
#[derive(Debug)]
pub struct LspCjsTracker {
cjs_tracker: Arc<CjsTracker>,
}
impl LspCjsTracker {
pub fn new(cjs_tracker: Arc<CjsTracker>) -> Self {
Self { cjs_tracker }
}
pub fn is_cjs(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
maybe_parsed_source: Option<&ParsedSource>,
) -> bool {
if let Some(module_kind) =
self.cjs_tracker.get_known_kind(specifier, media_type)
{
module_kind.is_cjs()
} else {
let maybe_is_script = maybe_parsed_source.map(|p| p.compute_is_script());
maybe_is_script
.and_then(|is_script| {
self
.cjs_tracker
.is_cjs_with_known_is_script(specifier, media_type, is_script)
.ok()
})
.unwrap_or_else(|| {
self
.cjs_tracker
.is_maybe_cjs(specifier, media_type)
.unwrap_or(false)
})
}
}
}
#[cfg(test)]
mod tests {
use super::*;


@ -650,7 +650,7 @@ pub mod tests {
.unwrap();
let text_info = parsed_module.text_info_lazy().clone();
let mut collector = TestCollector::new(specifier, text_info);
parsed_module.module().visit_with(&mut collector);
parsed_module.program().visit_with(&mut collector);
collector.take()
}


@ -236,7 +236,7 @@ pub struct TsServer {
performance: Arc<Performance>,
sender: mpsc::UnboundedSender<Request>,
receiver: Mutex<Option<mpsc::UnboundedReceiver<Request>>>,
specifier_map: Arc<TscSpecifierMap>,
pub specifier_map: Arc<TscSpecifierMap>,
inspector_server: Mutex<Option<Arc<InspectorServer>>>,
pending_change: Mutex<Option<PendingChange>>,
}
@ -882,20 +882,22 @@ impl TsServer {
options: GetCompletionsAtPositionOptions,
format_code_settings: FormatCodeSettings,
scope: Option<ModuleSpecifier>,
) -> Option<CompletionInfo> {
) -> Result<Option<CompletionInfo>, AnyError> {
let req = TscRequest::GetCompletionsAtPosition(Box::new((
self.specifier_map.denormalize(&specifier),
position,
options,
format_code_settings,
)));
match self.request(snapshot, req, scope).await {
Ok(maybe_info) => maybe_info,
Err(err) => {
log::error!("Unable to get completion info from TypeScript: {:#}", err);
None
}
}
self
.request::<Option<CompletionInfo>>(snapshot, req, scope)
.await
.map(|mut info| {
if let Some(info) = &mut info {
info.normalize(&self.specifier_map);
}
info
})
}
pub async fn get_completion_details(
@ -3642,6 +3644,12 @@ pub struct CompletionInfo {
}
impl CompletionInfo {
fn normalize(&mut self, specifier_map: &TscSpecifierMap) {
for entry in &mut self.entries {
entry.normalize(specifier_map);
}
}
pub fn as_completion_response(
&self,
line_index: Arc<LineIndex>,
@ -3703,11 +3711,17 @@ pub struct CompletionItemData {
#[derive(Debug, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
struct CompletionEntryDataImport {
struct CompletionEntryDataAutoImport {
module_specifier: String,
file_name: String,
}
#[derive(Debug)]
pub struct CompletionNormalizedAutoImportData {
raw: CompletionEntryDataAutoImport,
normalized: ModuleSpecifier,
}
#[derive(Debug, Default, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct CompletionEntry {
@ -3740,9 +3754,28 @@ pub struct CompletionEntry {
is_import_statement_completion: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
data: Option<Value>,
/// This is not from tsc, we add it for convenience during normalization.
/// Represents `self.data.file_name`, but normalized.
#[serde(skip)]
auto_import_data: Option<CompletionNormalizedAutoImportData>,
}
impl CompletionEntry {
fn normalize(&mut self, specifier_map: &TscSpecifierMap) {
let Some(data) = &self.data else {
return;
};
let Ok(raw) =
serde_json::from_value::<CompletionEntryDataAutoImport>(data.clone())
else {
return;
};
if let Ok(normalized) = specifier_map.normalize(&raw.file_name) {
self.auto_import_data =
Some(CompletionNormalizedAutoImportData { raw, normalized });
}
}
fn get_commit_characters(
&self,
info: &CompletionInfo,
@ -3891,25 +3924,24 @@ impl CompletionEntry {
if let Some(source) = &self.source {
let mut display_source = source.clone();
if let Some(data) = &self.data {
if let Ok(import_data) =
serde_json::from_value::<CompletionEntryDataImport>(data.clone())
if let Some(import_data) = &self.auto_import_data {
if let Some(new_module_specifier) = language_server
.get_ts_response_import_mapper(specifier)
.check_specifier(&import_data.normalized, specifier)
.or_else(|| relative_specifier(specifier, &import_data.normalized))
{
if let Ok(import_specifier) = resolve_url(&import_data.file_name) {
if let Some(new_module_specifier) = language_server
.get_ts_response_import_mapper(specifier)
.check_specifier(&import_specifier, specifier)
.or_else(|| relative_specifier(specifier, &import_specifier))
{
display_source.clone_from(&new_module_specifier);
if new_module_specifier != import_data.module_specifier {
specifier_rewrite =
Some((import_data.module_specifier, new_module_specifier));
}
} else if source.starts_with(jsr_url().as_str()) {
return None;
}
if new_module_specifier.contains("/node_modules/") {
return None;
}
display_source.clone_from(&new_module_specifier);
if new_module_specifier != import_data.raw.module_specifier {
specifier_rewrite = Some((
import_data.raw.module_specifier.clone(),
new_module_specifier,
));
}
} else if source.starts_with(jsr_url().as_str()) {
return None;
}
}
// We want relative or bare (import-mapped or otherwise) specifiers to
@ -4212,6 +4244,13 @@ impl TscSpecifierMap {
return specifier.to_string();
}
let mut specifier = original.to_string();
if specifier.contains("/node_modules/.deno/")
&& !specifier.contains("/node_modules/@types/node/")
{
// The ts server doesn't give completions from files in
// `node_modules/.deno/`. We work around it like this.
specifier = specifier.replace("/node_modules/", "/$node_modules/");
}
let media_type = MediaType::from_specifier(original);
// If the URL-inferred media type doesn't correspond to tsc's path-inferred
// media type, force it to be the same by appending an extension.
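
The comment in this hunk explains the workaround: specifiers under `node_modules/.deno/` are rewritten before being handed to the TypeScript server, which otherwise refuses to offer completions from those files, and mapped back afterwards. A rough standalone sketch of the idea follows, not part of the diff; the helper names are hypothetical, and the reverse mapping in `normalize_from_tsc` is an assumption about what `TscSpecifierMap::normalize` undoes:

```rust
/// Mirrors the rewrite shown above: hide `node_modules/.deno/` paths from
/// tsc by renaming the directory segment (Node type stubs stay untouched).
fn denormalize_for_tsc(specifier: &str) -> String {
    if specifier.contains("/node_modules/.deno/")
        && !specifier.contains("/node_modules/@types/node/")
    {
        specifier.replace("/node_modules/", "/$node_modules/")
    } else {
        specifier.to_string()
    }
}

/// Assumed inverse: restore the original path when tsc hands the
/// specifier back (e.g. in completion entries).
fn normalize_from_tsc(specifier: &str) -> String {
    specifier.replace("/$node_modules/", "/node_modules/")
}

fn main() {
    let original =
        "file:///app/node_modules/.deno/chalk@5.0.0/node_modules/chalk/index.d.ts";
    let munged = denormalize_for_tsc(original);
    assert!(munged.contains("/$node_modules/"));
    assert_eq!(normalize_from_tsc(&munged), original);
}
```
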
@ -4329,7 +4368,7 @@ fn op_is_cancelled(state: &mut OpState) -> bool {
fn op_is_node_file(state: &mut OpState, #[string] path: String) -> bool {
let state = state.borrow::<State>();
let mark = state.performance.mark("tsc.op.op_is_node_file");
let r = match ModuleSpecifier::parse(&path) {
let r = match state.specifier_map.normalize(path) {
Ok(specifier) => state.state_snapshot.resolver.in_node_modules(&specifier),
Err(_) => false,
};
@ -4362,14 +4401,25 @@ fn op_load<'s>(
None
} else {
let asset_or_document = state.get_asset_or_document(&specifier);
asset_or_document.map(|doc| LoadResponse {
data: doc.text(),
script_kind: crate::tsc::as_ts_script_kind(doc.media_type()),
version: state.script_version(&specifier),
is_cjs: matches!(
doc.media_type(),
MediaType::Cjs | MediaType::Cts | MediaType::Dcts
),
asset_or_document.map(|doc| {
let maybe_cjs_tracker = state
.state_snapshot
.resolver
.maybe_cjs_tracker(Some(&specifier));
LoadResponse {
data: doc.text(),
script_kind: crate::tsc::as_ts_script_kind(doc.media_type()),
version: state.script_version(&specifier),
is_cjs: maybe_cjs_tracker
.map(|t| {
t.is_cjs(
&specifier,
doc.media_type(),
doc.maybe_parsed_source().and_then(|p| p.as_ref().ok()),
)
})
.unwrap_or(false),
}
})
};
@ -4598,7 +4648,10 @@ fn op_script_names(state: &mut OpState) -> ScriptNames {
for doc in &docs {
let specifier = doc.specifier();
let is_open = doc.is_open();
if is_open || specifier.scheme() == "file" {
if is_open
|| (specifier.scheme() == "file"
&& !state.state_snapshot.resolver.in_node_modules(specifier))
{
let script_names = doc
.scope()
.and_then(|s| result.by_scope.get_mut(s))
@ -6024,6 +6077,7 @@ mod tests {
Some(temp_dir.url()),
)
.await
.unwrap()
.unwrap();
assert_eq!(info.entries.len(), 22);
let details = ts_server
@ -6183,6 +6237,7 @@ mod tests {
Some(temp_dir.url()),
)
.await
.unwrap()
.unwrap();
let entry = info
.entries


@ -135,7 +135,7 @@ async fn run_subcommand(flags: Arc<Flags>) -> Result<i32, AnyError> {
tools::compile::compile(flags, compile_flags).await
}),
DenoSubcommand::Coverage(coverage_flags) => spawn_subcommand(async {
tools::coverage::cover_files(flags, coverage_flags).await
tools::coverage::cover_files(flags, coverage_flags)
}),
DenoSubcommand::Fmt(fmt_flags) => {
spawn_subcommand(


@ -2,6 +2,7 @@
use std::borrow::Cow;
use std::cell::RefCell;
use std::path::Path;
use std::path::PathBuf;
use std::pin::Pin;
use std::rc::Rc;
@ -23,19 +24,23 @@ use crate::graph_container::ModuleGraphUpdatePermit;
use crate::graph_util::CreateGraphOptions;
use crate::graph_util::ModuleGraphBuilder;
use crate::node;
use crate::node::CliNodeCodeTranslator;
use crate::npm::CliNpmResolver;
use crate::resolver::CjsTracker;
use crate::resolver::CliGraphResolver;
use crate::resolver::CliNodeResolver;
use crate::resolver::ModuleCodeStringSource;
use crate::resolver::NotSupportedKindInNpmError;
use crate::resolver::NpmModuleLoader;
use crate::tools::check;
use crate::tools::check::TypeChecker;
use crate::util::progress_bar::ProgressBar;
use crate::util::text_encoding::code_without_source_map;
use crate::util::text_encoding::source_map_from_code;
use crate::worker::ModuleLoaderAndSourceMapGetter;
use crate::worker::CreateModuleLoaderResult;
use crate::worker::ModuleLoaderFactory;
use deno_ast::MediaType;
use deno_ast::ModuleKind;
use deno_core::anyhow::anyhow;
use deno_core::anyhow::bail;
use deno_core::anyhow::Context;
@ -63,9 +68,12 @@ use deno_graph::Module;
use deno_graph::ModuleGraph;
use deno_graph::Resolution;
use deno_runtime::code_cache;
use deno_runtime::deno_fs::FileSystem;
use deno_runtime::deno_node::create_host_defined_options;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_semver::npm::NpmPackageReqReference;
use node_resolver::InNpmPackageChecker;
use node_resolver::NodeResolutionMode;
pub struct ModuleLoadPreparer {
@ -198,11 +206,16 @@ struct SharedCliModuleLoaderState {
lib_worker: TsTypeLib,
initial_cwd: PathBuf,
is_inspecting: bool,
is_npm_main: bool,
is_repl: bool,
cjs_tracker: Arc<CjsTracker>,
code_cache: Option<Arc<CodeCache>>,
emitter: Arc<Emitter>,
fs: Arc<dyn FileSystem>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
main_module_graph_container: Arc<MainModuleGraphContainer>,
module_load_preparer: Arc<ModuleLoadPreparer>,
node_code_translator: Arc<CliNodeCodeTranslator>,
node_resolver: Arc<CliNodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_module_loader: NpmModuleLoader,
@ -218,10 +231,14 @@ impl CliModuleLoaderFactory {
#[allow(clippy::too_many_arguments)]
pub fn new(
options: &CliOptions,
cjs_tracker: Arc<CjsTracker>,
code_cache: Option<Arc<CodeCache>>,
emitter: Arc<Emitter>,
fs: Arc<dyn FileSystem>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
main_module_graph_container: Arc<MainModuleGraphContainer>,
module_load_preparer: Arc<ModuleLoadPreparer>,
node_code_translator: Arc<CliNodeCodeTranslator>,
node_resolver: Arc<CliNodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
npm_module_loader: NpmModuleLoader,
@ -235,14 +252,19 @@ impl CliModuleLoaderFactory {
lib_worker: options.ts_type_lib_worker(),
initial_cwd: options.initial_cwd().to_path_buf(),
is_inspecting: options.is_inspecting(),
is_npm_main: options.is_npm_main(),
is_repl: matches!(
options.sub_command(),
DenoSubcommand::Repl(_) | DenoSubcommand::Jupyter(_)
),
cjs_tracker,
code_cache,
emitter,
fs,
in_npm_pkg_checker,
main_module_graph_container,
module_load_preparer,
node_code_translator,
node_resolver,
npm_resolver,
npm_module_loader,
@ -259,19 +281,30 @@ impl CliModuleLoaderFactory {
is_worker: bool,
parent_permissions: PermissionsContainer,
permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter {
let loader = Rc::new(CliModuleLoader(Rc::new(CliModuleLoaderInner {
lib,
is_worker,
parent_permissions,
permissions,
) -> CreateModuleLoaderResult {
let module_loader =
Rc::new(CliModuleLoader(Rc::new(CliModuleLoaderInner {
lib,
is_worker,
is_npm_main: self.shared.is_npm_main,
parent_permissions,
permissions,
graph_container: graph_container.clone(),
node_code_translator: self.shared.node_code_translator.clone(),
emitter: self.shared.emitter.clone(),
parsed_source_cache: self.shared.parsed_source_cache.clone(),
shared: self.shared.clone(),
})));
let node_require_loader = Rc::new(CliNodeRequireLoader::new(
self.shared.emitter.clone(),
self.shared.fs.clone(),
graph_container,
emitter: self.shared.emitter.clone(),
parsed_source_cache: self.shared.parsed_source_cache.clone(),
shared: self.shared.clone(),
})));
ModuleLoaderAndSourceMapGetter {
module_loader: loader,
self.shared.in_npm_pkg_checker.clone(),
self.shared.npm_resolver.clone(),
));
CreateModuleLoaderResult {
module_loader,
node_require_loader,
}
}
}
@ -280,7 +313,7 @@ impl ModuleLoaderFactory for CliModuleLoaderFactory {
fn create_for_main(
&self,
root_permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter {
) -> CreateModuleLoaderResult {
self.create_with_lib(
(*self.shared.main_module_graph_container).clone(),
self.shared.lib_window,
@ -294,7 +327,7 @@ impl ModuleLoaderFactory for CliModuleLoaderFactory {
&self,
parent_permissions: PermissionsContainer,
permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter {
) -> CreateModuleLoaderResult {
self.create_with_lib(
// create a fresh module graph for the worker
WorkerModuleGraphContainer::new(Arc::new(ModuleGraph::new(
@ -310,6 +343,7 @@ impl ModuleLoaderFactory for CliModuleLoaderFactory {
struct CliModuleLoaderInner<TGraphContainer: ModuleGraphContainer> {
lib: TsTypeLib,
is_npm_main: bool,
is_worker: bool,
/// The initial set of permissions used to resolve the static imports in the
/// worker. These are "allow all" for main worker, and parent thread
@ -318,6 +352,7 @@ struct CliModuleLoaderInner<TGraphContainer: ModuleGraphContainer> {
permissions: PermissionsContainer,
shared: Arc<SharedCliModuleLoaderState>,
emitter: Arc<Emitter>,
node_code_translator: Arc<CliNodeCodeTranslator>,
parsed_source_cache: Arc<ParsedSourceCache>,
graph_container: TGraphContainer,
}
@ -331,24 +366,7 @@ impl<TGraphContainer: ModuleGraphContainer>
maybe_referrer: Option<&ModuleSpecifier>,
requested_module_type: RequestedModuleType,
) -> Result<ModuleSource, AnyError> {
let code_source = match self.load_prepared_module(specifier).await? {
Some(code_source) => code_source,
None => {
if self.shared.npm_module_loader.if_in_npm_package(specifier) {
self
.shared
.npm_module_loader
.load(specifier, maybe_referrer)
.await?
} else {
let mut msg = format!("Loading unprepared module: {specifier}");
if let Some(referrer) = maybe_referrer {
msg = format!("{}, imported from: {}", msg, referrer.as_str());
}
return Err(anyhow!(msg));
}
}
};
let code_source = self.load_code_source(specifier, maybe_referrer).await?;
let code = if self.shared.is_inspecting {
// we need the code with the source map in order for
// it to work with --inspect or --inspect-brk
@ -402,6 +420,29 @@ impl<TGraphContainer: ModuleGraphContainer>
))
}
async fn load_code_source(
&self,
specifier: &ModuleSpecifier,
maybe_referrer: Option<&ModuleSpecifier>,
) -> Result<ModuleCodeStringSource, AnyError> {
if let Some(code_source) = self.load_prepared_module(specifier).await? {
return Ok(code_source);
}
if self.shared.node_resolver.in_npm_package(specifier) {
return self
.shared
.npm_module_loader
.load(specifier, maybe_referrer)
.await;
}
let mut msg = format!("Loading unprepared module: {specifier}");
if let Some(referrer) = maybe_referrer {
msg = format!("{}, imported from: {}", msg, referrer.as_str());
}
Err(anyhow!(msg))
}
fn resolve_referrer(
&self,
referrer: &str,
@ -474,15 +515,11 @@ impl<TGraphContainer: ModuleGraphContainer>
if self.shared.is_repl {
if let Ok(reference) = NpmPackageReqReference::from_specifier(&specifier)
{
return self
.shared
.node_resolver
.resolve_req_reference(
&reference,
referrer,
NodeResolutionMode::Execution,
)
.map(|res| res.into_url());
return self.shared.node_resolver.resolve_req_reference(
&reference,
referrer,
NodeResolutionMode::Execution,
);
}
}
@ -506,13 +543,15 @@ impl<TGraphContainer: ModuleGraphContainer>
.with_context(|| {
format!("Could not resolve '{}'.", module.nv_reference)
})?
.into_url()
}
Some(Module::Node(module)) => module.specifier.clone(),
Some(Module::Js(module)) => module.specifier.clone(),
Some(Module::Json(module)) => module.specifier.clone(),
Some(Module::External(module)) => {
node::resolve_specifier_into_node_modules(&module.specifier)
node::resolve_specifier_into_node_modules(
&module.specifier,
self.shared.fs.as_ref(),
)
}
None => specifier.into_owned(),
};
@ -534,7 +573,7 @@ impl<TGraphContainer: ModuleGraphContainer>
}) => {
let transpile_result = self
.emitter
.emit_parsed_source(specifier, media_type, source)
.emit_parsed_source(specifier, media_type, ModuleKind::Esm, source)
.await?;
// at this point, we no longer need the parsed source in memory, so free it
@ -547,11 +586,19 @@ impl<TGraphContainer: ModuleGraphContainer>
media_type,
}))
}
Some(CodeOrDeferredEmit::Cjs {
specifier,
media_type,
source,
}) => self
.load_maybe_cjs(specifier, media_type, source)
.await
.map(Some),
None => Ok(None),
}
}
fn load_prepared_module_sync(
fn load_prepared_module_for_source_map_sync(
&self,
specifier: &ModuleSpecifier,
) -> Result<Option<ModuleCodeStringSource>, AnyError> {
@ -564,9 +611,12 @@ impl<TGraphContainer: ModuleGraphContainer>
media_type,
source,
}) => {
let transpile_result = self
.emitter
.emit_parsed_source_sync(specifier, media_type, source)?;
let transpile_result = self.emitter.emit_parsed_source_sync(
specifier,
media_type,
ModuleKind::Esm,
source,
)?;
// at this point, we no longer need the parsed source in memory, so free it
self.parsed_source_cache.free(specifier);
@ -578,6 +628,14 @@ impl<TGraphContainer: ModuleGraphContainer>
media_type,
}))
}
Some(CodeOrDeferredEmit::Cjs { .. }) => {
self.parsed_source_cache.free(specifier);
// todo(dsherret): to make this work, we should probably just
// rely on the CJS export cache. At the moment this is hard because
// cjs export analysis is only async
Ok(None)
}
None => Ok(None),
}
}
@ -607,20 +665,40 @@ impl<TGraphContainer: ModuleGraphContainer>
source,
media_type,
specifier,
is_script,
..
})) => {
// todo(dsherret): revert in https://github.com/denoland/deno/pull/26439
if self.is_npm_main && *is_script
|| self.shared.cjs_tracker.is_cjs_with_known_is_script(
specifier,
*media_type,
*is_script,
)?
{
return Ok(Some(CodeOrDeferredEmit::Cjs {
specifier,
media_type: *media_type,
source,
}));
}
let code: ModuleCodeString = match media_type {
MediaType::JavaScript
| MediaType::Unknown
| MediaType::Cjs
| MediaType::Mjs
| MediaType::Json => source.clone().into(),
MediaType::Dts | MediaType::Dcts | MediaType::Dmts => {
Default::default()
}
MediaType::Cjs | MediaType::Cts => {
return Ok(Some(CodeOrDeferredEmit::Cjs {
specifier,
media_type: *media_type,
source,
}));
}
MediaType::TypeScript
| MediaType::Mts
| MediaType::Cts
| MediaType::Jsx
| MediaType::Tsx => {
return Ok(Some(CodeOrDeferredEmit::DeferredEmit {
@ -629,7 +707,7 @@ impl<TGraphContainer: ModuleGraphContainer>
source,
}));
}
MediaType::TsBuildInfo | MediaType::Wasm | MediaType::SourceMap => {
MediaType::Css | MediaType::Wasm | MediaType::SourceMap => {
panic!("Unexpected media type {media_type} for {specifier}")
}
};
@ -651,6 +729,48 @@ impl<TGraphContainer: ModuleGraphContainer>
| None => Ok(None),
}
}
async fn load_maybe_cjs(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
original_source: &Arc<str>,
) -> Result<ModuleCodeStringSource, AnyError> {
let js_source = if media_type.is_emittable() {
Cow::Owned(
self
.emitter
.emit_parsed_source(
specifier,
media_type,
ModuleKind::Cjs,
original_source,
)
.await?,
)
} else {
Cow::Borrowed(original_source.as_ref())
};
let text = self
.node_code_translator
.translate_cjs_to_esm(specifier, Some(js_source))
.await?;
// at this point, we no longer need the parsed source in memory, so free it
self.parsed_source_cache.free(specifier);
Ok(ModuleCodeStringSource {
code: match text {
// perf: if the text is borrowed, that means it didn't make any changes
// to the original source, so we can just provide that instead of cloning
// the borrowed text
Cow::Borrowed(_) => {
ModuleSourceCode::String(original_source.clone().into())
}
Cow::Owned(text) => ModuleSourceCode::String(text.into()),
},
found_url: specifier.clone(),
media_type,
})
}
}
enum CodeOrDeferredEmit<'a> {
@ -660,6 +780,11 @@ enum CodeOrDeferredEmit<'a> {
media_type: MediaType,
source: &'a Arc<str>,
},
Cjs {
specifier: &'a ModuleSpecifier,
media_type: MediaType,
source: &'a Arc<str>,
},
}
// todo(dsherret): this double Rc boxing is not ideal
@ -821,7 +946,10 @@ impl<TGraphContainer: ModuleGraphContainer> ModuleLoader
"wasm" | "file" | "http" | "https" | "data" | "blob" => (),
_ => return None,
}
let source = self.0.load_prepared_module_sync(&specifier).ok()??;
let source = self
.0
.load_prepared_module_for_source_map_sync(&specifier)
.ok()??;
source_map_from_code(source.code.as_bytes())
}
@ -900,3 +1028,79 @@ impl ModuleGraphUpdatePermit for WorkerModuleGraphUpdatePermit {
drop(self.permit); // explicit drop for clarity
}
}
#[derive(Debug)]
struct CliNodeRequireLoader<TGraphContainer: ModuleGraphContainer> {
emitter: Arc<Emitter>,
fs: Arc<dyn FileSystem>,
graph_container: TGraphContainer,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
npm_resolver: Arc<dyn CliNpmResolver>,
}
impl<TGraphContainer: ModuleGraphContainer>
CliNodeRequireLoader<TGraphContainer>
{
pub fn new(
emitter: Arc<Emitter>,
fs: Arc<dyn FileSystem>,
graph_container: TGraphContainer,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
npm_resolver: Arc<dyn CliNpmResolver>,
) -> Self {
Self {
emitter,
fs,
graph_container,
in_npm_pkg_checker,
npm_resolver,
}
}
}
impl<TGraphContainer: ModuleGraphContainer> NodeRequireLoader
for CliNodeRequireLoader<TGraphContainer>
{
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn deno_runtime::deno_node::NodePermissions,
path: &'a Path,
) -> Result<std::borrow::Cow<'a, Path>, AnyError> {
if let Ok(url) = deno_path_util::url_from_file_path(path) {
// allow reading if it's in the module graph
if self.graph_container.graph().get(&url).is_some() {
return Ok(std::borrow::Cow::Borrowed(path));
}
}
self.npm_resolver.ensure_read_permission(permissions, path)
}
fn load_text_file_lossy(&self, path: &Path) -> Result<String, AnyError> {
// todo(dsherret): use the preloaded module from the graph if available?
let media_type = MediaType::from_path(path);
let text = self.fs.read_text_file_lossy_sync(path, None)?;
if media_type.is_emittable() {
let specifier = deno_path_util::url_from_file_path(path)?;
if self.in_npm_pkg_checker.in_npm_package(&specifier) {
return Err(
NotSupportedKindInNpmError {
media_type,
specifier,
}
.into(),
);
}
self.emitter.emit_parsed_source_sync(
&specifier,
media_type,
// this is probably not super accurate due to require esm, but probably ok.
// If we find this causes a lot of churn in the emit cache then we should
// investigate how we can make this better
ModuleKind::Cjs,
&text.into(),
)
} else {
Ok(text)
}
}
}


@ -1,11 +1,14 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
use std::borrow::Cow;
use std::sync::Arc;
use deno_ast::MediaType;
use deno_ast::ModuleSpecifier;
use deno_core::error::AnyError;
use deno_graph::ParsedSourceStore;
use deno_path_util::url_from_file_path;
use deno_path_util::url_to_file_path;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::DenoFsNodeResolverEnv;
use node_resolver::analyze::CjsAnalysis as ExtNodeCjsAnalysis;
@ -18,8 +21,8 @@ use serde::Serialize;
use crate::cache::CacheDBHash;
use crate::cache::NodeAnalysisCache;
use crate::cache::ParsedSourceCache;
use crate::resolver::CliNodeResolver;
use crate::util::fs::canonicalize_path_maybe_not_exists;
use crate::resolver::CjsTracker;
use crate::util::fs::canonicalize_path_maybe_not_exists_with_fs;
pub type CliNodeCodeTranslator =
NodeCodeTranslator<CliCjsCodeAnalyzer, DenoFsNodeResolverEnv>;
@ -32,14 +35,14 @@ pub type CliNodeCodeTranslator =
/// because the node_modules folder might not exist at that time.
pub fn resolve_specifier_into_node_modules(
specifier: &ModuleSpecifier,
fs: &dyn deno_fs::FileSystem,
) -> ModuleSpecifier {
specifier
.to_file_path()
url_to_file_path(specifier)
.ok()
// this path might not exist at the time the graph is being created
// because the node_modules folder might not yet exist
.and_then(|path| canonicalize_path_maybe_not_exists(&path).ok())
.and_then(|path| ModuleSpecifier::from_file_path(path).ok())
.and_then(|path| canonicalize_path_maybe_not_exists_with_fs(&path, fs).ok())
.and_then(|path| url_from_file_path(&path).ok())
.unwrap_or_else(|| specifier.clone())
}
@ -56,23 +59,29 @@ pub enum CliCjsAnalysis {
pub struct CliCjsCodeAnalyzer {
cache: NodeAnalysisCache,
cjs_tracker: Arc<CjsTracker>,
fs: deno_fs::FileSystemRc,
node_resolver: Arc<CliNodeResolver>,
parsed_source_cache: Option<Arc<ParsedSourceCache>>,
// todo(dsherret): hack, remove in https://github.com/denoland/deno/pull/26439
// For example, this does not properly handle if cjs analysis was already done
// and has been cached.
is_npm_main: bool,
}
impl CliCjsCodeAnalyzer {
pub fn new(
cache: NodeAnalysisCache,
cjs_tracker: Arc<CjsTracker>,
fs: deno_fs::FileSystemRc,
node_resolver: Arc<CliNodeResolver>,
parsed_source_cache: Option<Arc<ParsedSourceCache>>,
is_npm_main: bool,
) -> Self {
Self {
cache,
cjs_tracker,
fs,
node_resolver,
parsed_source_cache,
is_npm_main,
}
}
@ -88,7 +97,7 @@ impl CliCjsCodeAnalyzer {
return Ok(analysis);
}
let mut media_type = MediaType::from_specifier(specifier);
let media_type = MediaType::from_specifier(specifier);
if media_type == MediaType::Json {
return Ok(CliCjsAnalysis::Cjs {
exports: vec![],
@ -96,62 +105,53 @@ impl CliCjsCodeAnalyzer {
});
}
if media_type == MediaType::JavaScript {
if let Some(package_json) =
self.node_resolver.get_closest_package_json(specifier)?
{
match package_json.typ.as_str() {
"commonjs" => {
media_type = MediaType::Cjs;
}
"module" => {
media_type = MediaType::Mjs;
}
_ => {}
}
}
}
let cjs_tracker = self.cjs_tracker.clone();
let is_npm_main = self.is_npm_main;
let is_maybe_cjs =
cjs_tracker.is_maybe_cjs(specifier, media_type)? || is_npm_main;
let analysis = if is_maybe_cjs {
let maybe_parsed_source = self
.parsed_source_cache
.as_ref()
.and_then(|c| c.remove_parsed_source(specifier));
let maybe_parsed_source = self
.parsed_source_cache
.as_ref()
.and_then(|c| c.remove_parsed_source(specifier));
let analysis = deno_core::unsync::spawn_blocking({
let specifier = specifier.clone();
let source: Arc<str> = source.into();
move || -> Result<_, deno_ast::ParseDiagnostic> {
let parsed_source =
maybe_parsed_source.map(Ok).unwrap_or_else(|| {
deno_ast::parse_program(deno_ast::ParseParams {
specifier,
text: source,
media_type,
capture_tokens: true,
scope_analysis: false,
maybe_syntax: None,
deno_core::unsync::spawn_blocking({
let specifier = specifier.clone();
let source: Arc<str> = source.into();
move || -> Result<_, AnyError> {
let parsed_source =
maybe_parsed_source.map(Ok).unwrap_or_else(|| {
deno_ast::parse_program(deno_ast::ParseParams {
specifier,
text: source,
media_type,
capture_tokens: true,
scope_analysis: false,
maybe_syntax: None,
})
})?;
let is_script = parsed_source.compute_is_script();
let is_cjs = cjs_tracker.is_cjs_with_known_is_script(
parsed_source.specifier(),
media_type,
is_script,
)? || is_script && is_npm_main;
if is_cjs {
let analysis = parsed_source.analyze_cjs();
Ok(CliCjsAnalysis::Cjs {
exports: analysis.exports,
reexports: analysis.reexports,
})
})?;
if parsed_source.is_script() {
let analysis = parsed_source.analyze_cjs();
Ok(CliCjsAnalysis::Cjs {
exports: analysis.exports,
reexports: analysis.reexports,
})
} else if media_type == MediaType::Cjs {
// FIXME: `deno_ast` should internally handle MediaType::Cjs implying that
// the result must never be Esm
Ok(CliCjsAnalysis::Cjs {
exports: vec![],
reexports: vec![],
})
} else {
Ok(CliCjsAnalysis::Esm)
} else {
Ok(CliCjsAnalysis::Esm)
}
}
}
})
.await
.unwrap()?;
})
.await
.unwrap()?
} else {
CliCjsAnalysis::Esm
};
self
.cache
@ -163,11 +163,11 @@ impl CliCjsCodeAnalyzer {
#[async_trait::async_trait(?Send)]
impl CjsCodeAnalyzer for CliCjsCodeAnalyzer {
async fn analyze_cjs(
async fn analyze_cjs<'a>(
&self,
specifier: &ModuleSpecifier,
source: Option<String>,
) -> Result<ExtNodeCjsAnalysis, AnyError> {
source: Option<Cow<'a, str>>,
) -> Result<ExtNodeCjsAnalysis<'a>, AnyError> {
let source = match source {
Some(source) => source,
None => {
@ -175,7 +175,7 @@ impl CjsCodeAnalyzer for CliCjsCodeAnalyzer {
if let Ok(source_from_file) =
self.fs.read_text_file_lossy_async(path, None).await
{
source_from_file
Cow::Owned(source_from_file)
} else {
return Ok(ExtNodeCjsAnalysis::Cjs(CjsAnalysisExports {
exports: vec![],


@ -10,8 +10,8 @@ use deno_core::serde_json;
use deno_core::url::Url;
use deno_resolver::npm::ByonmNpmResolver;
use deno_resolver::npm::ByonmNpmResolverCreateOptions;
use deno_runtime::deno_node::DenoFsNodeResolverEnv;
use deno_runtime::deno_node::NodePermissions;
use deno_runtime::deno_node::NodeRequireResolver;
use deno_runtime::ops::process::NpmProcessStateProvider;
use deno_semver::package::PackageReq;
use node_resolver::NpmResolver;
@ -25,30 +25,14 @@ use super::InnerCliNpmResolverRef;
use super::ResolvePkgFolderFromDenoReqError;
pub type CliByonmNpmResolverCreateOptions =
ByonmNpmResolverCreateOptions<CliDenoResolverFs>;
pub type CliByonmNpmResolver = ByonmNpmResolver<CliDenoResolverFs>;
ByonmNpmResolverCreateOptions<CliDenoResolverFs, DenoFsNodeResolverEnv>;
pub type CliByonmNpmResolver =
ByonmNpmResolver<CliDenoResolverFs, DenoFsNodeResolverEnv>;
// todo(dsherret): the services hanging off `CliNpmResolver` doesn't seem ideal. We should probably decouple.
#[derive(Debug)]
struct CliByonmWrapper(Arc<CliByonmNpmResolver>);
impl NodeRequireResolver for CliByonmWrapper {
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn NodePermissions,
path: &'a Path,
) -> Result<Cow<'a, Path>, AnyError> {
if !path
.components()
.any(|c| c.as_os_str().to_ascii_lowercase() == "node_modules")
{
permissions.check_read_path(path)
} else {
Ok(Cow::Borrowed(path))
}
}
}
impl NpmProcessStateProvider for CliByonmWrapper {
fn get_npm_process_state(&self) -> String {
serde_json::to_string(&NpmProcessState {
@ -67,10 +51,6 @@ impl CliNpmResolver for CliByonmNpmResolver {
self
}
fn into_require_resolver(self: Arc<Self>) -> Arc<dyn NodeRequireResolver> {
Arc::new(CliByonmWrapper(self))
}
fn into_process_state_provider(
self: Arc<Self>,
) -> Arc<dyn NpmProcessStateProvider> {
@ -100,6 +80,21 @@ impl CliNpmResolver for CliByonmNpmResolver {
.map_err(ResolvePkgFolderFromDenoReqError::Byonm)
}
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn NodePermissions,
path: &'a Path,
) -> Result<Cow<'a, Path>, AnyError> {
if !path
.components()
.any(|c| c.as_os_str().to_ascii_lowercase() == "node_modules")
{
permissions.check_read_path(path).map_err(Into::into)
} else {
Ok(Cow::Borrowed(path))
}
}
fn check_state_hash(&self) -> Option<u64> {
// it is very difficult to determine the check state hash for byonm
// so we just return None to signify check caching is not supported


@ -36,7 +36,7 @@ pub use tarball::TarballCache;
/// Stores a single copy of npm packages in a cache.
#[derive(Debug)]
pub struct NpmCache {
cache_dir: NpmCacheDir,
cache_dir: Arc<NpmCacheDir>,
cache_setting: CacheSetting,
npmrc: Arc<ResolvedNpmRc>,
/// ensures a package is only downloaded once per run
@ -45,7 +45,7 @@ pub struct NpmCache {
impl NpmCache {
pub fn new(
cache_dir: NpmCacheDir,
cache_dir: Arc<NpmCacheDir>,
cache_setting: CacheSetting,
npmrc: Arc<ResolvedNpmRc>,
) -> Self {
@ -61,6 +61,10 @@ impl NpmCache {
&self.cache_setting
}
pub fn root_dir_path(&self) -> &Path {
self.cache_dir.root_dir()
}
pub fn root_dir_url(&self) -> &Url {
self.cache_dir.root_dir_url()
}
@ -152,10 +156,6 @@ impl NpmCache {
self.cache_dir.package_name_folder(name, registry_url)
}
pub fn root_folder(&self) -> PathBuf {
self.cache_dir.root_dir().to_owned()
}
pub fn resolve_package_folder_id_from_specifier(
&self,
specifier: &ModuleSpecifier,


@ -12,6 +12,7 @@ use deno_cache_dir::npm::NpmCacheDir;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::serde_json;
use deno_core::url::Url;
use deno_npm::npm_rc::ResolvedNpmRc;
use deno_npm::registry::NpmPackageInfo;
use deno_npm::registry::NpmRegistryApi;
@ -24,12 +25,12 @@ use deno_npm::NpmSystemInfo;
use deno_runtime::colors;
use deno_runtime::deno_fs::FileSystem;
use deno_runtime::deno_node::NodePermissions;
use deno_runtime::deno_node::NodeRequireResolver;
use deno_runtime::ops::process::NpmProcessStateProvider;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use node_resolver::errors::PackageFolderResolveError;
use node_resolver::errors::PackageFolderResolveIoError;
use node_resolver::InNpmPackageChecker;
use node_resolver::NpmResolver;
use resolution::AddPkgReqsResult;
@ -38,7 +39,7 @@ use crate::args::LifecycleScriptsConfig;
use crate::args::NpmInstallDepsProvider;
use crate::args::NpmProcessState;
use crate::args::NpmProcessStateKind;
use crate::cache::DenoCacheEnvFsAdapter;
use crate::args::PackageJsonDepValueParseWithLocationError;
use crate::cache::FastInsecureHasher;
use crate::http_util::HttpClientProvider;
use crate::util::fs::canonicalize_path_maybe_not_exists_with_fs;
@ -65,12 +66,12 @@ pub enum CliNpmResolverManagedSnapshotOption {
Specified(Option<ValidSerializedNpmResolutionSnapshot>),
}
pub struct CliNpmResolverManagedCreateOptions {
pub struct CliManagedNpmResolverCreateOptions {
pub snapshot: CliNpmResolverManagedSnapshotOption,
pub maybe_lockfile: Option<Arc<CliLockfile>>,
pub fs: Arc<dyn deno_runtime::deno_fs::FileSystem>,
pub http_client_provider: Arc<crate::http_util::HttpClientProvider>,
pub npm_global_cache_dir: PathBuf,
pub npm_cache_dir: Arc<NpmCacheDir>,
pub cache_setting: crate::args::CacheSetting,
pub text_only_progress_bar: crate::util::progress_bar::ProgressBar,
pub maybe_node_modules_path: Option<PathBuf>,
@ -81,7 +82,7 @@ pub struct CliNpmResolverManagedCreateOptions {
}
pub async fn create_managed_npm_resolver_for_lsp(
options: CliNpmResolverManagedCreateOptions,
options: CliManagedNpmResolverCreateOptions,
) -> Arc<dyn CliNpmResolver> {
let npm_cache = create_cache(&options);
let npm_api = create_api(&options, npm_cache.clone());
@ -114,7 +115,7 @@ pub async fn create_managed_npm_resolver_for_lsp(
}
pub async fn create_managed_npm_resolver(
options: CliNpmResolverManagedCreateOptions,
options: CliManagedNpmResolverCreateOptions,
) -> Result<Arc<dyn CliNpmResolver>, AnyError> {
let npm_cache = create_cache(&options);
let npm_api = create_api(&options, npm_cache.clone());
@ -188,20 +189,16 @@ fn create_inner(
))
}
fn create_cache(options: &CliNpmResolverManagedCreateOptions) -> Arc<NpmCache> {
fn create_cache(options: &CliManagedNpmResolverCreateOptions) -> Arc<NpmCache> {
Arc::new(NpmCache::new(
NpmCacheDir::new(
&DenoCacheEnvFsAdapter(options.fs.as_ref()),
options.npm_global_cache_dir.clone(),
options.npmrc.get_all_known_registries_urls(),
),
options.npm_cache_dir.clone(),
options.cache_setting.clone(),
options.npmrc.clone(),
))
}
fn create_api(
options: &CliNpmResolverManagedCreateOptions,
options: &CliManagedNpmResolverCreateOptions,
npm_cache: Arc<NpmCache>,
) -> Arc<CliNpmRegistryApi> {
Arc::new(CliNpmRegistryApi::new(
@ -258,6 +255,35 @@ async fn snapshot_from_lockfile(
Ok(snapshot)
}
#[derive(Debug)]
struct ManagedInNpmPackageChecker {
root_dir: Url,
}
impl InNpmPackageChecker for ManagedInNpmPackageChecker {
fn in_npm_package(&self, specifier: &Url) -> bool {
specifier.as_ref().starts_with(self.root_dir.as_str())
}
}
pub struct CliManagedInNpmPkgCheckerCreateOptions<'a> {
pub root_cache_dir_url: &'a Url,
pub maybe_node_modules_path: Option<&'a Path>,
}
pub fn create_managed_in_npm_pkg_checker(
options: CliManagedInNpmPkgCheckerCreateOptions,
) -> Arc<dyn InNpmPackageChecker> {
let root_dir = match options.maybe_node_modules_path {
Some(node_modules_folder) => {
deno_path_util::url_from_directory_path(node_modules_folder).unwrap()
}
None => options.root_cache_dir_url.clone(),
};
debug_assert!(root_dir.as_str().ends_with('/'));
Arc::new(ManagedInNpmPackageChecker { root_dir })
}
/// An npm resolver where the resolution is managed by Deno rather than
/// the user bringing their own node_modules (BYONM) on the file system.
pub struct ManagedCliNpmResolver {
@ -480,19 +506,24 @@ impl ManagedCliNpmResolver {
self.resolution.resolve_pkg_id_from_pkg_req(req)
}
pub fn ensure_no_pkg_json_dep_errors(&self) -> Result<(), AnyError> {
pub fn ensure_no_pkg_json_dep_errors(
&self,
) -> Result<(), Box<PackageJsonDepValueParseWithLocationError>> {
for err in self.npm_install_deps_provider.pkg_json_dep_errors() {
match err {
match &err.source {
deno_package_json::PackageJsonDepValueParseError::VersionReq(_) => {
return Err(
AnyError::from(err.clone())
.context("Failed to install from package.json"),
);
return Err(Box::new(err.clone()));
}
deno_package_json::PackageJsonDepValueParseError::Unsupported {
..
} => {
log::warn!("{} {} in package.json", colors::yellow("Warning"), err)
// only warn for this one
log::warn!(
"{} {}\n at {}",
colors::yellow("Warning"),
err.source,
err.location,
)
}
}
}
@ -549,8 +580,16 @@ impl ManagedCliNpmResolver {
.map_err(|err| err.into())
}
pub fn global_cache_root_folder(&self) -> PathBuf {
self.npm_cache.root_folder()
pub fn maybe_node_modules_path(&self) -> Option<&Path> {
self.fs_resolver.node_modules_path()
}
pub fn global_cache_root_path(&self) -> &Path {
self.npm_cache.root_dir_path()
}
pub fn global_cache_root_url(&self) -> &Url {
self.npm_cache.root_dir_url()
}
}
@ -585,22 +624,6 @@ impl NpmResolver for ManagedCliNpmResolver {
log::debug!("Resolved {} from {} to {}", name, referrer, path.display());
Ok(path)
}
fn in_npm_package(&self, specifier: &ModuleSpecifier) -> bool {
let root_dir_url = self.fs_resolver.root_dir_url();
debug_assert!(root_dir_url.as_str().ends_with('/'));
specifier.as_ref().starts_with(root_dir_url.as_str())
}
}
impl NodeRequireResolver for ManagedCliNpmResolver {
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn NodePermissions,
path: &'a Path,
) -> Result<Cow<'a, Path>, AnyError> {
self.fs_resolver.ensure_read_permission(permissions, path)
}
}
impl NpmProcessStateProvider for ManagedCliNpmResolver {
@ -617,10 +640,6 @@ impl CliNpmResolver for ManagedCliNpmResolver {
self
}
fn into_require_resolver(self: Arc<Self>) -> Arc<dyn NodeRequireResolver> {
self
}
fn into_process_state_provider(
self: Arc<Self>,
) -> Arc<dyn NpmProcessStateProvider> {
@ -681,6 +700,14 @@ impl CliNpmResolver for ManagedCliNpmResolver {
.map_err(ResolvePkgFolderFromDenoReqError::Managed)
}
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn NodePermissions,
path: &'a Path,
) -> Result<Cow<'a, Path>, AnyError> {
self.fs_resolver.ensure_read_permission(permissions, path)
}
fn check_state_hash(&self) -> Option<u64> {
// We could go further and check all the individual
// npm packages, but that's probably overkill.

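Not part of the diff — a self-contained sketch of the prefix check that `ManagedInNpmPackageChecker::in_npm_package` performs, and of why the trailing-slash `debug_assert!` matters; the file URLs are made up.

use deno_core::url::Url;

fn in_npm_package(root_dir: &Url, specifier: &Url) -> bool {
  // same idea as the checker above: a plain string prefix match against the root URL
  specifier.as_str().starts_with(root_dir.as_str())
}

fn main() {
  let root = Url::parse("file:///project/node_modules/").unwrap(); // note the trailing '/'
  let inside = Url::parse("file:///project/node_modules/chalk/index.js").unwrap();
  let sibling = Url::parse("file:///project/node_modules-old/x.js").unwrap();
  assert!(in_npm_package(&root, &inside));
  // without the trailing slash that the debug_assert! above enforces,
  // "node_modules-old" would also match the prefix
  assert!(!in_npm_package(&root, &sibling));
}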

@ -17,7 +17,6 @@ use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::futures;
use deno_core::futures::StreamExt;
use deno_core::url::Url;
use deno_npm::NpmPackageCacheFolderId;
use deno_npm::NpmPackageId;
use deno_npm::NpmResolutionPackage;
@ -30,9 +29,6 @@ use crate::npm::managed::cache::TarballCache;
/// Part of the resolution that interacts with the file system.
#[async_trait(?Send)]
pub trait NpmPackageFsResolver: Send + Sync {
/// Specifier for the root directory.
fn root_dir_url(&self) -> &Url;
/// The local node_modules folder if it is applicable to the implementation.
fn node_modules_path(&self) -> Option<&Path>;
@ -137,7 +133,7 @@ impl RegistryReadPermissionChecker {
}
}
permissions.check_read_path(path)
permissions.check_read_path(path).map_err(Into::into)
}
}


@ -18,6 +18,7 @@ pub struct BinEntries<'a> {
seen_names: HashMap<&'a str, &'a NpmPackageId>,
/// The bin entries
entries: Vec<(&'a NpmResolutionPackage, PathBuf)>,
sorted: bool,
}
/// Returns the name of the default binary for the given package.
@ -31,6 +32,20 @@ fn default_bin_name(package: &NpmResolutionPackage) -> &str {
.map_or(package.id.nv.name.as_str(), |(_, name)| name)
}
pub fn warn_missing_entrypoint(
bin_name: &str,
package_path: &Path,
entrypoint: &Path,
) {
log::warn!(
"{} Trying to set up '{}' bin for \"{}\", but the entry point \"{}\" doesn't exist.",
deno_terminal::colors::yellow("Warning"),
bin_name,
package_path.display(),
entrypoint.display()
);
}
impl<'a> BinEntries<'a> {
pub fn new() -> Self {
Self::default()
@ -42,6 +57,7 @@ impl<'a> BinEntries<'a> {
package: &'a NpmResolutionPackage,
package_path: PathBuf,
) {
self.sorted = false;
// check for a new collision, if we haven't already
// found one
match package.bin.as_ref().unwrap() {
@ -79,16 +95,21 @@ impl<'a> BinEntries<'a> {
&str, // bin name
&str, // bin script
) -> Result<(), AnyError>,
mut filter: impl FnMut(&NpmResolutionPackage) -> bool,
) -> Result<(), AnyError> {
if !self.collisions.is_empty() {
if !self.collisions.is_empty() && !self.sorted {
// walking the dependency tree to find out the depth of each package
// is sort of expensive, so we only do it if there's a collision
sort_by_depth(snapshot, &mut self.entries, &mut self.collisions);
self.sorted = true;
}
let mut seen = HashSet::new();
for (package, package_path) in &self.entries {
if !filter(package) {
continue;
}
if let Some(bin_entries) = &package.bin {
match bin_entries {
deno_npm::registry::NpmPackageVersionBinEntry::String(script) => {
@ -118,8 +139,8 @@ impl<'a> BinEntries<'a> {
}
/// Collect the bin entries into a vec of (name, script path)
pub fn into_bin_files(
mut self,
pub fn collect_bin_files(
&mut self,
snapshot: &NpmResolutionSnapshot,
) -> Vec<(String, PathBuf)> {
let mut bins = Vec::new();
@ -131,17 +152,18 @@ impl<'a> BinEntries<'a> {
bins.push((name.to_string(), package_path.join(script)));
Ok(())
},
|_| true,
)
.unwrap();
bins
}
/// Finish setting up the bin entries, writing the necessary files
/// to disk.
pub fn finish(
fn set_up_entries_filtered(
mut self,
snapshot: &NpmResolutionSnapshot,
bin_node_modules_dir_path: &Path,
filter: impl FnMut(&NpmResolutionPackage) -> bool,
mut handler: impl FnMut(&EntrySetupOutcome<'_>),
) -> Result<(), AnyError> {
if !self.entries.is_empty() && !bin_node_modules_dir_path.exists() {
std::fs::create_dir_all(bin_node_modules_dir_path).with_context(
@ -160,18 +182,54 @@ impl<'a> BinEntries<'a> {
Ok(())
},
|package, package_path, name, script| {
set_up_bin_entry(
let outcome = set_up_bin_entry(
package,
name,
script,
package_path,
bin_node_modules_dir_path,
)
)?;
handler(&outcome);
Ok(())
},
filter,
)?;
Ok(())
}
/// Finish setting up the bin entries, writing the necessary files
/// to disk.
pub fn finish(
self,
snapshot: &NpmResolutionSnapshot,
bin_node_modules_dir_path: &Path,
handler: impl FnMut(&EntrySetupOutcome<'_>),
) -> Result<(), AnyError> {
self.set_up_entries_filtered(
snapshot,
bin_node_modules_dir_path,
|_| true,
handler,
)
}
/// Finish setting up the bin entries, writing the necessary files
/// to disk.
pub fn finish_only(
self,
snapshot: &NpmResolutionSnapshot,
bin_node_modules_dir_path: &Path,
handler: impl FnMut(&EntrySetupOutcome<'_>),
only: &HashSet<&NpmPackageId>,
) -> Result<(), AnyError> {
self.set_up_entries_filtered(
snapshot,
bin_node_modules_dir_path,
|package| only.contains(&package.id),
handler,
)
}
}
// walk the dependency tree to find out the depth of each package
@ -233,16 +291,17 @@ fn sort_by_depth(
});
}
pub fn set_up_bin_entry(
package: &NpmResolutionPackage,
bin_name: &str,
pub fn set_up_bin_entry<'a>(
package: &'a NpmResolutionPackage,
bin_name: &'a str,
#[allow(unused_variables)] bin_script: &str,
#[allow(unused_variables)] package_path: &Path,
#[allow(unused_variables)] package_path: &'a Path,
bin_node_modules_dir_path: &Path,
) -> Result<(), AnyError> {
) -> Result<EntrySetupOutcome<'a>, AnyError> {
#[cfg(windows)]
{
set_up_bin_shim(package, bin_name, bin_node_modules_dir_path)?;
Ok(EntrySetupOutcome::Success)
}
#[cfg(unix)]
{
@ -252,9 +311,8 @@ pub fn set_up_bin_entry(
bin_script,
package_path,
bin_node_modules_dir_path,
)?;
)
}
Ok(())
}
#[cfg(windows)]
@ -301,14 +359,39 @@ fn make_executable_if_exists(path: &Path) -> Result<bool, AnyError> {
Ok(true)
}
pub enum EntrySetupOutcome<'a> {
#[cfg_attr(windows, allow(dead_code))]
MissingEntrypoint {
bin_name: &'a str,
package_path: &'a Path,
entrypoint: PathBuf,
package: &'a NpmResolutionPackage,
},
Success,
}
impl<'a> EntrySetupOutcome<'a> {
pub fn warn_if_failed(&self) {
match self {
EntrySetupOutcome::MissingEntrypoint {
bin_name,
package_path,
entrypoint,
..
} => warn_missing_entrypoint(bin_name, package_path, entrypoint),
EntrySetupOutcome::Success => {}
}
}
}
#[cfg(unix)]
fn symlink_bin_entry(
_package: &NpmResolutionPackage,
bin_name: &str,
fn symlink_bin_entry<'a>(
package: &'a NpmResolutionPackage,
bin_name: &'a str,
bin_script: &str,
package_path: &Path,
package_path: &'a Path,
bin_node_modules_dir_path: &Path,
) -> Result<(), AnyError> {
) -> Result<EntrySetupOutcome<'a>, AnyError> {
use std::io;
use std::os::unix::fs::symlink;
let link = bin_node_modules_dir_path.join(bin_name);
@ -318,14 +401,12 @@ fn symlink_bin_entry(
format!("Can't set up '{}' bin at {}", bin_name, original.display())
})?;
if !found {
log::warn!(
"{} Trying to set up '{}' bin for \"{}\", but the entry point \"{}\" doesn't exist.",
deno_terminal::colors::yellow("Warning"),
return Ok(EntrySetupOutcome::MissingEntrypoint {
bin_name,
package_path.display(),
original.display()
);
return Ok(());
package_path,
entrypoint: original,
package,
});
}
let original_relative =
@ -348,7 +429,7 @@ fn symlink_bin_entry(
original_relative.display()
)
})?;
return Ok(());
return Ok(EntrySetupOutcome::Success);
}
return Err(err).with_context(|| {
format!(
@ -359,5 +440,5 @@ fn symlink_bin_entry(
});
}
Ok(())
Ok(EntrySetupOutcome::Success)
}

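Not part of the diff — a fragment sketching how a caller might use the new callback-based `finish`, collecting missing entrypoints instead of warning right away. `package`, `package_path`, `snapshot`, and `node_modules_dir` are assumed to be in scope, and the enclosing function is assumed to return a `Result`.

let mut bin_entries = BinEntries::new();
bin_entries.add(package, package_path);

let mut missing_bins = Vec::new();
bin_entries.finish(
  &snapshot,
  &node_modules_dir.join(".bin"),
  |outcome| {
    // the handler decides what to do with each outcome; calling warn_if_failed()
    // reproduces the old behavior, while this variant just records the bin names
    // whose entrypoints were missing
    if let EntrySetupOutcome::MissingEntrypoint { bin_name, .. } = outcome {
      missing_bins.push(bin_name.to_string());
    }
  },
)?;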

@ -10,6 +10,7 @@ use deno_runtime::deno_io::FromRawIoHandle;
use deno_semver::package::PackageNv;
use deno_semver::Version;
use std::borrow::Cow;
use std::collections::HashSet;
use std::rc::Rc;
use std::path::Path;
@ -61,7 +62,7 @@ impl<'a> LifecycleScripts<'a> {
}
}
fn has_lifecycle_scripts(
pub fn has_lifecycle_scripts(
package: &NpmResolutionPackage,
package_path: &Path,
) -> bool {
@ -83,7 +84,7 @@ fn is_broken_default_install_script(script: &str, package_path: &Path) -> bool {
}
impl<'a> LifecycleScripts<'a> {
fn can_run_scripts(&self, package_nv: &PackageNv) -> bool {
pub fn can_run_scripts(&self, package_nv: &PackageNv) -> bool {
if !self.strategy.can_run_scripts() {
return false;
}
@ -98,6 +99,9 @@ impl<'a> LifecycleScripts<'a> {
PackagesAllowedScripts::None => false,
}
}
pub fn has_run_scripts(&self, package: &NpmResolutionPackage) -> bool {
self.strategy.has_run(package)
}
/// Register a package for running lifecycle scripts, if applicable.
///
/// `package_path` is the path containing the package's code (its root dir).
@ -110,12 +114,12 @@ impl<'a> LifecycleScripts<'a> {
) {
if has_lifecycle_scripts(package, &package_path) {
if self.can_run_scripts(&package.id.nv) {
if !self.strategy.has_run(package) {
if !self.has_run_scripts(package) {
self
.packages_with_scripts
.push((package, package_path.into_owned()));
}
} else if !self.strategy.has_run(package)
} else if !self.has_run_scripts(package)
&& (self.config.explicit_install || !self.strategy.has_warned(package))
{
// Skip adding `esbuild` as it is known that it can work properly without lifecycle script
@ -149,22 +153,32 @@ impl<'a> LifecycleScripts<'a> {
self,
snapshot: &NpmResolutionSnapshot,
packages: &[NpmResolutionPackage],
root_node_modules_dir_path: Option<&Path>,
root_node_modules_dir_path: &Path,
progress_bar: &ProgressBar,
) -> Result<(), AnyError> {
self.warn_not_run_scripts()?;
let get_package_path =
|p: &NpmResolutionPackage| self.strategy.package_path(p);
let mut failed_packages = Vec::new();
let mut bin_entries = BinEntries::new();
if !self.packages_with_scripts.is_empty() {
let package_ids = self
.packages_with_scripts
.iter()
.map(|(p, _)| &p.id)
.collect::<HashSet<_>>();
// get custom commands for each bin available in the node_modules dir (essentially
// the scripts that are in `node_modules/.bin`)
let base =
resolve_baseline_custom_commands(snapshot, packages, get_package_path)?;
let base = resolve_baseline_custom_commands(
&mut bin_entries,
snapshot,
packages,
get_package_path,
)?;
let init_cwd = &self.config.initial_cwd;
let process_state = crate::npm::managed::npm_process_state(
snapshot.as_valid_serialized(),
root_node_modules_dir_path,
Some(root_node_modules_dir_path),
);
let mut env_vars = crate::task_runner::real_env_vars();
@ -221,7 +235,7 @@ impl<'a> LifecycleScripts<'a> {
custom_commands: custom_commands.clone(),
init_cwd,
argv: &[],
root_node_modules_dir: root_node_modules_dir_path,
root_node_modules_dir: Some(root_node_modules_dir_path),
stdio: Some(crate::task_runner::TaskIo {
stderr: TaskStdio::piped(),
stdout: TaskStdio::piped(),
@ -262,6 +276,17 @@ impl<'a> LifecycleScripts<'a> {
}
self.strategy.did_run_scripts(package)?;
}
// re-set up bin entries for the packages which we've run scripts for.
// lifecycle scripts can create files that are linked to by bin entries,
// and the only reliable way to handle this is to re-link bin entries
// (this is what PNPM does as well)
bin_entries.finish_only(
snapshot,
&root_node_modules_dir_path.join(".bin"),
|outcome| outcome.warn_if_failed(),
&package_ids,
)?;
}
if failed_packages.is_empty() {
Ok(())
@ -281,9 +306,10 @@ impl<'a> LifecycleScripts<'a> {
// take in all (non copy) packages from snapshot,
// and resolve the set of available binaries to create
// custom commands available to the task runner
fn resolve_baseline_custom_commands(
snapshot: &NpmResolutionSnapshot,
packages: &[NpmResolutionPackage],
fn resolve_baseline_custom_commands<'a>(
bin_entries: &mut BinEntries<'a>,
snapshot: &'a NpmResolutionSnapshot,
packages: &'a [NpmResolutionPackage],
get_package_path: impl Fn(&NpmResolutionPackage) -> PathBuf,
) -> Result<crate::task_runner::TaskCustomCommands, AnyError> {
let mut custom_commands = crate::task_runner::TaskCustomCommands::new();
@ -306,6 +332,7 @@ fn resolve_baseline_custom_commands(
// doing it for packages that are set up already.
// realistically, scripts won't be run very often so it probably isn't too big of an issue.
resolve_custom_commands_from_packages(
bin_entries,
custom_commands,
snapshot,
packages,
@ -320,12 +347,12 @@ fn resolve_custom_commands_from_packages<
'a,
P: IntoIterator<Item = &'a NpmResolutionPackage>,
>(
bin_entries: &mut BinEntries<'a>,
mut commands: crate::task_runner::TaskCustomCommands,
snapshot: &'a NpmResolutionSnapshot,
packages: P,
get_package_path: impl Fn(&'a NpmResolutionPackage) -> PathBuf,
) -> Result<crate::task_runner::TaskCustomCommands, AnyError> {
let mut bin_entries = BinEntries::new();
for package in packages {
let package_path = get_package_path(package);
@ -333,7 +360,7 @@ fn resolve_custom_commands_from_packages<
bin_entries.add(package, package_path);
}
}
let bins = bin_entries.into_bin_files(snapshot);
let bins: Vec<(String, PathBuf)> = bin_entries.collect_bin_files(snapshot);
for (bin_name, script_path) in bins {
commands.insert(
bin_name.clone(),
@ -356,7 +383,9 @@ fn resolve_custom_commands_from_deps(
snapshot: &NpmResolutionSnapshot,
get_package_path: impl Fn(&NpmResolutionPackage) -> PathBuf,
) -> Result<crate::task_runner::TaskCustomCommands, AnyError> {
let mut bin_entries = BinEntries::new();
resolve_custom_commands_from_packages(
&mut bin_entries,
baseline,
snapshot,
package


@ -11,7 +11,6 @@ use crate::colors;
use async_trait::async_trait;
use deno_ast::ModuleSpecifier;
use deno_core::error::AnyError;
use deno_core::url::Url;
use deno_npm::NpmPackageCacheFolderId;
use deno_npm::NpmPackageId;
use deno_npm::NpmResolutionPackage;
@ -56,7 +55,7 @@ impl GlobalNpmPackageResolver {
Self {
registry_read_permission_checker: RegistryReadPermissionChecker::new(
fs,
cache.root_folder(),
cache.root_dir_path().to_path_buf(),
),
cache,
tarball_cache,
@ -69,10 +68,6 @@ impl GlobalNpmPackageResolver {
#[async_trait(?Send)]
impl NpmPackageFsResolver for GlobalNpmPackageResolver {
fn root_dir_url(&self) -> &Url {
self.cache.root_dir_url()
}
fn node_modules_path(&self) -> Option<&Path> {
None
}


@ -55,6 +55,7 @@ use crate::util::progress_bar::ProgressMessagePrompt;
use super::super::cache::NpmCache;
use super::super::cache::TarballCache;
use super::super::resolution::NpmResolution;
use super::common::bin_entries;
use super::common::NpmPackageFsResolver;
use super::common::RegistryReadPermissionChecker;
@ -155,10 +156,6 @@ impl LocalNpmPackageResolver {
#[async_trait(?Send)]
impl NpmPackageFsResolver for LocalNpmPackageResolver {
fn root_dir_url(&self) -> &Url {
&self.root_node_modules_url
}
fn node_modules_path(&self) -> Option<&Path> {
Some(self.root_node_modules_path.as_ref())
}
@ -333,8 +330,7 @@ async fn sync_resolution_with_fs(
let mut cache_futures = FuturesUnordered::new();
let mut newest_packages_by_name: HashMap<&String, &NpmResolutionPackage> =
HashMap::with_capacity(package_partitions.packages.len());
let bin_entries =
Rc::new(RefCell::new(super::common::bin_entries::BinEntries::new()));
let bin_entries = Rc::new(RefCell::new(bin_entries::BinEntries::new()));
let mut lifecycle_scripts =
super::common::lifecycle_scripts::LifecycleScripts::new(
lifecycle_scripts,
@ -662,7 +658,28 @@ async fn sync_resolution_with_fs(
// 7. Set up `node_modules/.bin` entries for packages that need it.
{
let bin_entries = std::mem::take(&mut *bin_entries.borrow_mut());
bin_entries.finish(snapshot, &bin_node_modules_dir_path)?;
bin_entries.finish(
snapshot,
&bin_node_modules_dir_path,
|setup_outcome| {
match setup_outcome {
bin_entries::EntrySetupOutcome::MissingEntrypoint {
package,
package_path,
..
} if super::common::lifecycle_scripts::has_lifecycle_scripts(
package,
package_path,
) && lifecycle_scripts.can_run_scripts(&package.id.nv)
&& !lifecycle_scripts.has_run_scripts(package) =>
{
// ignore, it might get fixed when the lifecycle scripts run.
// if not, we'll warn then
}
outcome => outcome.warn_if_failed(),
}
},
)?;
}
// 8. Create symlinks for the workspace packages
@ -712,7 +729,7 @@ async fn sync_resolution_with_fs(
.finish(
snapshot,
&package_partitions.packages,
Some(root_node_modules_dir_path),
root_node_modules_dir_path,
progress_bar,
)
.await?;
@ -1039,12 +1056,18 @@ fn junction_or_symlink_dir(
if symlink_err.kind() == std::io::ErrorKind::PermissionDenied =>
{
USE_JUNCTIONS.store(true, std::sync::atomic::Ordering::Relaxed);
junction::create(old_path, new_path).map_err(Into::into)
junction::create(old_path, new_path)
.context("Failed creating junction in node_modules folder")
}
Err(symlink_err) => {
log::warn!(
"{} Unexpected error symlinking node_modules: {symlink_err}",
colors::yellow("Warning")
);
USE_JUNCTIONS.store(true, std::sync::atomic::Ordering::Relaxed);
junction::create(old_path, new_path)
.context("Failed creating junction in node_modules folder")
}
Err(symlink_err) => Err(
AnyError::from(symlink_err)
.context("Failed creating symlink in node_modules folder"),
),
}
}


@ -4,6 +4,7 @@ mod byonm;
mod common;
mod managed;
use std::borrow::Cow;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
@ -15,13 +16,16 @@ use deno_core::error::AnyError;
use deno_core::serde_json;
use deno_npm::npm_rc::ResolvedNpmRc;
use deno_npm::registry::NpmPackageInfo;
use deno_resolver::npm::ByonmInNpmPackageChecker;
use deno_resolver::npm::ByonmNpmResolver;
use deno_resolver::npm::ByonmResolvePkgFolderFromDenoReqError;
use deno_runtime::deno_node::NodeRequireResolver;
use deno_runtime::deno_node::NodePermissions;
use deno_runtime::ops::process::NpmProcessStateProvider;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use managed::cache::registry_info::get_package_url;
use managed::create_managed_in_npm_pkg_checker;
use node_resolver::InNpmPackageChecker;
use node_resolver::NpmResolver;
use thiserror::Error;
@ -29,7 +33,8 @@ use crate::file_fetcher::FileFetcher;
pub use self::byonm::CliByonmNpmResolver;
pub use self::byonm::CliByonmNpmResolverCreateOptions;
pub use self::managed::CliNpmResolverManagedCreateOptions;
pub use self::managed::CliManagedInNpmPkgCheckerCreateOptions;
pub use self::managed::CliManagedNpmResolverCreateOptions;
pub use self::managed::CliNpmResolverManagedSnapshotOption;
pub use self::managed::ManagedCliNpmResolver;
@ -42,7 +47,7 @@ pub enum ResolvePkgFolderFromDenoReqError {
}
pub enum CliNpmResolverCreateOptions {
Managed(CliNpmResolverManagedCreateOptions),
Managed(CliManagedNpmResolverCreateOptions),
Byonm(CliByonmNpmResolverCreateOptions),
}
@ -68,6 +73,22 @@ pub async fn create_cli_npm_resolver(
}
}
pub enum CreateInNpmPkgCheckerOptions<'a> {
Managed(CliManagedInNpmPkgCheckerCreateOptions<'a>),
Byonm,
}
pub fn create_in_npm_pkg_checker(
options: CreateInNpmPkgCheckerOptions,
) -> Arc<dyn InNpmPackageChecker> {
match options {
CreateInNpmPkgCheckerOptions::Managed(options) => {
create_managed_in_npm_pkg_checker(options)
}
CreateInNpmPkgCheckerOptions::Byonm => Arc::new(ByonmInNpmPackageChecker),
}
}
pub enum InnerCliNpmResolverRef<'a> {
Managed(&'a ManagedCliNpmResolver),
#[allow(dead_code)]
@ -76,7 +97,6 @@ pub enum InnerCliNpmResolverRef<'a> {
pub trait CliNpmResolver: NpmResolver {
fn into_npm_resolver(self: Arc<Self>) -> Arc<dyn NpmResolver>;
fn into_require_resolver(self: Arc<Self>) -> Arc<dyn NodeRequireResolver>;
fn into_process_state_provider(
self: Arc<Self>,
) -> Arc<dyn NpmProcessStateProvider>;
@ -107,6 +127,12 @@ pub trait CliNpmResolver: NpmResolver {
referrer: &ModuleSpecifier,
) -> Result<PathBuf, ResolvePkgFolderFromDenoReqError>;
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn NodePermissions,
path: &'a Path,
) -> Result<Cow<'a, Path>, AnyError>;
/// Returns a hash representing the state of the npm resolver,
/// or `None` if the state currently can't be determined.
fn check_state_hash(&self) -> Option<u64>;
@ -189,3 +215,15 @@ impl NpmFetchResolver {
info
}
}
pub const NPM_CONFIG_USER_AGENT_ENV_VAR: &str = "npm_config_user_agent";
pub fn get_npm_config_user_agent() -> String {
format!(
"deno/{} npm/? deno/{} {} {}",
env!("CARGO_PKG_VERSION"),
env!("CARGO_PKG_VERSION"),
std::env::consts::OS,
std::env::consts::ARCH
)
}

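Not part of the diff — a sketch of how the new `get_npm_config_user_agent` helper might be surfaced to child processes such as lifecycle scripts; the exact call site is an assumption.

let mut env_vars: std::collections::HashMap<String, String> = Default::default();
env_vars.insert(
  NPM_CONFIG_USER_AGENT_ENV_VAR.to_string(),
  // produces something like "deno/2.0.6 npm/? deno/2.0.6 linux x86_64"
  get_npm_config_user_agent(),
);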

@ -2,7 +2,6 @@
use std::sync::atomic::AtomicUsize;
use std::sync::atomic::Ordering;
use std::time;
use deno_core::error::generic_error;
use deno_core::error::type_error;
@ -13,6 +12,7 @@ use deno_core::ModuleSpecifier;
use deno_core::OpState;
use deno_runtime::deno_permissions::ChildPermissionsArg;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_web::StartTime;
use tokio::sync::mpsc::UnboundedSender;
use uuid::Uuid;
@ -56,7 +56,7 @@ struct PermissionsHolder(Uuid, PermissionsContainer);
pub fn op_pledge_test_permissions(
state: &mut OpState,
#[serde] args: ChildPermissionsArg,
) -> Result<Uuid, AnyError> {
) -> Result<Uuid, deno_runtime::deno_permissions::ChildPermissionError> {
let token = Uuid::new_v4();
let parent_permissions = state.borrow_mut::<PermissionsContainer>();
let worker_permissions = parent_permissions.create_child_permissions(args)?;
@ -147,8 +147,8 @@ fn op_dispatch_bench_event(state: &mut OpState, #[serde] event: BenchEvent) {
#[op2(fast)]
#[number]
fn op_bench_now(state: &mut OpState) -> Result<u64, AnyError> {
let ns = state.borrow::<time::Instant>().elapsed().as_nanos();
fn op_bench_now(state: &mut OpState) -> Result<u64, std::num::TryFromIntError> {
let ns = state.borrow::<StartTime>().elapsed().as_nanos();
let ns_u64 = u64::try_from(ns)?;
Ok(ns_u64)
}


@ -46,7 +46,7 @@ pub fn op_jupyter_input(
state: &mut OpState,
#[string] prompt: String,
is_password: bool,
) -> Result<Option<String>, AnyError> {
) -> Option<String> {
let (last_execution_request, stdin_connection_proxy) = {
(
state.borrow::<Arc<Mutex<Option<JupyterMessage>>>>().clone(),
@ -58,11 +58,11 @@ pub fn op_jupyter_input(
if let Some(last_request) = maybe_last_request {
let JupyterMessageContent::ExecuteRequest(msg) = &last_request.content
else {
return Ok(None);
return None;
};
if !msg.allow_stdin {
return Ok(None);
return None;
}
let content = InputRequest {
@ -73,7 +73,7 @@ pub fn op_jupyter_input(
let msg = JupyterMessage::new(content, Some(&last_request));
let Ok(()) = stdin_connection_proxy.lock().tx.send(msg) else {
return Ok(None);
return None;
};
// Need to spawn a separate thread here, because `blocking_recv()` can't
@ -82,17 +82,25 @@ pub fn op_jupyter_input(
stdin_connection_proxy.lock().rx.blocking_recv()
});
let Ok(Some(response)) = join_handle.join() else {
return Ok(None);
return None;
};
let JupyterMessageContent::InputReply(msg) = response.content else {
return Ok(None);
return None;
};
return Ok(Some(msg.value));
return Some(msg.value);
}
Ok(None)
None
}
#[derive(Debug, thiserror::Error)]
pub enum JupyterBroadcastError {
#[error(transparent)]
SerdeJson(serde_json::Error),
#[error(transparent)]
ZeroMq(AnyError),
}
#[op2(async)]
@ -102,7 +110,7 @@ pub async fn op_jupyter_broadcast(
#[serde] content: serde_json::Value,
#[serde] metadata: serde_json::Value,
#[serde] buffers: Vec<deno_core::JsBuffer>,
) -> Result<(), AnyError> {
) -> Result<(), JupyterBroadcastError> {
let (iopub_connection, last_execution_request) = {
let s = state.borrow();
@ -125,36 +133,35 @@ pub async fn op_jupyter_broadcast(
content,
err
);
err
JupyterBroadcastError::SerdeJson(err)
})?;
let jupyter_message = JupyterMessage::new(content, Some(&last_request))
.with_metadata(metadata)
.with_buffers(buffers.into_iter().map(|b| b.to_vec().into()).collect());
iopub_connection.lock().send(jupyter_message).await?;
iopub_connection
.lock()
.send(jupyter_message)
.await
.map_err(JupyterBroadcastError::ZeroMq)?;
}
Ok(())
}
#[op2(fast)]
pub fn op_print(
state: &mut OpState,
#[string] msg: &str,
is_err: bool,
) -> Result<(), AnyError> {
pub fn op_print(state: &mut OpState, #[string] msg: &str, is_err: bool) {
let sender = state.borrow_mut::<mpsc::UnboundedSender<StreamContent>>();
if is_err {
if let Err(err) = sender.send(StreamContent::stderr(msg)) {
log::error!("Failed to send stderr message: {}", err);
}
return Ok(());
return;
}
if let Err(err) = sender.send(StreamContent::stdout(msg)) {
log::error!("Failed to send stdout message: {}", err);
}
Ok(())
}


@ -51,7 +51,7 @@ struct PermissionsHolder(Uuid, PermissionsContainer);
pub fn op_pledge_test_permissions(
state: &mut OpState,
#[serde] args: ChildPermissionsArg,
) -> Result<Uuid, AnyError> {
) -> Result<Uuid, deno_runtime::deno_permissions::ChildPermissionError> {
let token = Uuid::new_v4();
let parent_permissions = state.borrow_mut::<PermissionsContainer>();
let worker_permissions = parent_permissions.create_child_permissions(args)?;
@ -150,7 +150,7 @@ fn op_register_test_step(
#[smi] parent_id: usize,
#[smi] root_id: usize,
#[string] root_name: String,
) -> Result<usize, AnyError> {
) -> usize {
let id = NEXT_ID.fetch_add(1, Ordering::SeqCst);
let origin = state.borrow::<ModuleSpecifier>().to_string();
let description = TestStepDescription {
@ -169,7 +169,7 @@ fn op_register_test_step(
};
let sender = state.borrow_mut::<TestEventSender>();
sender.send(TestEvent::StepRegister(description)).ok();
Ok(id)
id
}
#[op2(fast)]

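Not part of the diff — a small self-contained example of the pattern the op changes above follow: replacing `Result<_, AnyError>` with a narrow `thiserror` enum so the possible failures are explicit. `MyOpError` and `my_op` are invented names.

#[derive(Debug, thiserror::Error)]
enum MyOpError {
  #[error(transparent)]
  SerdeJson(#[from] serde_json::Error),
  #[error("something op-specific went wrong: {0}")]
  Other(String),
}

fn my_op(raw: &str) -> Result<u64, MyOpError> {
  // the #[from] attribute lets `?` convert serde_json::Error automatically
  let value: serde_json::Value = serde_json::from_str(raw)?;
  value
    .as_u64()
    .ok_or_else(|| MyOpError::Other("expected an unsigned integer".into()))
}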

@ -4,6 +4,7 @@ use async_trait::async_trait;
use dashmap::DashMap;
use dashmap::DashSet;
use deno_ast::MediaType;
use deno_ast::ModuleKind;
use deno_config::workspace::MappedResolution;
use deno_config::workspace::MappedResolutionDiagnostic;
use deno_config::workspace::MappedResolutionError;
@ -11,6 +12,7 @@ use deno_config::workspace::WorkspaceResolver;
use deno_core::anyhow::anyhow;
use deno_core::anyhow::Context;
use deno_core::error::AnyError;
use deno_core::url::Url;
use deno_core::ModuleSourceCode;
use deno_core::ModuleSpecifier;
use deno_graph::source::ResolutionMode;
@ -29,6 +31,7 @@ use deno_runtime::deno_fs;
use deno_runtime::deno_fs::FileSystem;
use deno_runtime::deno_node::is_builtin_node_module;
use deno_runtime::deno_node::NodeResolver;
use deno_runtime::deno_node::PackageJsonResolver;
use deno_semver::npm::NpmPackageReqReference;
use deno_semver::package::PackageReq;
use node_resolver::errors::ClosestPkgJsonError;
@ -38,21 +41,22 @@ use node_resolver::errors::PackageFolderResolveErrorKind;
use node_resolver::errors::PackageFolderResolveIoError;
use node_resolver::errors::PackageNotFoundError;
use node_resolver::errors::PackageResolveErrorKind;
use node_resolver::errors::UrlToNodeResolutionError;
use node_resolver::errors::PackageSubpathResolveError;
use node_resolver::InNpmPackageChecker;
use node_resolver::NodeModuleKind;
use node_resolver::NodeResolution;
use node_resolver::NodeResolutionMode;
use node_resolver::PackageJson;
use std::borrow::Cow;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use thiserror::Error;
use crate::args::JsxImportSourceConfig;
use crate::args::DENO_DISABLE_PEDANTIC_NODE_WARNINGS;
use crate::node::CliNodeCodeTranslator;
use crate::npm::CliNpmResolver;
use crate::npm::InnerCliNpmResolverRef;
use crate::util::path::specifier_has_extension;
use crate::util::sync::AtomicFlag;
use crate::util::text_encoding::from_utf8_lossy_owned;
@ -104,36 +108,32 @@ impl deno_resolver::fs::DenoResolverFs for CliDenoResolverFs {
#[derive(Debug)]
pub struct CliNodeResolver {
cjs_resolutions: Arc<CjsResolutionStore>,
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
node_resolver: Arc<NodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
}
impl CliNodeResolver {
pub fn new(
cjs_resolutions: Arc<CjsResolutionStore>,
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
node_resolver: Arc<NodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
) -> Self {
Self {
cjs_resolutions,
cjs_tracker,
fs,
in_npm_pkg_checker,
node_resolver,
npm_resolver,
}
}
pub fn in_npm_package(&self, specifier: &ModuleSpecifier) -> bool {
self.npm_resolver.in_npm_package(specifier)
}
pub fn get_closest_package_json(
&self,
referrer: &ModuleSpecifier,
) -> Result<Option<Arc<PackageJson>>, ClosestPkgJsonError> {
self.node_resolver.get_closest_package_json(referrer)
self.in_npm_pkg_checker.in_npm_package(specifier)
}
pub fn resolve_if_for_npm_pkg(
@ -153,8 +153,7 @@ impl CliNodeResolver {
| NodeResolveErrorKind::UnsupportedEsmUrlScheme(_)
| NodeResolveErrorKind::DataUrlReferrer(_)
| NodeResolveErrorKind::TypesNotFound(_)
| NodeResolveErrorKind::FinalizeResolution(_)
| NodeResolveErrorKind::UrlToNodeResolution(_) => Err(err.into()),
| NodeResolveErrorKind::FinalizeResolution(_) => Err(err.into()),
NodeResolveErrorKind::PackageResolve(err) => {
let err = err.into_kind();
match err {
@ -216,7 +215,11 @@ impl CliNodeResolver {
referrer: &ModuleSpecifier,
mode: NodeResolutionMode,
) -> Result<NodeResolution, NodeResolveError> {
let referrer_kind = if self.cjs_resolutions.is_known_cjs(referrer) {
let referrer_kind = if self
.cjs_tracker
.is_maybe_cjs(referrer, MediaType::from_specifier(referrer))
.map_err(|err| NodeResolveErrorKind::PackageResolve(err.into()))?
{
NodeModuleKind::Cjs
} else {
NodeModuleKind::Esm
@ -226,7 +229,7 @@ impl CliNodeResolver {
self
.node_resolver
.resolve(specifier, referrer, referrer_kind, mode)?;
Ok(self.handle_node_resolution(res))
Ok(res)
}
pub fn resolve_req_reference(
@ -234,7 +237,7 @@ impl CliNodeResolver {
req_ref: &NpmPackageReqReference,
referrer: &ModuleSpecifier,
mode: NodeResolutionMode,
) -> Result<NodeResolution, AnyError> {
) -> Result<ModuleSpecifier, AnyError> {
self.resolve_req_with_sub_path(
req_ref.req(),
req_ref.sub_path(),
@ -249,7 +252,7 @@ impl CliNodeResolver {
sub_path: Option<&str>,
referrer: &ModuleSpecifier,
mode: NodeResolutionMode,
) -> Result<NodeResolution, AnyError> {
) -> Result<ModuleSpecifier, AnyError> {
let package_folder = self
.npm_resolver
.resolve_pkg_folder_from_deno_module_req(req, referrer)?;
@ -260,7 +263,7 @@ impl CliNodeResolver {
mode,
);
match resolution_result {
Ok(resolution) => Ok(resolution),
Ok(url) => Ok(url),
Err(err) => {
if self.npm_resolver.as_byonm().is_some() {
let package_json_path = package_folder.join("package.json");
@ -271,7 +274,7 @@ impl CliNodeResolver {
));
}
}
Err(err)
Err(err.into())
}
}
}
@ -282,16 +285,13 @@ impl CliNodeResolver {
sub_path: Option<&str>,
maybe_referrer: Option<&ModuleSpecifier>,
mode: NodeResolutionMode,
) -> Result<NodeResolution, AnyError> {
let res = self
.node_resolver
.resolve_package_subpath_from_deno_module(
package_folder,
sub_path,
maybe_referrer,
mode,
)?;
Ok(self.handle_node_resolution(res))
) -> Result<ModuleSpecifier, PackageSubpathResolveError> {
self.node_resolver.resolve_package_subpath_from_deno_module(
package_folder,
sub_path,
maybe_referrer,
mode,
)
}
pub fn handle_if_in_node_modules(
@ -306,71 +306,45 @@ impl CliNodeResolver {
// so canonicalize then check if it's in the node_modules directory.
// If so, check if we need to store this specifier as being a CJS
// resolution.
let specifier =
crate::node::resolve_specifier_into_node_modules(specifier);
if self.in_npm_package(&specifier) {
let resolution =
self.node_resolver.url_to_node_resolution(specifier)?;
let resolution = self.handle_node_resolution(resolution);
return Ok(Some(resolution.into_url()));
}
let specifier = crate::node::resolve_specifier_into_node_modules(
specifier,
self.fs.as_ref(),
);
return Ok(Some(specifier));
}
Ok(None)
}
pub fn url_to_node_resolution(
&self,
specifier: ModuleSpecifier,
) -> Result<NodeResolution, UrlToNodeResolutionError> {
self.node_resolver.url_to_node_resolution(specifier)
}
fn handle_node_resolution(
&self,
resolution: NodeResolution,
) -> NodeResolution {
if let NodeResolution::CommonJs(specifier) = &resolution {
// remember that this was a common js resolution
self.mark_cjs_resolution(specifier.clone());
}
resolution
}
pub fn mark_cjs_resolution(&self, specifier: ModuleSpecifier) {
self.cjs_resolutions.insert(specifier);
}
}
// todo(dsherret): move to module_loader.rs
#[derive(Debug, Error)]
#[error("{media_type} files are not supported in npm packages: {specifier}")]
pub struct NotSupportedKindInNpmError {
pub media_type: MediaType,
pub specifier: Url,
}
// todo(dsherret): move to module_loader.rs (it seems to be here due to use in standalone)
#[derive(Clone)]
pub struct NpmModuleLoader {
cjs_resolutions: Arc<CjsResolutionStore>,
node_code_translator: Arc<CliNodeCodeTranslator>,
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
node_resolver: Arc<CliNodeResolver>,
node_code_translator: Arc<CliNodeCodeTranslator>,
}
impl NpmModuleLoader {
pub fn new(
cjs_resolutions: Arc<CjsResolutionStore>,
node_code_translator: Arc<CliNodeCodeTranslator>,
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
node_resolver: Arc<CliNodeResolver>,
node_code_translator: Arc<CliNodeCodeTranslator>,
) -> Self {
Self {
cjs_resolutions,
cjs_tracker,
node_code_translator,
fs,
node_resolver,
}
}
pub fn if_in_npm_package(&self, specifier: &ModuleSpecifier) -> bool {
self.node_resolver.in_npm_package(specifier)
|| self.cjs_resolutions.is_known_cjs(specifier)
}
pub async fn load(
&self,
specifier: &ModuleSpecifier,
@ -413,20 +387,30 @@ impl NpmModuleLoader {
}
})?;
let code = if self.cjs_resolutions.is_known_cjs(specifier) {
let media_type = MediaType::from_specifier(specifier);
if media_type.is_emittable() {
return Err(AnyError::from(NotSupportedKindInNpmError {
media_type,
specifier: specifier.clone(),
}));
}
let code = if self.cjs_tracker.is_maybe_cjs(specifier, media_type)? {
// translate cjs to esm if it's cjs and inject node globals
let code = from_utf8_lossy_owned(code);
ModuleSourceCode::String(
self
.node_code_translator
.translate_cjs_to_esm(specifier, Some(code))
.translate_cjs_to_esm(specifier, Some(Cow::Owned(code)))
.await?
.into_owned()
.into(),
)
} else {
// esm and json code is untouched
ModuleSourceCode::Bytes(code.into_boxed_slice().into())
};
Ok(ModuleCodeStringSource {
code,
found_url: specifier.clone(),
@ -435,21 +419,165 @@ impl NpmModuleLoader {
}
}
pub struct CjsTrackerOptions {
pub unstable_detect_cjs: bool,
}
/// Keeps track of what module specifiers were resolved as CJS.
#[derive(Debug, Default)]
pub struct CjsResolutionStore(DashSet<ModuleSpecifier>);
///
/// Modules that are `.js` or `.ts` are only known to be CJS or
/// ESM after they're loaded based on their contents. So these files
/// will be "maybe CJS" until they're loaded.
#[derive(Debug)]
pub struct CjsTracker {
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
pkg_json_resolver: Arc<PackageJsonResolver>,
unstable_detect_cjs: bool,
known: DashMap<ModuleSpecifier, ModuleKind>,
}
impl CjsResolutionStore {
pub fn is_known_cjs(&self, specifier: &ModuleSpecifier) -> bool {
if specifier.scheme() != "file" {
return false;
impl CjsTracker {
pub fn new(
in_npm_pkg_checker: Arc<dyn InNpmPackageChecker>,
pkg_json_resolver: Arc<PackageJsonResolver>,
options: CjsTrackerOptions,
) -> Self {
Self {
in_npm_pkg_checker,
pkg_json_resolver,
unstable_detect_cjs: options.unstable_detect_cjs,
known: Default::default(),
}
specifier_has_extension(specifier, "cjs") || self.0.contains(specifier)
}
pub fn insert(&self, specifier: ModuleSpecifier) {
self.0.insert(specifier);
/// Checks whether the file might be treated as CJS, though this is not yet
/// certain because the source hasn't been loaded to see whether it contains
/// imports or exports.
pub fn is_maybe_cjs(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
) -> Result<bool, ClosestPkgJsonError> {
self.treat_as_cjs_with_is_script(specifier, media_type, None)
}
/// Gets whether the file is CJS. If true, the file is definitely
/// CJS because `is_script` is provided.
///
/// `is_script` should be `true` when the contents of the file at the
/// provided specifier are known to be a script and not an ES module.
pub fn is_cjs_with_known_is_script(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
is_script: bool,
) -> Result<bool, ClosestPkgJsonError> {
self.treat_as_cjs_with_is_script(specifier, media_type, Some(is_script))
}
fn treat_as_cjs_with_is_script(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
is_script: Option<bool>,
) -> Result<bool, ClosestPkgJsonError> {
let kind = match self
.get_known_kind_with_is_script(specifier, media_type, is_script)
{
Some(kind) => kind,
None => self.check_based_on_pkg_json(specifier)?,
};
Ok(kind.is_cjs())
}
pub fn get_known_kind(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
) -> Option<ModuleKind> {
self.get_known_kind_with_is_script(specifier, media_type, None)
}
fn get_known_kind_with_is_script(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
is_script: Option<bool>,
) -> Option<ModuleKind> {
if specifier.scheme() != "file" {
return Some(ModuleKind::Esm);
}
match media_type {
MediaType::Mts | MediaType::Mjs | MediaType::Dmts => Some(ModuleKind::Esm),
MediaType::Cjs | MediaType::Cts | MediaType::Dcts => Some(ModuleKind::Cjs),
MediaType::Dts => {
// dts files are always determined based on the package.json because
// they contain imports/exports even when considered CJS
if let Some(value) = self.known.get(specifier).map(|v| *v) {
Some(value)
} else {
let value = self.check_based_on_pkg_json(specifier).ok();
if let Some(value) = value {
self.known.insert(specifier.clone(), value);
}
Some(value.unwrap_or(ModuleKind::Esm))
}
}
MediaType::Wasm |
MediaType::Json => Some(ModuleKind::Esm),
MediaType::JavaScript
| MediaType::Jsx
| MediaType::TypeScript
| MediaType::Tsx
// treat these as unknown
| MediaType::Css
| MediaType::SourceMap
| MediaType::Unknown => {
if let Some(value) = self.known.get(specifier).map(|v| *v) {
if value.is_cjs() && is_script == Some(false) {
// we now know this is actually esm
self.known.insert(specifier.clone(), ModuleKind::Esm);
Some(ModuleKind::Esm)
} else {
Some(value)
}
} else if is_script == Some(false) {
// we know this is esm
self.known.insert(specifier.clone(), ModuleKind::Esm);
Some(ModuleKind::Esm)
} else {
None
}
}
}
}
fn check_based_on_pkg_json(
&self,
specifier: &ModuleSpecifier,
) -> Result<ModuleKind, ClosestPkgJsonError> {
if self.in_npm_pkg_checker.in_npm_package(specifier) {
if let Some(pkg_json) =
self.pkg_json_resolver.get_closest_package_json(specifier)?
{
let is_file_location_cjs = pkg_json.typ != "module";
Ok(ModuleKind::from_is_cjs(is_file_location_cjs))
} else {
Ok(ModuleKind::Cjs)
}
} else if self.unstable_detect_cjs {
if let Some(pkg_json) =
self.pkg_json_resolver.get_closest_package_json(specifier)?
{
let is_cjs_type = pkg_json.typ == "commonjs";
Ok(ModuleKind::from_is_cjs(is_cjs_type))
} else {
Ok(ModuleKind::Esm)
}
} else {
Ok(ModuleKind::Esm)
}
}
}
@ -633,8 +761,7 @@ impl Resolver for CliGraphResolver {
Some(referrer),
to_node_mode(mode),
)
.map_err(ResolveError::Other)
.map(|res| res.into_url()),
.map_err(|e| ResolveError::Other(e.into())),
MappedResolution::PackageJson {
dep_result,
alias,
@ -665,19 +792,17 @@ impl Resolver for CliGraphResolver {
)
.map_err(|e| ResolveError::Other(e.into()))
.and_then(|pkg_folder| {
Ok(
self
.node_resolver
.as_ref()
.unwrap()
.resolve_package_sub_path_from_deno_module(
pkg_folder,
sub_path.as_deref(),
Some(referrer),
to_node_mode(mode),
)?
.into_url(),
)
self
.node_resolver
.as_ref()
.unwrap()
.resolve_package_sub_path_from_deno_module(
pkg_folder,
sub_path.as_deref(),
Some(referrer),
to_node_mode(mode),
)
.map_err(|e| ResolveError::Other(e.into()))
}),
})
}
@ -717,23 +842,20 @@ impl Resolver for CliGraphResolver {
npm_req_ref.req(),
)
{
return Ok(
node_resolver
.resolve_package_sub_path_from_deno_module(
pkg_folder,
npm_req_ref.sub_path(),
Some(referrer),
to_node_mode(mode),
)?
.into_url(),
);
return node_resolver
.resolve_package_sub_path_from_deno_module(
pkg_folder,
npm_req_ref.sub_path(),
Some(referrer),
to_node_mode(mode),
)
.map_err(|e| ResolveError::Other(e.into()));
}
// do npm resolution for byonm
if is_byonm {
return node_resolver
.resolve_req_reference(&npm_req_ref, referrer, to_node_mode(mode))
.map(|res| res.into_url())
.map_err(|err| err.into());
}
}
@ -751,9 +873,7 @@ impl Resolver for CliGraphResolver {
.map_err(ResolveError::Other)?;
if let Some(res) = maybe_resolution {
match res {
NodeResolution::Esm(url) | NodeResolution::CommonJs(url) => {
return Ok(url)
}
NodeResolution::Module(url) => return Ok(url),
NodeResolution::BuiltIn(_) => {
// don't resolve bare specifiers for built-in modules via node resolution
}

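Not part of the diff — a fragment sketching the intended two-phase use of `CjsTracker`: "maybe CJS" before the source is available, then a definitive answer once it is. `cjs_tracker` and `specifier` are assumed to be in scope, and `compute_is_script()` is a hypothetical stand-in for however the caller learns whether the parsed file is a script.

let media_type = MediaType::from_specifier(&specifier);

// before loading, .js/.ts files can only be classified as "maybe CJS"
if cjs_tracker.is_maybe_cjs(&specifier, media_type)? {
  // ...load and parse the file, then ask again with certainty...
  let is_script = parsed_source.compute_is_script(); // hypothetical: true if no imports/exports
  let is_cjs =
    cjs_tracker.is_cjs_with_known_is_script(&specifier, media_type, is_script)?;
  if is_cjs {
    // translate CommonJS to ESM before handing the module to V8, as the loaders above do
  }
}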

@ -21,6 +21,7 @@ use std::process::Command;
use std::sync::Arc;
use deno_ast::MediaType;
use deno_ast::ModuleKind;
use deno_ast::ModuleSpecifier;
use deno_config::workspace::PackageJsonDepResolution;
use deno_config::workspace::ResolverWorkspaceJsrPackage;
@ -67,6 +68,7 @@ use crate::file_fetcher::FileFetcher;
use crate::http_util::HttpClientProvider;
use crate::npm::CliNpmResolver;
use crate::npm::InnerCliNpmResolverRef;
use crate::resolver::CjsTracker;
use crate::shared::ReleaseChannel;
use crate::standalone::virtual_fs::VfsEntry;
use crate::util::archive;
@ -257,6 +259,10 @@ impl StandaloneModules {
}
}
pub fn has_file(&self, path: &Path) -> bool {
self.vfs.file_entry(path).is_ok()
}
pub fn read<'a>(
&'a self,
specifier: &'a ModuleSpecifier,
@ -353,6 +359,7 @@ pub fn extract_standalone(
}
pub struct DenoCompileBinaryWriter<'a> {
cjs_tracker: &'a CjsTracker,
deno_dir: &'a DenoDir,
emitter: &'a Emitter,
file_fetcher: &'a FileFetcher,
@ -365,6 +372,7 @@ pub struct DenoCompileBinaryWriter<'a> {
impl<'a> DenoCompileBinaryWriter<'a> {
#[allow(clippy::too_many_arguments)]
pub fn new(
cjs_tracker: &'a CjsTracker,
deno_dir: &'a DenoDir,
emitter: &'a Emitter,
file_fetcher: &'a FileFetcher,
@ -374,6 +382,7 @@ impl<'a> DenoCompileBinaryWriter<'a> {
npm_system_info: NpmSystemInfo,
) -> Self {
Self {
cjs_tracker,
deno_dir,
emitter,
file_fetcher,
@ -599,19 +608,21 @@ impl<'a> DenoCompileBinaryWriter<'a> {
}
let (maybe_source, media_type) = match module {
deno_graph::Module::Js(m) => {
// todo(https://github.com/denoland/deno_media_type/pull/12): use is_emittable()
let is_emittable = matches!(
m.media_type,
MediaType::TypeScript
| MediaType::Mts
| MediaType::Cts
| MediaType::Jsx
| MediaType::Tsx
);
let source = if is_emittable {
let source = if m.media_type.is_emittable() {
let is_cjs = self.cjs_tracker.is_cjs_with_known_is_script(
&m.specifier,
m.media_type,
m.is_script,
)?;
let module_kind = ModuleKind::from_is_cjs(is_cjs);
let source = self
.emitter
.emit_parsed_source(&m.specifier, m.media_type, &m.source)
.emit_parsed_source(
&m.specifier,
m.media_type,
module_kind,
&m.source,
)
.await?;
source.into_bytes()
} else {
@ -745,8 +756,9 @@ impl<'a> DenoCompileBinaryWriter<'a> {
} else {
// DO NOT include the user's registry url as it may contain credentials,
// but also don't make this dependent on the registry url
let global_cache_root_path = npm_resolver.global_cache_root_folder();
let mut builder = VfsBuilder::new(global_cache_root_path)?;
let global_cache_root_path = npm_resolver.global_cache_root_path();
let mut builder =
VfsBuilder::new(global_cache_root_path.to_path_buf())?;
let mut packages =
npm_resolver.all_system_packages(&self.npm_system_info);
packages.sort_by(|a, b| a.id.cmp(&b.id)); // determinism


@ -19,6 +19,7 @@ use deno_core::error::type_error;
use deno_core::error::AnyError;
use deno_core::futures::FutureExt;
use deno_core::v8_set_flags;
use deno_core::FastString;
use deno_core::FeatureChecker;
use deno_core::ModuleLoader;
use deno_core::ModuleSourceCode;
@ -30,7 +31,9 @@ use deno_npm::npm_rc::ResolvedNpmRc;
use deno_package_json::PackageJsonDepValue;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::create_host_defined_options;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_node::NodeResolver;
use deno_runtime::deno_node::PackageJsonResolver;
use deno_runtime::deno_permissions::Permissions;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_tls::rustls::RootCertStore;
@ -43,6 +46,7 @@ use deno_semver::npm::NpmPackageReqReference;
use import_map::parse_from_json;
use node_resolver::analyze::NodeCodeTranslator;
use node_resolver::NodeResolutionMode;
use serialization::DenoCompileModuleSource;
use std::borrow::Cow;
use std::rc::Rc;
use std::sync::Arc;
@ -61,12 +65,18 @@ use crate::cache::NodeAnalysisCache;
use crate::cache::RealDenoCacheEnv;
use crate::http_util::HttpClientProvider;
use crate::node::CliCjsCodeAnalyzer;
use crate::node::CliNodeCodeTranslator;
use crate::npm::create_cli_npm_resolver;
use crate::npm::create_in_npm_pkg_checker;
use crate::npm::CliByonmNpmResolverCreateOptions;
use crate::npm::CliManagedInNpmPkgCheckerCreateOptions;
use crate::npm::CliManagedNpmResolverCreateOptions;
use crate::npm::CliNpmResolver;
use crate::npm::CliNpmResolverCreateOptions;
use crate::npm::CliNpmResolverManagedCreateOptions;
use crate::npm::CliNpmResolverManagedSnapshotOption;
use crate::resolver::CjsResolutionStore;
use crate::npm::CreateInNpmPkgCheckerOptions;
use crate::resolver::CjsTracker;
use crate::resolver::CjsTrackerOptions;
use crate::resolver::CliDenoResolverFs;
use crate::resolver::CliNodeResolver;
use crate::resolver::NpmModuleLoader;
@ -75,7 +85,7 @@ use crate::util::progress_bar::ProgressBarStyle;
use crate::util::v8::construct_v8_flags;
use crate::worker::CliMainWorkerFactory;
use crate::worker::CliMainWorkerOptions;
use crate::worker::ModuleLoaderAndSourceMapGetter;
use crate::worker::CreateModuleLoaderResult;
use crate::worker::ModuleLoaderFactory;
pub mod binary;
@ -91,10 +101,14 @@ use self::binary::Metadata;
use self::file_system::DenoCompileFileSystem;
struct SharedModuleLoaderState {
cjs_tracker: Arc<CjsTracker>,
fs: Arc<dyn deno_fs::FileSystem>,
modules: StandaloneModules,
workspace_resolver: WorkspaceResolver,
node_code_translator: Arc<CliNodeCodeTranslator>,
node_resolver: Arc<CliNodeResolver>,
npm_module_loader: Arc<NpmModuleLoader>,
npm_resolver: Arc<dyn CliNpmResolver>,
workspace_resolver: WorkspaceResolver,
}
#[derive(Clone)]
@ -102,6 +116,12 @@ struct EmbeddedModuleLoader {
shared: Arc<SharedModuleLoaderState>,
}
impl std::fmt::Debug for EmbeddedModuleLoader {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("EmbeddedModuleLoader").finish()
}
}
pub const MODULE_NOT_FOUND: &str = "Module not found";
pub const UNSUPPORTED_SCHEME: &str = "Unsupported scheme";
@ -159,8 +179,7 @@ impl ModuleLoader for EmbeddedModuleLoader {
sub_path.as_deref(),
Some(&referrer),
NodeResolutionMode::Execution,
)?
.into_url(),
)?,
),
Ok(MappedResolution::PackageJson {
dep_result,
@ -168,16 +187,14 @@ impl ModuleLoader for EmbeddedModuleLoader {
alias,
..
}) => match dep_result.as_ref().map_err(|e| AnyError::from(e.clone()))? {
PackageJsonDepValue::Req(req) => self
.shared
.node_resolver
.resolve_req_with_sub_path(
PackageJsonDepValue::Req(req) => {
self.shared.node_resolver.resolve_req_with_sub_path(
req,
sub_path.as_deref(),
&referrer,
NodeResolutionMode::Execution,
)
.map(|res| res.into_url()),
}
PackageJsonDepValue::Workspace(version_req) => {
let pkg_folder = self
.shared
@ -195,8 +212,7 @@ impl ModuleLoader for EmbeddedModuleLoader {
sub_path.as_deref(),
Some(&referrer),
NodeResolutionMode::Execution,
)?
.into_url(),
)?,
)
}
},
@ -205,15 +221,11 @@ impl ModuleLoader for EmbeddedModuleLoader {
if let Ok(reference) =
NpmPackageReqReference::from_specifier(&specifier)
{
return self
.shared
.node_resolver
.resolve_req_reference(
&reference,
&referrer,
NodeResolutionMode::Execution,
)
.map(|res| res.into_url());
return self.shared.node_resolver.resolve_req_reference(
&reference,
&referrer,
NodeResolutionMode::Execution,
);
}
if specifier.scheme() == "jsr" {
@ -317,17 +329,72 @@ impl ModuleLoader for EmbeddedModuleLoader {
match self.shared.modules.read(original_specifier) {
Ok(Some(module)) => {
let media_type = module.media_type;
let (module_specifier, module_type, module_source) =
module.into_for_v8();
deno_core::ModuleLoadResponse::Sync(Ok(
deno_core::ModuleSource::new_with_redirect(
module_type,
module_source,
original_specifier,
module_specifier,
None,
),
))
module.into_parts();
let is_maybe_cjs = match self
.shared
.cjs_tracker
.is_maybe_cjs(original_specifier, media_type)
{
Ok(is_maybe_cjs) => is_maybe_cjs,
Err(err) => {
return deno_core::ModuleLoadResponse::Sync(Err(type_error(
format!("{:?}", err),
)));
}
};
if is_maybe_cjs {
let original_specifier = original_specifier.clone();
let module_specifier = module_specifier.clone();
let shared = self.shared.clone();
deno_core::ModuleLoadResponse::Async(
async move {
let source = match module_source {
DenoCompileModuleSource::String(string) => {
Cow::Borrowed(string)
}
DenoCompileModuleSource::Bytes(module_code_bytes) => {
match module_code_bytes {
Cow::Owned(bytes) => Cow::Owned(
crate::util::text_encoding::from_utf8_lossy_owned(bytes),
),
Cow::Borrowed(bytes) => String::from_utf8_lossy(bytes),
}
}
};
let source = shared
.node_code_translator
.translate_cjs_to_esm(&module_specifier, Some(source))
.await?;
let module_source = match source {
Cow::Owned(source) => ModuleSourceCode::String(source.into()),
Cow::Borrowed(source) => {
ModuleSourceCode::String(FastString::from_static(source))
}
};
Ok(deno_core::ModuleSource::new_with_redirect(
module_type,
module_source,
&original_specifier,
&module_specifier,
None,
))
}
.boxed_local(),
)
} else {
let module_source = module_source.into_for_v8();
deno_core::ModuleLoadResponse::Sync(Ok(
deno_core::ModuleSource::new_with_redirect(
module_type,
module_source,
original_specifier,
module_specifier,
None,
),
))
}
}
Ok(None) => deno_core::ModuleLoadResponse::Sync(Err(type_error(
format!("{MODULE_NOT_FOUND}: {}", original_specifier),
@ -339,32 +406,61 @@ impl ModuleLoader for EmbeddedModuleLoader {
}
}
impl NodeRequireLoader for EmbeddedModuleLoader {
fn ensure_read_permission<'a>(
&self,
permissions: &mut dyn deno_runtime::deno_node::NodePermissions,
path: &'a std::path::Path,
) -> Result<Cow<'a, std::path::Path>, AnyError> {
if self.shared.modules.has_file(path) {
// allow reading if the file is in the snapshot
return Ok(Cow::Borrowed(path));
}
self
.shared
.npm_resolver
.ensure_read_permission(permissions, path)
}
fn load_text_file_lossy(
&self,
path: &std::path::Path,
) -> Result<String, AnyError> {
Ok(self.shared.fs.read_text_file_lossy_sync(path, None)?)
}
}
struct StandaloneModuleLoaderFactory {
shared: Arc<SharedModuleLoaderState>,
}
impl StandaloneModuleLoaderFactory {
pub fn create_result(&self) -> CreateModuleLoaderResult {
let loader = Rc::new(EmbeddedModuleLoader {
shared: self.shared.clone(),
});
CreateModuleLoaderResult {
module_loader: loader.clone(),
node_require_loader: loader,
}
}
}
impl ModuleLoaderFactory for StandaloneModuleLoaderFactory {
fn create_for_main(
&self,
_root_permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter {
ModuleLoaderAndSourceMapGetter {
module_loader: Rc::new(EmbeddedModuleLoader {
shared: self.shared.clone(),
}),
}
) -> CreateModuleLoaderResult {
self.create_result()
}
fn create_for_worker(
&self,
_parent_permissions: PermissionsContainer,
_permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter {
ModuleLoaderAndSourceMapGetter {
module_loader: Rc::new(EmbeddedModuleLoader {
shared: self.shared.clone(),
}),
}
) -> CreateModuleLoaderResult {
self.create_result()
}
}
@ -410,106 +506,155 @@ pub async fn run(data: StandaloneData) -> Result<i32, AnyError> {
let main_module = root_dir_url.join(&metadata.entrypoint_key).unwrap();
let npm_global_cache_dir = root_path.join(".deno_compile_node_modules");
let cache_setting = CacheSetting::Only;
let npm_resolver = match metadata.node_modules {
let pkg_json_resolver = Arc::new(PackageJsonResolver::new(
deno_runtime::deno_node::DenoFsNodeResolverEnv::new(fs.clone()),
));
let (in_npm_pkg_checker, npm_resolver) = match metadata.node_modules {
Some(binary::NodeModules::Managed { node_modules_dir }) => {
// create an npmrc that uses the fake npm_registry_url to resolve packages
let npmrc = Arc::new(ResolvedNpmRc {
default_config: deno_npm::npm_rc::RegistryConfigWithUrl {
registry_url: npm_registry_url.clone(),
config: Default::default(),
},
scopes: Default::default(),
registry_configs: Default::default(),
});
let npm_cache_dir = Arc::new(NpmCacheDir::new(
&DenoCacheEnvFsAdapter(fs.as_ref()),
npm_global_cache_dir,
npmrc.get_all_known_registries_urls(),
));
let snapshot = npm_snapshot.unwrap();
let maybe_node_modules_path = node_modules_dir
.map(|node_modules_dir| root_path.join(node_modules_dir));
create_cli_npm_resolver(CliNpmResolverCreateOptions::Managed(
CliNpmResolverManagedCreateOptions {
snapshot: CliNpmResolverManagedSnapshotOption::Specified(Some(
snapshot,
)),
maybe_lockfile: None,
fs: fs.clone(),
http_client_provider: http_client_provider.clone(),
npm_global_cache_dir,
cache_setting,
text_only_progress_bar: progress_bar,
maybe_node_modules_path,
npm_system_info: Default::default(),
npm_install_deps_provider: Arc::new(
// this is only used for installing packages, which isn't necessary with deno compile
NpmInstallDepsProvider::empty(),
),
// create an npmrc that uses the fake npm_registry_url to resolve packages
npmrc: Arc::new(ResolvedNpmRc {
default_config: deno_npm::npm_rc::RegistryConfigWithUrl {
registry_url: npm_registry_url.clone(),
config: Default::default(),
},
scopes: Default::default(),
registry_configs: Default::default(),
}),
lifecycle_scripts: Default::default(),
},
))
.await?
let in_npm_pkg_checker =
create_in_npm_pkg_checker(CreateInNpmPkgCheckerOptions::Managed(
CliManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: npm_cache_dir.root_dir_url(),
maybe_node_modules_path: maybe_node_modules_path.as_deref(),
},
));
let npm_resolver =
create_cli_npm_resolver(CliNpmResolverCreateOptions::Managed(
CliManagedNpmResolverCreateOptions {
snapshot: CliNpmResolverManagedSnapshotOption::Specified(Some(
snapshot,
)),
maybe_lockfile: None,
fs: fs.clone(),
http_client_provider: http_client_provider.clone(),
npm_cache_dir,
cache_setting,
text_only_progress_bar: progress_bar,
maybe_node_modules_path,
npm_system_info: Default::default(),
npm_install_deps_provider: Arc::new(
// this is only used for installing packages, which isn't necessary with deno compile
NpmInstallDepsProvider::empty(),
),
npmrc,
lifecycle_scripts: Default::default(),
},
))
.await?;
(in_npm_pkg_checker, npm_resolver)
}
Some(binary::NodeModules::Byonm {
root_node_modules_dir,
}) => {
let root_node_modules_dir =
root_node_modules_dir.map(|p| vfs.root().join(p));
create_cli_npm_resolver(CliNpmResolverCreateOptions::Byonm(
CliByonmNpmResolverCreateOptions {
let in_npm_pkg_checker =
create_in_npm_pkg_checker(CreateInNpmPkgCheckerOptions::Byonm);
let npm_resolver = create_cli_npm_resolver(
CliNpmResolverCreateOptions::Byonm(CliByonmNpmResolverCreateOptions {
fs: CliDenoResolverFs(fs.clone()),
pkg_json_resolver: pkg_json_resolver.clone(),
root_node_modules_dir,
},
))
.await?
}),
)
.await?;
(in_npm_pkg_checker, npm_resolver)
}
None => {
create_cli_npm_resolver(CliNpmResolverCreateOptions::Managed(
CliNpmResolverManagedCreateOptions {
snapshot: CliNpmResolverManagedSnapshotOption::Specified(None),
maybe_lockfile: None,
fs: fs.clone(),
http_client_provider: http_client_provider.clone(),
npm_global_cache_dir,
cache_setting,
text_only_progress_bar: progress_bar,
maybe_node_modules_path: None,
npm_system_info: Default::default(),
npm_install_deps_provider: Arc::new(
// this is only used for installing packages, which isn't necessary with deno compile
NpmInstallDepsProvider::empty(),
),
// Packages from different registries are already inlined in the binary,
// so no need to create actual `.npmrc` configuration.
npmrc: create_default_npmrc(),
lifecycle_scripts: Default::default(),
},
))
.await?
// Packages from different registries are already inlined in the binary,
// so no need to create actual `.npmrc` configuration.
let npmrc = create_default_npmrc();
let npm_cache_dir = Arc::new(NpmCacheDir::new(
&DenoCacheEnvFsAdapter(fs.as_ref()),
npm_global_cache_dir,
npmrc.get_all_known_registries_urls(),
));
let in_npm_pkg_checker =
create_in_npm_pkg_checker(CreateInNpmPkgCheckerOptions::Managed(
CliManagedInNpmPkgCheckerCreateOptions {
root_cache_dir_url: npm_cache_dir.root_dir_url(),
maybe_node_modules_path: None,
},
));
let npm_resolver =
create_cli_npm_resolver(CliNpmResolverCreateOptions::Managed(
CliManagedNpmResolverCreateOptions {
snapshot: CliNpmResolverManagedSnapshotOption::Specified(None),
maybe_lockfile: None,
fs: fs.clone(),
http_client_provider: http_client_provider.clone(),
npm_cache_dir,
cache_setting,
text_only_progress_bar: progress_bar,
maybe_node_modules_path: None,
npm_system_info: Default::default(),
npm_install_deps_provider: Arc::new(
// this is only used for installing packages, which isn't necessary with deno compile
NpmInstallDepsProvider::empty(),
),
npmrc: create_default_npmrc(),
lifecycle_scripts: Default::default(),
},
))
.await?;
(in_npm_pkg_checker, npm_resolver)
}
};
let has_node_modules_dir = npm_resolver.root_node_modules_path().is_some();
let node_resolver = Arc::new(NodeResolver::new(
deno_runtime::deno_node::DenoFsNodeResolverEnv::new(fs.clone()),
in_npm_pkg_checker.clone(),
npm_resolver.clone().into_npm_resolver(),
pkg_json_resolver.clone(),
));
let cjs_tracker = Arc::new(CjsTracker::new(
in_npm_pkg_checker.clone(),
pkg_json_resolver.clone(),
CjsTrackerOptions {
unstable_detect_cjs: metadata.unstable_config.detect_cjs,
},
));
let cjs_resolutions = Arc::new(CjsResolutionStore::default());
let cache_db = Caches::new(deno_dir_provider.clone());
let node_analysis_cache = NodeAnalysisCache::new(cache_db.node_analysis_db());
let cli_node_resolver = Arc::new(CliNodeResolver::new(
cjs_resolutions.clone(),
cjs_tracker.clone(),
fs.clone(),
in_npm_pkg_checker.clone(),
node_resolver.clone(),
npm_resolver.clone(),
));
let cjs_esm_code_analyzer = CliCjsCodeAnalyzer::new(
node_analysis_cache,
cjs_tracker.clone(),
fs.clone(),
cli_node_resolver.clone(),
None,
false,
);
let node_code_translator = Arc::new(NodeCodeTranslator::new(
cjs_esm_code_analyzer,
deno_runtime::deno_node::DenoFsNodeResolverEnv::new(fs.clone()),
in_npm_pkg_checker,
node_resolver.clone(),
npm_resolver.clone().into_npm_resolver(),
pkg_json_resolver.clone(),
));
let workspace_resolver = {
let import_map = match metadata.workspace_resolver.import_map {
@ -562,15 +707,18 @@ pub async fn run(data: StandaloneData) -> Result<i32, AnyError> {
};
let module_loader_factory = StandaloneModuleLoaderFactory {
shared: Arc::new(SharedModuleLoaderState {
cjs_tracker: cjs_tracker.clone(),
fs: fs.clone(),
modules,
workspace_resolver,
node_code_translator: node_code_translator.clone(),
node_resolver: cli_node_resolver.clone(),
npm_module_loader: Arc::new(NpmModuleLoader::new(
cjs_resolutions.clone(),
node_code_translator,
cjs_tracker.clone(),
fs.clone(),
cli_node_resolver,
node_code_translator,
)),
npm_resolver: npm_resolver.clone(),
workspace_resolver,
}),
};
@ -609,7 +757,6 @@ pub async fn run(data: StandaloneData) -> Result<i32, AnyError> {
});
let worker_factory = CliMainWorkerFactory::new(
Arc::new(BlobStore::default()),
cjs_resolutions,
// Code cache is not supported for standalone binary yet.
None,
feature_checker,
@ -620,6 +767,7 @@ pub async fn run(data: StandaloneData) -> Result<i32, AnyError> {
Box::new(module_loader_factory),
node_resolver,
npm_resolver,
pkg_json_resolver,
root_cert_store_provider,
permissions,
StorageKeyResolver::empty(),
@ -635,7 +783,6 @@ pub async fn run(data: StandaloneData) -> Result<i32, AnyError> {
inspect_wait: false,
strace_ops: None,
is_inspecting: false,
is_npm_main: main_module.scheme() == "npm",
skip_op_registration: true,
location: metadata.location,
argv0: NpmPackageReqReference::from_specifier(&main_module)
@ -652,7 +799,6 @@ pub async fn run(data: StandaloneData) -> Result<i32, AnyError> {
node_ipc: None,
serve_port: None,
serve_host: None,
unstable_detect_cjs: metadata.unstable_config.detect_cjs,
},
);

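The loader above converts the embedded module bytes to text only on the CommonJS path (the async branch), and it avoids copying whenever the data is already borrowed, valid UTF-8. A standalone sketch of that `Cow`-based conversion, using only the standard library; the enum and helper names are illustrative stand-ins, not the actual deno internals:

```rust
use std::borrow::Cow;

// Illustrative stand-in for the embedded-source representation used above:
// either a string or raw bytes baked into the binary.
enum ModuleSource {
    String(&'static str),
    Bytes(Cow<'static, [u8]>),
}

// Turn the stored data into text, copying only when it must: borrowed,
// valid UTF-8 stays borrowed; owned bytes reuse their allocation.
fn to_text(source: ModuleSource) -> Cow<'static, str> {
    match source {
        ModuleSource::String(s) => Cow::Borrowed(s),
        ModuleSource::Bytes(Cow::Borrowed(bytes)) => String::from_utf8_lossy(bytes),
        ModuleSource::Bytes(Cow::Owned(bytes)) => Cow::Owned(
            // Reuse the Vec when it is already valid UTF-8, in the spirit of
            // the `from_utf8_lossy_owned` helper called in the hunk.
            String::from_utf8(bytes)
                .unwrap_or_else(|err| String::from_utf8_lossy(err.as_bytes()).into_owned()),
        ),
    }
}

fn main() {
    static EMBEDDED: &[u8] = b"export const a = 1;";
    let text = to_text(ModuleSource::Bytes(Cow::Borrowed(EMBEDDED)));
    assert!(matches!(text, Cow::Borrowed(_))); // no copy was made
    println!("{text}");
    println!("{}", to_text(ModuleSource::String("export {};")));
}
```
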

@ -173,6 +173,7 @@ pub struct RemoteModulesStoreBuilder {
impl RemoteModulesStoreBuilder {
pub fn add(&mut self, specifier: &Url, media_type: MediaType, data: Vec<u8>) {
log::debug!("Adding '{}' ({})", specifier, media_type);
let specifier = specifier.to_string();
self.specifiers.push((specifier, self.data_byte_len));
self.data_byte_len += 1 + 8 + data.len() as u64; // media type (1 byte), data length (8 bytes), data
@ -182,6 +183,7 @@ impl RemoteModulesStoreBuilder {
pub fn add_redirects(&mut self, redirects: &BTreeMap<Url, Url>) {
self.redirects.reserve(redirects.len());
for (from, to) in redirects {
log::debug!("Adding redirect '{}' -> '{}'", from, to);
let from = from.to_string();
let to = to.to_string();
self.redirects_len += (4 + from.len() + 4 + to.len()) as u64;
@ -212,14 +214,13 @@ impl RemoteModulesStoreBuilder {
}
}
pub struct DenoCompileModuleData<'a> {
pub specifier: &'a Url,
pub media_type: MediaType,
pub data: Cow<'static, [u8]>,
pub enum DenoCompileModuleSource {
String(&'static str),
Bytes(Cow<'static, [u8]>),
}
impl<'a> DenoCompileModuleData<'a> {
pub fn into_for_v8(self) -> (&'a Url, ModuleType, ModuleSourceCode) {
impl DenoCompileModuleSource {
pub fn into_for_v8(self) -> ModuleSourceCode {
fn into_bytes(data: Cow<'static, [u8]>) -> ModuleSourceCode {
ModuleSourceCode::Bytes(match data {
Cow::Borrowed(d) => d.into(),
@ -227,16 +228,31 @@ impl<'a> DenoCompileModuleData<'a> {
})
}
fn into_string_unsafe(data: Cow<'static, [u8]>) -> ModuleSourceCode {
match self {
// todo(https://github.com/denoland/deno_core/pull/943): store whether
// the string is ascii or not ahead of time so we can avoid the is_ascii()
// check in FastString::from_static
Self::String(s) => ModuleSourceCode::String(FastString::from_static(s)),
Self::Bytes(b) => into_bytes(b),
}
}
}
pub struct DenoCompileModuleData<'a> {
pub specifier: &'a Url,
pub media_type: MediaType,
pub data: Cow<'static, [u8]>,
}
impl<'a> DenoCompileModuleData<'a> {
pub fn into_parts(self) -> (&'a Url, ModuleType, DenoCompileModuleSource) {
fn into_string_unsafe(data: Cow<'static, [u8]>) -> DenoCompileModuleSource {
match data {
Cow::Borrowed(d) => ModuleSourceCode::String(
Cow::Borrowed(d) => DenoCompileModuleSource::String(
// SAFETY: we know this is a valid utf8 string
unsafe { FastString::from_static(std::str::from_utf8_unchecked(d)) },
unsafe { std::str::from_utf8_unchecked(d) },
),
Cow::Owned(d) => ModuleSourceCode::Bytes(d.into_boxed_slice().into()),
Cow::Owned(d) => DenoCompileModuleSource::Bytes(Cow::Owned(d)),
}
}
@ -255,11 +271,14 @@ impl<'a> DenoCompileModuleData<'a> {
(ModuleType::JavaScript, into_string_unsafe(self.data))
}
MediaType::Json => (ModuleType::Json, into_string_unsafe(self.data)),
MediaType::Wasm => (ModuleType::Wasm, into_bytes(self.data)),
// just assume javascript if we made it here
MediaType::TsBuildInfo | MediaType::SourceMap | MediaType::Unknown => {
(ModuleType::JavaScript, into_bytes(self.data))
MediaType::Wasm => {
(ModuleType::Wasm, DenoCompileModuleSource::Bytes(self.data))
}
// just assume javascript if we made it here
MediaType::Css | MediaType::SourceMap | MediaType::Unknown => (
ModuleType::JavaScript,
DenoCompileModuleSource::Bytes(self.data),
),
};
(self.specifier, media_type, source)
}
@ -354,17 +373,17 @@ impl RemoteModulesStore {
pub fn read<'a>(
&'a self,
specifier: &'a Url,
original_specifier: &'a Url,
) -> Result<Option<DenoCompileModuleData<'a>>, AnyError> {
let mut count = 0;
let mut current = specifier;
let mut specifier = original_specifier;
loop {
if count > 10 {
bail!("Too many redirects resolving '{}'", specifier);
bail!("Too many redirects resolving '{}'", original_specifier);
}
match self.specifiers.get(current) {
match self.specifiers.get(specifier) {
Some(RemoteModulesStoreSpecifierValue::Redirect(to)) => {
current = to;
specifier = to;
count += 1;
}
Some(RemoteModulesStoreSpecifierValue::Data(offset)) => {
@ -549,7 +568,7 @@ fn serialize_media_type(media_type: MediaType) -> u8 {
MediaType::Tsx => 10,
MediaType::Json => 11,
MediaType::Wasm => 12,
MediaType::TsBuildInfo => 13,
MediaType::Css => 13,
MediaType::SourceMap => 14,
MediaType::Unknown => 15,
}
@ -570,7 +589,7 @@ fn deserialize_media_type(value: u8) -> Result<MediaType, AnyError> {
10 => Ok(MediaType::Tsx),
11 => Ok(MediaType::Json),
12 => Ok(MediaType::Wasm),
13 => Ok(MediaType::TsBuildInfo),
13 => Ok(MediaType::Css),
14 => Ok(MediaType::SourceMap),
15 => Ok(MediaType::Unknown),
_ => bail!("Unknown media type value: {}", value),

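The updated `read` above walks redirect entries with a hard cap and reports the original specifier when the cap is hit. A self-contained sketch of the same bounded lookup over a plain `HashMap`; the entry type and table layout are simplified stand-ins for the remote-modules store:

```rust
use std::collections::HashMap;

// Stand-in for the specifier table: an entry either redirects to another
// specifier or points at an offset into the module data blob.
enum Entry {
    Redirect(String),
    Data(u64),
}

// Follow redirects with a hard cap, reporting the *original* specifier on
// failure, which is what the updated `read` switched to.
fn resolve(table: &HashMap<String, Entry>, original: &str) -> Result<Option<u64>, String> {
    let mut specifier = original;
    for _ in 0..10 {
        match table.get(specifier) {
            Some(Entry::Redirect(to)) => specifier = to.as_str(),
            Some(Entry::Data(offset)) => return Ok(Some(*offset)),
            None => return Ok(None),
        }
    }
    Err(format!("Too many redirects resolving '{original}'"))
}

fn main() {
    let table = HashMap::from([
        ("https://example.com/a".to_string(), Entry::Redirect("https://example.com/b".to_string())),
        ("https://example.com/b".to_string(), Entry::Data(42)),
    ]);
    assert_eq!(resolve(&table, "https://example.com/a"), Ok(Some(42)));
    assert_eq!(resolve(&table, "https://example.com/missing"), Ok(None));
}
```
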

@ -155,6 +155,12 @@ fn prepare_env_vars(
initial_cwd.to_string_lossy().to_string(),
);
}
if !env_vars.contains_key(crate::npm::NPM_CONFIG_USER_AGENT_ENV_VAR) {
env_vars.insert(
crate::npm::NPM_CONFIG_USER_AGENT_ENV_VAR.into(),
crate::npm::get_npm_config_user_agent(),
);
}
if let Some(node_modules_dir) = node_modules_dir {
prepend_to_path(
&mut env_vars,
@ -204,7 +210,7 @@ impl ShellCommand for NpmCommand {
mut context: ShellCommandContext,
) -> LocalBoxFuture<'static, ExecuteResult> {
if context.args.first().map(|s| s.as_str()) == Some("run")
&& context.args.len() > 2
&& context.args.len() >= 2
// for now, don't run any npm scripts that have a flag because
// we don't handle stuff like `--workspaces` properly
&& !context.args.iter().any(|s| s.starts_with('-'))
@ -267,10 +273,12 @@ impl ShellCommand for NodeCommand {
)
.execute(context);
}
args.extend(["run", "-A"].into_iter().map(|s| s.to_string()));
args.extend(context.args.iter().cloned());
let mut state = context.state;
state.apply_env_var(USE_PKG_JSON_HIDDEN_ENV_VAR_NAME, "1");
ExecutableCommand::new("deno".to_string(), std::env::current_exe().unwrap())
.execute(ShellCommandContext {

View file

@ -193,7 +193,7 @@ async fn bench_specifier_inner(
.await?;
// We execute the main module as a side module so that import.meta.main is not set.
worker.execute_side_module_possibly_with_npm().await?;
worker.execute_side_module().await?;
let mut worker = worker.into_main_worker();


@ -32,6 +32,7 @@ use crate::graph_util::ModuleGraphBuilder;
use crate::npm::CliNpmResolver;
use crate::tsc;
use crate::tsc::Diagnostics;
use crate::tsc::TypeCheckingCjsTracker;
use crate::util::extract;
use crate::util::path::to_percent_decoded_str;
@ -99,6 +100,7 @@ pub struct CheckOptions {
pub struct TypeChecker {
caches: Arc<Caches>,
cjs_tracker: Arc<TypeCheckingCjsTracker>,
cli_options: Arc<CliOptions>,
module_graph_builder: Arc<ModuleGraphBuilder>,
node_resolver: Arc<NodeResolver>,
@ -108,6 +110,7 @@ pub struct TypeChecker {
impl TypeChecker {
pub fn new(
caches: Arc<Caches>,
cjs_tracker: Arc<TypeCheckingCjsTracker>,
cli_options: Arc<CliOptions>,
module_graph_builder: Arc<ModuleGraphBuilder>,
node_resolver: Arc<NodeResolver>,
@ -115,6 +118,7 @@ impl TypeChecker {
) -> Self {
Self {
caches,
cjs_tracker,
cli_options,
module_graph_builder,
node_resolver,
@ -244,6 +248,7 @@ impl TypeChecker {
graph: graph.clone(),
hash_data,
maybe_npm: Some(tsc::RequestNpmState {
cjs_tracker: self.cjs_tracker.clone(),
node_resolver: self.node_resolver.clone(),
npm_resolver: self.npm_resolver.clone(),
}),
@ -346,7 +351,7 @@ fn get_check_hash(
}
}
MediaType::Json
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::SourceMap
| MediaType::Wasm
| MediaType::Unknown => continue,
@ -428,7 +433,7 @@ fn get_tsc_roots(
}
MediaType::Json
| MediaType::Wasm
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::SourceMap
| MediaType::Unknown => None,
},
@ -536,7 +541,7 @@ fn has_ts_check(media_type: MediaType, file_text: &str) -> bool {
| MediaType::Tsx
| MediaType::Json
| MediaType::Wasm
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::SourceMap
| MediaType::Unknown => false,
}


@ -53,16 +53,6 @@ pub async fn compile(
);
}
if cli_options.unstable_detect_cjs() {
log::warn!(
concat!(
"{} --unstable-detect-cjs is not properly supported in deno compile. ",
"The compiled executable may encounter runtime errors.",
),
crate::colors::yellow("Warning"),
);
}
let output_path = resolve_compile_executable_output_path(
http_client,
&compile_flags,


@ -6,12 +6,12 @@ use crate::args::FileFlags;
use crate::args::Flags;
use crate::cdp;
use crate::factory::CliFactory;
use crate::npm::CliNpmResolver;
use crate::tools::fmt::format_json;
use crate::tools::test::is_supported_test_path;
use crate::util::text_encoding::source_map_from_code;
use deno_ast::MediaType;
use deno_ast::ModuleKind;
use deno_ast::ModuleSpecifier;
use deno_config::glob::FileCollector;
use deno_config::glob::FilePatterns;
@ -25,6 +25,7 @@ use deno_core::serde_json;
use deno_core::sourcemap::SourceMap;
use deno_core::url::Url;
use deno_core::LocalInspectorSession;
use node_resolver::InNpmPackageChecker;
use regex::Regex;
use std::fs;
use std::fs::File;
@ -327,6 +328,7 @@ fn generate_coverage_report(
coverage_report.found_lines =
if let Some(source_map) = maybe_source_map.as_ref() {
let script_source_lines = script_source.lines().collect::<Vec<_>>();
let mut found_lines = line_counts
.iter()
.enumerate()
@ -334,7 +336,23 @@ fn generate_coverage_report(
// get all the mappings from this destination line to a different src line
let mut results = source_map
.tokens()
.filter(move |token| token.get_dst_line() as usize == index)
.filter(|token| {
let dst_line = token.get_dst_line() as usize;
dst_line == index && {
let dst_col = token.get_dst_col() as usize;
let content = script_source_lines
.get(dst_line)
.and_then(|line| {
line.get(dst_col..std::cmp::min(dst_col + 2, line.len()))
})
.unwrap_or("");
!content.is_empty()
&& content != "/*"
&& content != "*/"
&& content != "//"
}
})
.map(move |token| (token.get_src_line() as usize, *count))
.collect::<Vec<_>>();
// only keep the results that point at different src lines
@ -444,7 +462,7 @@ fn filter_coverages(
coverages: Vec<cdp::ScriptCoverage>,
include: Vec<String>,
exclude: Vec<String>,
npm_resolver: &dyn CliNpmResolver,
in_npm_pkg_checker: &dyn InNpmPackageChecker,
) -> Vec<cdp::ScriptCoverage> {
let include: Vec<Regex> =
include.iter().map(|e| Regex::new(e).unwrap()).collect();
@ -468,7 +486,7 @@ fn filter_coverages(
|| doc_test_re.is_match(e.url.as_str())
|| Url::parse(&e.url)
.ok()
.map(|url| npm_resolver.in_npm_package(&url))
.map(|url| in_npm_pkg_checker.in_npm_package(&url))
.unwrap_or(false);
let is_included = include.iter().any(|p| p.is_match(&e.url));
@ -479,7 +497,7 @@ fn filter_coverages(
.collect::<Vec<cdp::ScriptCoverage>>()
}
pub async fn cover_files(
pub fn cover_files(
flags: Arc<Flags>,
coverage_flags: CoverageFlags,
) -> Result<(), AnyError> {
@ -489,9 +507,10 @@ pub async fn cover_files(
let factory = CliFactory::from_flags(flags);
let cli_options = factory.cli_options()?;
let npm_resolver = factory.npm_resolver().await?;
let in_npm_pkg_checker = factory.in_npm_pkg_checker()?;
let file_fetcher = factory.file_fetcher()?;
let emitter = factory.emitter()?;
let cjs_tracker = factory.cjs_tracker()?;
assert!(!coverage_flags.files.include.is_empty());
@ -511,7 +530,7 @@ pub async fn cover_files(
script_coverages,
coverage_flags.include,
coverage_flags.exclude,
npm_resolver.as_ref(),
in_npm_pkg_checker.as_ref(),
);
if script_coverages.is_empty() {
return Err(generic_error("No covered files included in the report"));
@ -568,6 +587,8 @@ pub async fn cover_files(
let transpiled_code = match file.media_type {
MediaType::JavaScript
| MediaType::Unknown
| MediaType::Css
| MediaType::Wasm
| MediaType::Cjs
| MediaType::Mjs
| MediaType::Json => None,
@ -577,7 +598,10 @@ pub async fn cover_files(
| MediaType::Mts
| MediaType::Cts
| MediaType::Tsx => {
Some(match emitter.maybe_cached_emit(&file.specifier, &file.source) {
let module_kind = ModuleKind::from_is_cjs(
cjs_tracker.is_maybe_cjs(&file.specifier, file.media_type)?,
);
Some(match emitter.maybe_cached_emit(&file.specifier, module_kind, &file.source) {
Some(code) => code,
None => {
return Err(anyhow!(
@ -588,7 +612,7 @@ pub async fn cover_files(
}
})
}
MediaType::Wasm | MediaType::TsBuildInfo | MediaType::SourceMap => {
MediaType::SourceMap => {
unreachable!()
}
};

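The new filter above drops source-map tokens whose generated position points at blank space or at a comment delimiter, so those mappings are not counted toward covered lines. A simplified, standard-library-only version of that check, with the sourcemap token reduced to a (line, column) pair:

```rust
// Simplified version of the token filter above: a "token" is a
// (generated_line, generated_column) pair, and we drop tokens whose two
// characters of generated source are empty or a comment marker.
fn keep_token(script_source_lines: &[&str], dst_line: usize, dst_col: usize) -> bool {
    let content = script_source_lines
        .get(dst_line)
        .and_then(|line| line.get(dst_col..std::cmp::min(dst_col + 2, line.len())))
        .unwrap_or("");
    !content.is_empty() && content != "/*" && content != "*/" && content != "//"
}

fn main() {
    let source = "const a = 1;\n/* generated banner */\nconsole.log(a);";
    let lines: Vec<&str> = source.lines().collect();
    assert!(keep_token(&lines, 0, 0));  // real code
    assert!(!keep_token(&lines, 1, 0)); // points at "/*"
    assert!(!keep_token(&lines, 9, 0)); // out-of-range line
}
```
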

@ -22,9 +22,9 @@ use deno_core::serde_json;
use deno_doc as doc;
use deno_doc::html::UrlResolveKind;
use deno_graph::source::NullFileSystem;
use deno_graph::EsParser;
use deno_graph::GraphKind;
use deno_graph::ModuleAnalyzer;
use deno_graph::ModuleParser;
use deno_graph::ModuleSpecifier;
use doc::html::ShortPath;
use doc::DocDiagnostic;
@ -37,7 +37,7 @@ const JSON_SCHEMA_VERSION: u8 = 1;
async fn generate_doc_nodes_for_builtin_types(
doc_flags: DocFlags,
parser: &dyn ModuleParser,
parser: &dyn EsParser,
analyzer: &dyn ModuleAnalyzer,
) -> Result<IndexMap<ModuleSpecifier, Vec<doc::DocNode>>, AnyError> {
let source_file_specifier =
@ -96,7 +96,7 @@ pub async fn doc(
let module_info_cache = factory.module_info_cache()?;
let parsed_source_cache = factory.parsed_source_cache();
let capturing_parser = parsed_source_cache.as_capturing_parser();
let analyzer = module_info_cache.as_module_analyzer(parsed_source_cache);
let analyzer = module_info_cache.as_module_analyzer();
let doc_nodes_by_url = match doc_flags.source_files {
DocSourceFileFlag::Builtin => {


@ -353,6 +353,21 @@ fn format_yaml(
file_text: &str,
fmt_options: &FmtOptionsConfig,
) -> Result<Option<String>, AnyError> {
let ignore_file = file_text
.lines()
.take_while(|line| line.starts_with('#'))
.any(|line| {
line
.strip_prefix('#')
.unwrap()
.trim()
.starts_with("deno-fmt-ignore-file")
});
if ignore_file {
return Ok(None);
}
let formatted_str =
pretty_yaml::format_text(file_text, &get_resolved_yaml_config(fmt_options))
.map_err(AnyError::from)?;
@ -1017,7 +1032,7 @@ fn get_resolved_markup_fmt_config(
max_attrs_per_line: None,
prefer_attrs_single_line: false,
html_normal_self_closing: None,
html_void_self_closing: Some(true),
html_void_self_closing: None,
component_self_closing: None,
svg_self_closing: None,
mathml_self_closing: None,

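The new guard above skips YAML formatting when a `deno-fmt-ignore-file` pragma appears in the leading comment block. A standalone version of exactly that check, plus the two cases it distinguishes:

```rust
// Standalone version of the guard above: only comment lines at the very top
// of the file are considered, and any of them may carry the pragma.
fn has_ignore_file_pragma(file_text: &str) -> bool {
    file_text
        .lines()
        .take_while(|line| line.starts_with('#'))
        .any(|line| {
            line
                .strip_prefix('#')
                .unwrap()
                .trim()
                .starts_with("deno-fmt-ignore-file")
        })
}

fn main() {
    assert!(has_ignore_file_pragma("# deno-fmt-ignore-file\nkey: value\n"));
    // The pragma is ignored once a non-comment line has been seen.
    assert!(!has_ignore_file_pragma("key: value\n# deno-fmt-ignore-file\n"));
}
```
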

@ -530,7 +530,7 @@ impl<'a> GraphDisplayContext<'a> {
fn build_module_info(&mut self, module: &Module, type_dep: bool) -> TreeNode {
enum PackageOrSpecifier {
Package(NpmResolutionPackage),
Package(Box<NpmResolutionPackage>),
Specifier(ModuleSpecifier),
}
@ -538,7 +538,7 @@ impl<'a> GraphDisplayContext<'a> {
let package_or_specifier = match module.npm() {
Some(npm) => match self.npm_info.resolve_package(npm.nv_reference.nv()) {
Some(package) => Package(package.clone()),
Some(package) => Package(Box::new(package.clone())),
None => Specifier(module.specifier().clone()), // should never happen
},
None => Specifier(module.specifier().clone()),
@ -645,10 +645,12 @@ impl<'a> GraphDisplayContext<'a> {
let message = match err {
HttpsChecksumIntegrity(_) => "(checksum integrity error)",
Decode(_) => "(loading decode error)",
Loader(err) => match deno_core::error::get_custom_error_class(err) {
Some("NotCapable") => "(not capable, requires --allow-import)",
_ => "(loading error)",
},
Loader(err) => {
match deno_runtime::errors::get_error_class_name(err) {
Some("NotCapable") => "(not capable, requires --allow-import)",
_ => "(loading error)",
}
}
Jsr(_) => "(loading error)",
NodeUnknownBuiltinModule(_) => "(unknown node built-in error)",
Npm(_) => "(npm loading error)",

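Wrapping the `Package` payload in a `Box` keeps every value of the enum small instead of as large as `NpmResolutionPackage`; this is the usual fix for clippy's `large_enum_variant` lint, though the motivation is inferred rather than stated in the diff. A self-contained illustration with a stand-in payload:

```rust
use std::mem::size_of;

// A stand-in payload, comparable in spirit to `NpmResolutionPackage`.
#[allow(dead_code)]
struct BigPackage {
    name: String,
    version: String,
    dependencies: [Option<String>; 16],
}

// Every value of this enum is as large as its largest variant.
#[allow(dead_code)]
enum Unboxed {
    Package(BigPackage),
    Specifier(String),
}

// Boxing the big variant shrinks the enum to roughly pointer size plus a tag,
// which matters when most values end up being the small `Specifier` case.
#[allow(dead_code)]
enum Boxed {
    Package(Box<BigPackage>),
    Specifier(String),
}

fn main() {
    println!("unboxed: {} bytes", size_of::<Unboxed>());
    println!("boxed:   {} bytes", size_of::<Boxed>());
    assert!(size_of::<Boxed>() < size_of::<Unboxed>());
}
```
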

@ -24,32 +24,29 @@ pub fn init_project(init_flags: InitFlags) -> Result<(), AnyError> {
create_file(
&dir,
"main.ts",
r#"import { type Route, route, serveDir } from "@std/http";
r#"import { serveDir } from "@std/http";
const routes: Route[] = [
{
pattern: new URLPattern({ pathname: "/" }),
handler: () => new Response("Home page"),
},
{
pattern: new URLPattern({ pathname: "/users/:id" }),
handler: (_req, _info, params) => new Response(params?.pathname.groups.id),
},
{
pattern: new URLPattern({ pathname: "/static/*" }),
handler: (req) => serveDir(req),
},
];
function defaultHandler(_req: Request) {
return new Response("Not found", { status: 404 });
}
const handler = route(routes, defaultHandler);
const userPagePattern = new URLPattern({ pathname: "/users/:id" });
const staticPathPattern = new URLPattern({ pathname: "/static/*" });
export default {
fetch(req) {
return handler(req);
const url = new URL(req.url);
if (url.pathname === "/") {
return new Response("Home page");
}
const userPageMatch = userPagePattern.exec(url);
if (userPageMatch) {
return new Response(userPageMatch.pathname.groups.id);
}
if (staticPathPattern.test(url)) {
return serveDir(req);
}
return new Response("Not found", { status: 404 });
},
} satisfies Deno.ServeDefaultExport;
"#,


@ -1,5 +1,6 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
@ -11,7 +12,9 @@ use deno_core::futures::StreamExt;
use deno_path_util::url_to_file_path;
use deno_semver::jsr::JsrPackageReqReference;
use deno_semver::npm::NpmPackageReqReference;
use deno_semver::package::PackageNv;
use deno_semver::package::PackageReq;
use deno_semver::Version;
use deno_semver::VersionReq;
use jsonc_parser::cst::CstObject;
use jsonc_parser::cst::CstObjectProp;
@ -333,6 +336,14 @@ fn load_configs(
Ok((cli_factory, npm_config, deno_config))
}
fn path_distance(a: &Path, b: &Path) -> usize {
let diff = pathdiff::diff_paths(a, b);
let Some(diff) = diff else {
return usize::MAX;
};
diff.components().count()
}
pub async fn add(
flags: Arc<Flags>,
add_flags: AddFlags,
@ -357,6 +368,21 @@ pub async fn add(
}
}
let start_dir = cli_factory.cli_options()?.start_dir.dir_path();
// only prefer to add npm deps to `package.json` if there isn't a closer deno.json.
// example: if deno.json is in the CWD and package.json is in the parent, we should add
// npm deps to deno.json, since it's closer
let prefer_npm_config = match (npm_config.as_ref(), deno_config.as_ref()) {
(Some(npm), Some(deno)) => {
let npm_distance = path_distance(&npm.path, &start_dir);
let deno_distance = path_distance(&deno.path, &start_dir);
npm_distance <= deno_distance
}
(Some(_), None) => true,
(None, _) => false,
};
let http_client = cli_factory.http_client_provider();
let deps_http_cache = cli_factory.global_http_cache()?;
let mut deps_file_fetcher = FileFetcher::new(
@ -431,15 +457,32 @@ pub async fn add(
match package_and_version {
PackageAndVersion::NotFound {
package: package_name,
found_npm_package,
help,
package_req,
} => {
if found_npm_package {
bail!("{} was not found, but a matching npm package exists. Did you mean `{}`?", crate::colors::red(package_name), crate::colors::yellow(format!("deno {cmd_name} npm:{package_req}")));
} else {
bail!("{} was not found.", crate::colors::red(package_name));
} => match help {
Some(NotFoundHelp::NpmPackage) => {
bail!(
"{} was not found, but a matching npm package exists. Did you mean `{}`?",
crate::colors::red(package_name),
crate::colors::yellow(format!("deno {cmd_name} npm:{package_req}"))
);
}
}
Some(NotFoundHelp::JsrPackage) => {
bail!(
"{} was not found, but a matching jsr package exists. Did you mean `{}`?",
crate::colors::red(package_name),
crate::colors::yellow(format!("deno {cmd_name} jsr:{package_req}"))
)
}
Some(NotFoundHelp::PreReleaseVersion(version)) => {
bail!(
"{} has only pre-release versions available. Try specifying a version: `{}`",
crate::colors::red(&package_name),
crate::colors::yellow(format!("deno {cmd_name} {package_name}@^{version}"))
)
}
None => bail!("{} was not found.", crate::colors::red(package_name)),
},
PackageAndVersion::Selected(selected) => {
selected_packages.push(selected);
}
@ -455,7 +498,7 @@ pub async fn add(
selected_package.selected_version
);
if selected_package.package_name.starts_with("npm:") {
if selected_package.package_name.starts_with("npm:") && prefer_npm_config {
if let Some(npm) = &mut npm_config {
npm.add(selected_package, dev);
} else {
@ -487,76 +530,144 @@ struct SelectedPackage {
selected_version: String,
}
enum NotFoundHelp {
NpmPackage,
JsrPackage,
PreReleaseVersion(Version),
}
enum PackageAndVersion {
NotFound {
package: String,
found_npm_package: bool,
package_req: PackageReq,
help: Option<NotFoundHelp>,
},
Selected(SelectedPackage),
}
fn best_version<'a>(
versions: impl Iterator<Item = &'a Version>,
) -> Option<&'a Version> {
let mut maybe_best_version: Option<&Version> = None;
for version in versions {
let is_best_version = maybe_best_version
.as_ref()
.map(|best_version| (*best_version).cmp(version).is_lt())
.unwrap_or(true);
if is_best_version {
maybe_best_version = Some(version);
}
}
maybe_best_version
}
trait PackageInfoProvider {
const SPECIFIER_PREFIX: &str;
/// The help to return if a package is found by this provider
const HELP: NotFoundHelp;
async fn req_to_nv(&self, req: &PackageReq) -> Option<PackageNv>;
async fn latest_version<'a>(&self, req: &PackageReq) -> Option<Version>;
}
impl PackageInfoProvider for Arc<JsrFetchResolver> {
const HELP: NotFoundHelp = NotFoundHelp::JsrPackage;
const SPECIFIER_PREFIX: &str = "jsr";
async fn req_to_nv(&self, req: &PackageReq) -> Option<PackageNv> {
(**self).req_to_nv(req).await
}
async fn latest_version<'a>(&self, req: &PackageReq) -> Option<Version> {
let info = self.package_info(&req.name).await?;
best_version(
info
.versions
.iter()
.filter(|(_, version_info)| !version_info.yanked)
.map(|(version, _)| version),
)
.cloned()
}
}
impl PackageInfoProvider for Arc<NpmFetchResolver> {
const HELP: NotFoundHelp = NotFoundHelp::NpmPackage;
const SPECIFIER_PREFIX: &str = "npm";
async fn req_to_nv(&self, req: &PackageReq) -> Option<PackageNv> {
(**self).req_to_nv(req).await
}
async fn latest_version<'a>(&self, req: &PackageReq) -> Option<Version> {
let info = self.package_info(&req.name).await?;
best_version(info.versions.keys()).cloned()
}
}
async fn find_package_and_select_version_for_req(
jsr_resolver: Arc<JsrFetchResolver>,
npm_resolver: Arc<NpmFetchResolver>,
add_package_req: AddRmPackageReq,
) -> Result<PackageAndVersion, AnyError> {
match add_package_req.value {
AddRmPackageReqValue::Jsr(req) => {
let jsr_prefixed_name = format!("jsr:{}", &req.name);
let Some(nv) = jsr_resolver.req_to_nv(&req).await else {
if npm_resolver.req_to_nv(&req).await.is_some() {
async fn select<T: PackageInfoProvider, S: PackageInfoProvider>(
main_resolver: T,
fallback_resolver: S,
add_package_req: AddRmPackageReq,
) -> Result<PackageAndVersion, AnyError> {
let req = match &add_package_req.value {
AddRmPackageReqValue::Jsr(req) => req,
AddRmPackageReqValue::Npm(req) => req,
};
let prefixed_name = format!("{}:{}", T::SPECIFIER_PREFIX, req.name);
let help_if_found_in_fallback = S::HELP;
let Some(nv) = main_resolver.req_to_nv(req).await else {
if fallback_resolver.req_to_nv(req).await.is_some() {
// it's in the other registry
return Ok(PackageAndVersion::NotFound {
package: prefixed_name,
help: Some(help_if_found_in_fallback),
package_req: req.clone(),
});
}
if req.version_req.version_text() == "*" {
if let Some(pre_release_version) =
main_resolver.latest_version(req).await
{
return Ok(PackageAndVersion::NotFound {
package: jsr_prefixed_name,
found_npm_package: true,
package_req: req,
package: prefixed_name,
package_req: req.clone(),
help: Some(NotFoundHelp::PreReleaseVersion(
pre_release_version.clone(),
)),
});
}
}
return Ok(PackageAndVersion::NotFound {
package: jsr_prefixed_name,
found_npm_package: false,
package_req: req,
});
};
let range_symbol = if req.version_req.version_text().starts_with('~') {
"~"
} else if req.version_req.version_text() == nv.version.to_string() {
""
} else {
"^"
};
Ok(PackageAndVersion::Selected(SelectedPackage {
import_name: add_package_req.alias,
package_name: jsr_prefixed_name,
version_req: format!("{}{}", range_symbol, &nv.version),
selected_version: nv.version.to_string(),
}))
return Ok(PackageAndVersion::NotFound {
package: prefixed_name,
help: None,
package_req: req.clone(),
});
};
let range_symbol = if req.version_req.version_text().starts_with('~') {
"~"
} else if req.version_req.version_text() == nv.version.to_string() {
""
} else {
"^"
};
Ok(PackageAndVersion::Selected(SelectedPackage {
import_name: add_package_req.alias,
package_name: prefixed_name,
version_req: format!("{}{}", range_symbol, &nv.version),
selected_version: nv.version.to_string(),
}))
}
match &add_package_req.value {
AddRmPackageReqValue::Jsr(_) => {
select(jsr_resolver, npm_resolver, add_package_req).await
}
AddRmPackageReqValue::Npm(req) => {
let npm_prefixed_name = format!("npm:{}", &req.name);
let Some(nv) = npm_resolver.req_to_nv(&req).await else {
return Ok(PackageAndVersion::NotFound {
package: npm_prefixed_name,
found_npm_package: false,
package_req: req,
});
};
let range_symbol = if req.version_req.version_text().starts_with('~') {
"~"
} else if req.version_req.version_text() == nv.version.to_string() {
""
} else {
"^"
};
Ok(PackageAndVersion::Selected(SelectedPackage {
import_name: add_package_req.alias,
package_name: npm_prefixed_name,
version_req: format!("{}{}", range_symbol, &nv.version),
selected_version: nv.version.to_string(),
}))
AddRmPackageReqValue::Npm(_) => {
select(npm_resolver, jsr_resolver, add_package_req).await
}
}
}

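`best_version` above scans the published versions and keeps the greatest one (for JSR the candidates are first filtered to exclude yanked versions). A minimal sketch of the same selection loop with a stand-in `Version` type ordered like a semver triple:

```rust
// A minimal stand-in for `deno_semver::Version`: ordered by (major, minor, patch).
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Version(u64, u64, u64);

// Same shape as `best_version` above: scan the candidates and keep the
// greatest one, returning `None` when there are no candidates at all.
fn best_version<'a>(versions: impl Iterator<Item = &'a Version>) -> Option<&'a Version> {
    let mut best: Option<&Version> = None;
    for version in versions {
        if best.map(|b| b < version).unwrap_or(true) {
            best = Some(version);
        }
    }
    best
}

fn main() {
    let published = [Version(0, 1, 0), Version(1, 2, 3), Version(1, 0, 0)];
    assert_eq!(best_version(published.iter()), Some(&Version(1, 2, 3)));

    let none: [Version; 0] = [];
    assert_eq!(best_version(none.iter()), None);
}
```
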

@ -44,7 +44,11 @@ pub async fn cache_top_level_deps(
let mut seen_reqs = std::collections::HashSet::new();
for entry in import_map.imports().entries() {
for entry in import_map.imports().entries().chain(
import_map
.scopes()
.flat_map(|scope| scope.imports.entries()),
) {
let Some(specifier) = entry.value else {
continue;
};

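The change above caches scoped imports as well as top-level ones by chaining the two iterators. A simplified sketch of that traversal where the import map is reduced to nested `BTreeMap`s; the real `import_map` crate exposes `imports()` and `scopes()` accessors instead of plain maps:

```rust
use std::collections::BTreeMap;

// Simplified import-map shape: top-level imports plus per-scope imports.
struct ImportMap {
    imports: BTreeMap<String, String>,
    scopes: BTreeMap<String, BTreeMap<String, String>>,
}

impl ImportMap {
    // Visit every mapping exactly once, top-level and scoped alike,
    // mirroring the `chain(... flat_map(...))` in the hunk above.
    fn all_entries(&self) -> impl Iterator<Item = (&String, &String)> + '_ {
        self.imports
            .iter()
            .chain(self.scopes.values().flat_map(|imports| imports.iter()))
    }
}

fn main() {
    let map = ImportMap {
        imports: BTreeMap::from([("@std/http".to_string(), "jsr:@std/http@^1".to_string())]),
        scopes: BTreeMap::from([(
            "./vendor/".to_string(),
            BTreeMap::from([("lodash".to_string(), "npm:lodash@^4".to_string())]),
        )]),
    };
    for (key, value) in map.all_entries() {
        println!("{key} -> {value}");
    }
}
```
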

@ -120,7 +120,7 @@ fn resolve_content_maybe_unfurling(
| MediaType::Unknown
| MediaType::Json
| MediaType::Wasm
| MediaType::TsBuildInfo => {
| MediaType::Css => {
// not unfurlable data
return Ok(data);
}


@ -25,6 +25,7 @@ use deno_ast::swc::visit::noop_visit_type;
use deno_ast::swc::visit::Visit;
use deno_ast::swc::visit::VisitWith;
use deno_ast::ImportsNotUsedAsValues;
use deno_ast::ModuleKind;
use deno_ast::ModuleSpecifier;
use deno_ast::ParseDiagnosticsError;
use deno_ast::ParsedSource;
@ -641,6 +642,10 @@ impl ReplSession {
jsx_fragment_factory: self.jsx.frag_factory.clone(),
jsx_import_source: self.jsx.import_source.clone(),
var_decl_imports: true,
verbatim_module_syntax: false,
},
&deno_ast::TranspileModuleOptions {
module_kind: Some(ModuleKind::Esm),
},
&deno_ast::EmitOptions {
source_map: deno_ast::SourceMapOption::None,
@ -651,7 +656,6 @@ impl ReplSession {
},
)?
.into_source()
.into_string()?
.text;
let value = self


@ -1,9 +1,11 @@
// Copyright 2018-2024 the Deno authors. All rights reserved. MIT license.
use crate::cdp;
use crate::emit::Emitter;
use crate::util::file_watcher::WatcherCommunicator;
use crate::util::file_watcher::WatcherRestartMode;
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;
use deno_ast::MediaType;
use deno_ast::ModuleKind;
use deno_core::error::generic_error;
use deno_core::error::AnyError;
use deno_core::futures::StreamExt;
@ -12,11 +14,14 @@ use deno_core::serde_json::{self};
use deno_core::url::Url;
use deno_core::LocalInspectorSession;
use deno_terminal::colors;
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;
use tokio::select;
use crate::cdp;
use crate::emit::Emitter;
use crate::resolver::CjsTracker;
use crate::util::file_watcher::WatcherCommunicator;
use crate::util::file_watcher::WatcherRestartMode;
fn explain(status: &cdp::Status) -> &'static str {
match status {
cdp::Status::Ok => "OK",
@ -58,6 +63,7 @@ pub struct HmrRunner {
session: LocalInspectorSession,
watcher_communicator: Arc<WatcherCommunicator>,
script_ids: HashMap<String, String>,
cjs_tracker: Arc<CjsTracker>,
emitter: Arc<Emitter>,
}
@ -139,7 +145,8 @@ impl crate::worker::HmrRunner for HmrRunner {
};
let source_code = self.emitter.load_and_emit_for_hmr(
&module_url
&module_url,
ModuleKind::from_is_cjs(self.cjs_tracker.is_maybe_cjs(&module_url, MediaType::from_specifier(&module_url))?),
).await?;
let mut tries = 1;
@ -172,12 +179,14 @@ impl crate::worker::HmrRunner for HmrRunner {
impl HmrRunner {
pub fn new(
cjs_tracker: Arc<CjsTracker>,
emitter: Arc<Emitter>,
session: LocalInspectorSession,
watcher_communicator: Arc<WatcherCommunicator>,
) -> Self {
Self {
session,
cjs_tracker,
emitter,
watcher_communicator,
script_ids: HashMap::new(),


@ -30,6 +30,16 @@ To grant permissions, set them before the script argument. For example:
}
}
fn set_npm_user_agent() {
static ONCE: std::sync::Once = std::sync::Once::new();
ONCE.call_once(|| {
std::env::set_var(
crate::npm::NPM_CONFIG_USER_AGENT_ENV_VAR,
crate::npm::get_npm_config_user_agent(),
);
});
}
pub async fn run_script(
mode: WorkerExecutionMode,
flags: Arc<Flags>,
@ -58,6 +68,10 @@ pub async fn run_script(
let main_module = cli_options.resolve_main_module()?;
if main_module.scheme() == "npm" {
set_npm_user_agent();
}
maybe_npm_install(&factory).await?;
let worker_factory = factory.create_cli_main_worker_factory().await?;
@ -119,6 +133,10 @@ async fn run_with_watch(
let cli_options = factory.cli_options()?;
let main_module = cli_options.resolve_main_module()?;
if main_module.scheme() == "npm" {
set_npm_user_agent();
}
maybe_npm_install(&factory).await?;
let _ = watcher_communicator.watch_paths(cli_options.watch_paths());

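`set_npm_user_agent` above uses `std::sync::Once` so the environment variable is written at most once per process, and it is only invoked when the entrypoint is an `npm:` specifier. A runnable version of the same pattern; the variable name and value format below are assumptions, since the real value comes from `crate::npm::get_npm_config_user_agent()`:

```rust
// The same call-once pattern as `set_npm_user_agent` above.
fn set_npm_user_agent() {
    static ONCE: std::sync::Once = std::sync::Once::new();
    ONCE.call_once(|| {
        std::env::set_var(
            "npm_config_user_agent", // assumed name; what npm-aware tooling conventionally reads
            format!("deno/0.0.0 {} {}", std::env::consts::OS, std::env::consts::ARCH),
        );
    });
}

fn main() {
    set_npm_user_agent();
    set_npm_user_agent(); // the second call is a no-op
    println!("{:?}", std::env::var("npm_config_user_agent"));
}
```
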

@ -44,12 +44,15 @@ pub async fn serve(
maybe_npm_install(&factory).await?;
let worker_factory = factory.create_cli_main_worker_factory().await?;
let hmr = serve_flags
.watch
.map(|watch_flags| watch_flags.hmr)
.unwrap_or(false);
do_serve(
worker_factory,
main_module.clone(),
serve_flags.worker_count,
false,
hmr,
)
.await
}
@ -109,8 +112,6 @@ async fn do_serve(
}
}
Ok(exit_code)
// main.await?
}
async fn run_worker(
@ -119,7 +120,7 @@ async fn run_worker(
main_module: ModuleSpecifier,
hmr: bool,
) -> Result<i32, AnyError> {
let mut worker = worker_factory
let mut worker: crate::worker::CliMainWorker = worker_factory
.create_main_worker(
deno_runtime::WorkerExecutionMode::Serve {
is_main: false,


@ -631,7 +631,7 @@ async fn configure_main_worker(
"Deno[Deno.internal].core.setLeakTracingEnabled(true);",
)?;
}
let res = worker.execute_side_module_possibly_with_npm().await;
let res = worker.execute_side_module().await;
let mut worker = worker.into_main_worker();
match res {
Ok(()) => Ok(()),


@ -801,13 +801,18 @@ delete Object.prototype.__proto__;
if (logDebug) {
debug(`host.getScriptSnapshot("${specifier}")`);
}
const sourceFile = sourceFileCache.get(specifier);
if (sourceFile) {
if (!assetScopes.has(specifier)) {
assetScopes.set(specifier, lastRequestScope);
if (specifier.startsWith(ASSETS_URL_PREFIX)) {
const sourceFile = this.getSourceFile(
specifier,
ts.ScriptTarget.ESNext,
);
if (sourceFile) {
if (!assetScopes.has(specifier)) {
assetScopes.set(specifier, lastRequestScope);
}
// This case only occurs for assets.
return ts.ScriptSnapshot.fromString(sourceFile.text);
}
// This case only occurs for assets.
return ts.ScriptSnapshot.fromString(sourceFile.text);
}
let sourceText = sourceTextCache.get(specifier);
if (sourceText == undefined) {
@ -845,6 +850,8 @@ delete Object.prototype.__proto__;
jqueryMessage,
"Cannot_find_name_0_Do_you_need_to_install_type_definitions_for_jQuery_Try_npm_i_save_dev_types_Slash_2592":
jqueryMessage,
"Module_0_was_resolved_to_1_but_allowArbitraryExtensions_is_not_set_6263":
"Module '{0}' was resolved to '{1}', but importing these modules is not supported.",
};
})());
@ -1337,18 +1344,12 @@ delete Object.prototype.__proto__;
"console",
"Console",
"ErrorConstructor",
"exports",
"gc",
"Global",
"ImportMeta",
"localStorage",
"module",
"NodeModule",
"NodeRequire",
"process",
"queueMicrotask",
"RequestInit",
"require",
"ResponseInit",
"sessionStorage",
"setImmediate",


@ -323,7 +323,7 @@ impl Diagnostics {
// todo(dsherret): use a short lived cache to prevent parsing
// source maps so often
if let Ok(source_map) =
SourceMap::from_slice(&fast_check_module.source_map)
SourceMap::from_slice(fast_check_module.source_map.as_bytes())
{
if let Some(start) = d.start.as_mut() {
let maybe_token = source_map


@ -556,14 +556,23 @@ declare namespace Deno {
*/
env?: "inherit" | boolean | string[];
/** Specifies if the `sys` permission should be requested or revoked.
* If set to `"inherit"`, the current `sys` permission will be inherited.
* If set to `true`, the global `sys` permission will be requested.
* If set to `false`, the global `sys` permission will be revoked.
/** Specifies if the `ffi` permission should be requested or revoked.
* If set to `"inherit"`, the current `ffi` permission will be inherited.
* If set to `true`, the global `ffi` permission will be requested.
* If set to `false`, the global `ffi` permission will be revoked.
*
* @default {false}
*/
sys?: "inherit" | boolean | string[];
ffi?: "inherit" | boolean | Array<string | URL>;
/** Specifies if the `import` permission should be requested or revoked.
* If set to `"inherit"` the current `import` permission will be inherited.
* If set to `true`, the global `import` permission will be requested.
* If set to `false`, the global `import` permission will be revoked.
* If set to `Array<string>`, the `import` permissions will be requested with the
* specified domains.
*/
import?: "inherit" | boolean | Array<string>;
/** Specifies if the `net` permission should be requested or revoked.
* if set to `"inherit"`, the current `net` permission will be inherited.
@ -638,15 +647,6 @@ declare namespace Deno {
*/
net?: "inherit" | boolean | string[];
/** Specifies if the `ffi` permission should be requested or revoked.
* If set to `"inherit"`, the current `ffi` permission will be inherited.
* If set to `true`, the global `ffi` permission will be requested.
* If set to `false`, the global `ffi` permission will be revoked.
*
* @default {false}
*/
ffi?: "inherit" | boolean | Array<string | URL>;
/** Specifies if the `read` permission should be requested or revoked.
* If set to `"inherit"`, the current `read` permission will be inherited.
* If set to `true`, the global `read` permission will be requested.
@ -667,6 +667,15 @@ declare namespace Deno {
*/
run?: "inherit" | boolean | Array<string | URL>;
/** Specifies if the `sys` permission should be requested or revoked.
* If set to `"inherit"`, the current `sys` permission will be inherited.
* If set to `true`, the global `sys` permission will be requested.
* If set to `false`, the global `sys` permission will be revoked.
*
* @default {false}
*/
sys?: "inherit" | boolean | string[];
/** Specifies if the `write` permission should be requested or revoked.
* If set to `"inherit"`, the current `write` permission will be inherited.
* If set to `true`, the global `write` permission will be requested.


@ -3,9 +3,11 @@
use crate::args::TsConfig;
use crate::args::TypeCheckMode;
use crate::cache::FastInsecureHasher;
use crate::cache::ModuleInfoCache;
use crate::node;
use crate::npm::CliNpmResolver;
use crate::npm::ResolvePkgFolderFromDenoReqError;
use crate::resolver::CjsTracker;
use crate::util::checksum;
use crate::util::path::mapped_specifier_for_tsc;
@ -32,13 +34,13 @@ use deno_graph::GraphKind;
use deno_graph::Module;
use deno_graph::ModuleGraph;
use deno_graph::ResolutionResolved;
use deno_runtime::deno_fs;
use deno_runtime::deno_node::NodeResolver;
use deno_semver::npm::NpmPackageReqReference;
use node_resolver::errors::NodeJsErrorCode;
use node_resolver::errors::NodeJsErrorCoded;
use node_resolver::errors::ResolvePkgSubpathFromDenoModuleError;
use node_resolver::errors::PackageSubpathResolveError;
use node_resolver::NodeModuleKind;
use node_resolver::NodeResolution;
use node_resolver::NodeResolutionMode;
use once_cell::sync::Lazy;
use std::borrow::Cow;
@ -302,8 +304,76 @@ pub struct EmittedFile {
pub media_type: MediaType,
}
pub fn into_specifier_and_media_type(
specifier: Option<ModuleSpecifier>,
) -> (ModuleSpecifier, MediaType) {
match specifier {
Some(specifier) => {
let media_type = MediaType::from_specifier(&specifier);
(specifier, media_type)
}
None => (
Url::parse("internal:///missing_dependency.d.ts").unwrap(),
MediaType::Dts,
),
}
}
#[derive(Debug)]
pub struct TypeCheckingCjsTracker {
cjs_tracker: Arc<CjsTracker>,
module_info_cache: Arc<ModuleInfoCache>,
}
impl TypeCheckingCjsTracker {
pub fn new(
cjs_tracker: Arc<CjsTracker>,
module_info_cache: Arc<ModuleInfoCache>,
) -> Self {
Self {
cjs_tracker,
module_info_cache,
}
}
pub fn is_cjs(
&self,
specifier: &ModuleSpecifier,
media_type: MediaType,
code: &Arc<str>,
) -> bool {
if let Some(module_kind) =
self.cjs_tracker.get_known_kind(specifier, media_type)
{
module_kind.is_cjs()
} else {
let maybe_is_script = self
.module_info_cache
.as_module_analyzer()
.analyze_sync(specifier, media_type, code)
.ok()
.map(|info| info.is_script);
maybe_is_script
.and_then(|is_script| {
self
.cjs_tracker
.is_cjs_with_known_is_script(specifier, media_type, is_script)
.ok()
})
.unwrap_or_else(|| {
self
.cjs_tracker
.is_maybe_cjs(specifier, media_type)
.unwrap_or(false)
})
}
}
}
#[derive(Debug)]
pub struct RequestNpmState {
pub cjs_tracker: Arc<TypeCheckingCjsTracker>,
pub node_resolver: Arc<NodeResolver>,
pub npm_resolver: Arc<dyn CliNpmResolver>,
}
@ -456,7 +526,7 @@ pub fn as_ts_script_kind(media_type: MediaType) -> i32 {
MediaType::Tsx => 4,
MediaType::Json => 6,
MediaType::SourceMap
| MediaType::TsBuildInfo
| MediaType::Css
| MediaType::Wasm
| MediaType::Unknown => 0,
}
@ -489,25 +559,22 @@ fn op_load_inner(
) -> Result<Option<LoadResponse>, AnyError> {
fn load_from_node_modules(
specifier: &ModuleSpecifier,
node_resolver: Option<&NodeResolver>,
npm_state: Option<&RequestNpmState>,
media_type: &mut MediaType,
is_cjs: &mut bool,
) -> Result<String, AnyError> {
*media_type = MediaType::from_specifier(specifier);
*is_cjs = node_resolver
.map(|node_resolver| {
match node_resolver.url_to_node_resolution(specifier.clone()) {
Ok(NodeResolution::CommonJs(_)) => true,
Ok(NodeResolution::Esm(_))
| Ok(NodeResolution::BuiltIn(_))
| Err(_) => false,
}
})
.unwrap_or(false);
let file_path = specifier.to_file_path().unwrap();
let code = std::fs::read_to_string(&file_path)
.with_context(|| format!("Unable to load {}", file_path.display()))?;
Ok(code)
let code: Arc<str> = code.into();
*is_cjs = npm_state
.map(|npm_state| {
npm_state.cjs_tracker.is_cjs(specifier, *media_type, &code)
})
.unwrap_or(false);
// todo(dsherret): how to avoid cloning here?
Ok(code.to_string())
}
let state = state.borrow_mut::<State>();
@ -560,6 +627,9 @@ fn op_load_inner(
match module {
Module::Js(module) => {
media_type = module.media_type;
if matches!(media_type, MediaType::Cjs | MediaType::Cts) {
is_cjs = true;
}
let source = module
.fast_check_module()
.map(|m| &*m.source)
@ -573,11 +643,13 @@ fn op_load_inner(
Module::Npm(_) | Module::Node(_) => None,
Module::External(module) => {
// means it's Deno code importing an npm module
let specifier =
node::resolve_specifier_into_node_modules(&module.specifier);
let specifier = node::resolve_specifier_into_node_modules(
&module.specifier,
&deno_fs::RealFs,
);
Some(Cow::Owned(load_from_node_modules(
&specifier,
state.maybe_npm.as_ref().map(|n| n.node_resolver.as_ref()),
state.maybe_npm.as_ref(),
&mut media_type,
&mut is_cjs,
)?))
@ -590,7 +662,7 @@ fn op_load_inner(
{
Some(Cow::Owned(load_from_node_modules(
specifier,
Some(npm.node_resolver.as_ref()),
Some(npm),
&mut media_type,
&mut is_cjs,
)?))
@ -739,7 +811,13 @@ fn op_resolve_inner(
}
}
};
(specifier_str, media_type.as_ts_extension())
(
specifier_str,
match media_type {
MediaType::Css => ".js", // surface these as .js for typescript
media_type => media_type.as_ts_extension(),
},
)
}
None => (
MISSING_DEPENDENCY_SPECIFIER.to_string(),
@ -810,29 +888,27 @@ fn resolve_graph_specifier_types(
Some(referrer),
NodeResolutionMode::Types,
);
let maybe_resolution = match res_result {
Ok(res) => Some(res),
let maybe_url = match res_result {
Ok(url) => Some(url),
Err(err) => match err.code() {
NodeJsErrorCode::ERR_TYPES_NOT_FOUND
| NodeJsErrorCode::ERR_MODULE_NOT_FOUND => None,
_ => return Err(err.into()),
},
};
Ok(Some(NodeResolution::into_specifier_and_media_type(
maybe_resolution,
)))
Ok(Some(into_specifier_and_media_type(maybe_url)))
} else {
Ok(None)
}
}
Some(Module::External(module)) => {
// we currently only use "External" for when the module is in an npm package
Ok(state.maybe_npm.as_ref().map(|npm| {
let specifier =
node::resolve_specifier_into_node_modules(&module.specifier);
NodeResolution::into_specifier_and_media_type(
npm.node_resolver.url_to_node_resolution(specifier).ok(),
)
Ok(state.maybe_npm.as_ref().map(|_| {
let specifier = node::resolve_specifier_into_node_modules(
&module.specifier,
&deno_fs::RealFs,
);
into_specifier_and_media_type(Some(specifier))
}))
}
Some(Module::Node(_)) | None => Ok(None),
@ -844,7 +920,7 @@ enum ResolveNonGraphSpecifierTypesError {
#[error(transparent)]
ResolvePkgFolderFromDenoReq(#[from] ResolvePkgFolderFromDenoReqError),
#[error(transparent)]
ResolvePkgSubpathFromDenoModule(#[from] ResolvePkgSubpathFromDenoModuleError),
PackageSubpathResolve(#[from] PackageSubpathResolveError),
}
fn resolve_non_graph_specifier_types(
@ -863,7 +939,7 @@ fn resolve_non_graph_specifier_types(
let node_resolver = &npm.node_resolver;
if node_resolver.in_npm_package(referrer) {
// we're in an npm package, so use node resolution
Ok(Some(NodeResolution::into_specifier_and_media_type(
Ok(Some(into_specifier_and_media_type(
node_resolver
.resolve(
raw_specifier,
@ -871,7 +947,8 @@ fn resolve_non_graph_specifier_types(
referrer_kind,
NodeResolutionMode::Types,
)
.ok(),
.ok()
.map(|res| res.into_url()),
)))
} else if let Ok(npm_req_ref) =
NpmPackageReqReference::from_str(raw_specifier)
@ -890,17 +967,15 @@ fn resolve_non_graph_specifier_types(
Some(referrer),
NodeResolutionMode::Types,
);
let maybe_resolution = match res_result {
Ok(res) => Some(res),
let maybe_url = match res_result {
Ok(url) => Some(url),
Err(err) => match err.code() {
NodeJsErrorCode::ERR_TYPES_NOT_FOUND
| NodeJsErrorCode::ERR_MODULE_NOT_FOUND => None,
_ => return Err(err.into()),
},
};
Ok(Some(NodeResolution::into_specifier_and_media_type(
maybe_resolution,
)))
Ok(Some(into_specifier_and_media_type(maybe_url)))
} else {
Ok(None)
}

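`TypeCheckingCjsTracker::is_cjs` above layers three answers: a module kind that is already known, a fresh analysis of whether the file is a script, and a permissive fallback when analysis fails. A sketch of that fallback chain with the tracker and analyzer reduced to closures; the closures are stand-ins, not the real APIs:

```rust
// A sketch of the layered lookup in `TypeCheckingCjsTracker::is_cjs`.
fn is_cjs(
    known_kind: impl Fn() -> Option<bool>,
    analyze_is_script: impl Fn() -> Option<bool>,
    is_cjs_with_known_is_script: impl Fn(bool) -> Option<bool>,
    is_maybe_cjs: impl Fn() -> bool,
) -> bool {
    if let Some(is_cjs) = known_kind() {
        // 1. The kind is already known (e.g. from the file extension).
        return is_cjs;
    }
    analyze_is_script()
        // 2. Analysis succeeded: classify using whether the file is a script.
        .and_then(is_cjs_with_known_is_script)
        // 3. Analysis failed: fall back to the permissive heuristic.
        .unwrap_or_else(is_maybe_cjs)
}

fn main() {
    // A `.cts` file: the kind is known up front.
    assert!(is_cjs(|| Some(true), || None, |_| None, || false));
    // Ambiguous `.js` file whose analysis says "script" and which resolves to CJS.
    assert!(is_cjs(|| None, || Some(true), |is_script| Some(is_script), || false));
    // Analysis failed entirely: use the fallback.
    assert!(!is_cjs(|| None, || None, |_| None, || false));
}
```
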

@ -64,7 +64,7 @@ fn extract_inner(
}) {
Ok(parsed) => {
let mut c = ExportCollector::default();
c.visit_program(parsed.program_ref());
c.visit_program(parsed.program().as_ref());
c
}
Err(_) => ExportCollector::default(),
@ -570,14 +570,14 @@ fn generate_pseudo_file(
})?;
let top_level_atoms = swc_utils::collect_decls_with_ctxt::<Atom, _>(
parsed.program_ref(),
&parsed.program_ref(),
parsed.top_level_context(),
);
let transformed =
parsed
.program_ref()
.clone()
.to_owned()
.fold_with(&mut as_folder(Transform {
specifier: &file.specifier,
base_file_specifier,
@ -1416,7 +1416,7 @@ console.log(Foo);
})
.unwrap();
collector.visit_program(parsed.program_ref());
parsed.program_ref().visit_with(&mut collector);
collector
}


@ -565,7 +565,9 @@ pub fn symlink_dir(oldpath: &Path, newpath: &Path) -> Result<(), Error> {
use std::os::windows::fs::symlink_dir;
symlink_dir(oldpath, newpath).map_err(|err| {
if let Some(code) = err.raw_os_error() {
if code as u32 == winapi::shared::winerror::ERROR_PRIVILEGE_NOT_HELD {
if code as u32 == winapi::shared::winerror::ERROR_PRIVILEGE_NOT_HELD
|| code as u32 == winapi::shared::winerror::ERROR_INVALID_FUNCTION
{
return err_mapper(err, Some(ErrorKind::PermissionDenied));
}
}

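The symlink wrapper above now maps a second raw Windows error code to `PermissionDenied`. A standalone sketch of that mapping; the numeric values of the two constants are assumptions here, since the real code takes them from `winapi`:

```rust
use std::io::{Error, ErrorKind};

// Assumed numeric values for the two Windows error codes named above.
const ERROR_INVALID_FUNCTION: i32 = 1;
const ERROR_PRIVILEGE_NOT_HELD: i32 = 1314;

// Re-map the raw OS error the same way the symlink wrapper above does,
// wrapping the original error so its message is preserved as the source.
fn map_symlink_error(err: Error) -> Error {
    if let Some(code) = err.raw_os_error() {
        if code == ERROR_PRIVILEGE_NOT_HELD || code == ERROR_INVALID_FUNCTION {
            return Error::new(ErrorKind::PermissionDenied, err);
        }
    }
    err
}

fn main() {
    // On Windows, 1314 is the "a required privilege is not held" error that
    // symlink creation returns without Developer Mode.
    let raw = Error::from_raw_os_error(ERROR_PRIVILEGE_NOT_HELD);
    let mapped = map_symlink_error(raw);
    assert_eq!(mapped.kind(), ErrorKind::PermissionDenied);
    println!("{mapped}");
}
```
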

@ -65,6 +65,8 @@ pub fn init(maybe_level: Option<log::Level>) {
.filter_module("swc_ecma_parser", log::LevelFilter::Error)
// Suppress span lifecycle logs since they are too verbose
.filter_module("tracing::span", log::LevelFilter::Off)
// for deno_compile, this is too verbose
.filter_module("editpe", log::LevelFilter::Error)
.format(|buf, record| {
let mut target = record.target().to_string();
if let Some(line_no) = record.line() {


@ -42,21 +42,6 @@ pub fn get_extension(file_path: &Path) -> Option<String> {
.map(|e| e.to_lowercase());
}
pub fn specifier_has_extension(
specifier: &ModuleSpecifier,
searching_ext: &str,
) -> bool {
let Some((_, ext)) = specifier.path().rsplit_once('.') else {
return false;
};
let searching_ext = searching_ext.strip_prefix('.').unwrap_or(searching_ext);
debug_assert!(!searching_ext.contains('.')); // exts like .d.ts are not implemented here
if ext.len() != searching_ext.len() {
return false;
}
ext.eq_ignore_ascii_case(searching_ext)
}
pub fn get_atomic_dir_path(file_path: &Path) -> PathBuf {
let rand = gen_rand_path_component();
let new_file_name = format!(
@ -350,18 +335,6 @@ mod test {
}
}
#[test]
fn test_specifier_has_extension() {
fn get(specifier: &str, ext: &str) -> bool {
specifier_has_extension(&ModuleSpecifier::parse(specifier).unwrap(), ext)
}
assert!(get("file:///a/b/c.ts", "ts"));
assert!(get("file:///a/b/c.ts", ".ts"));
assert!(!get("file:///a/b/c.ts", ".cts"));
assert!(get("file:///a/b/c.CtS", ".cts"));
}
#[test]
fn test_to_percent_decoded_str() {
let str = to_percent_decoded_str("%F0%9F%A6%95");

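The removed `specifier_has_extension` helper (and its test) compared a specifier's final extension case-insensitively without allocating. For reference, an equivalent standalone version over a plain path string rather than a `ModuleSpecifier`:

```rust
// Standalone version of the removed helper: compare the final extension
// case-insensitively, without allocating, and ignore a leading dot on the
// queried extension. Multi-part extensions like `.d.ts` are out of scope.
fn path_has_extension(path: &str, searching_ext: &str) -> bool {
    let Some((_, ext)) = path.rsplit_once('.') else {
        return false;
    };
    let searching_ext = searching_ext.strip_prefix('.').unwrap_or(searching_ext);
    ext.len() == searching_ext.len() && ext.eq_ignore_ascii_case(searching_ext)
}

fn main() {
    // Mirrors the deleted unit test.
    assert!(path_has_extension("/a/b/c.ts", "ts"));
    assert!(path_has_extension("/a/b/c.ts", ".ts"));
    assert!(!path_has_extension("/a/b/c.ts", ".cts"));
    assert!(path_has_extension("/a/b/c.CtS", ".cts"));
}
```
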

@ -193,10 +193,16 @@ impl ProgressBarRenderer for TextOnlyProgressBarRenderer {
}
};
// TODO(@marvinhagemeister): We're trying to reconstruct the original
// specifier from the resolved one, but we lack the information about
// private registries URLs and other things here.
let message = display_entry
.message
.replace("https://registry.npmjs.org/", "npm:")
.replace("https://jsr.io/", "jsr:");
.replace("https://jsr.io/", "jsr:")
.replace("%2f", "/")
.replace("%2F", "/");
display_str.push_str(
&colors::gray(format!(" - {}{}\n", message, bytes_text)).to_string(),
);

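The renderer above rewrites registry download URLs back into `npm:`/`jsr:` specifiers with a chain of string replacements and, as the new TODO notes, this is only a best-effort reconstruction that knows nothing about private registries. A runnable version of the same chain; the example URLs are illustrative:

```rust
// Same chain of replacements as the progress-bar renderer above.
fn display_specifier(message: &str) -> String {
    message
        .replace("https://registry.npmjs.org/", "npm:")
        .replace("https://jsr.io/", "jsr:")
        .replace("%2f", "/")
        .replace("%2F", "/")
}

fn main() {
    assert_eq!(
        display_specifier("https://jsr.io/@std%2Fhttp/1.0.0"),
        "jsr:@std/http/1.0.0"
    );
    assert_eq!(
        display_specifier("https://registry.npmjs.org/chalk/-/chalk-5.3.0.tgz"),
        "npm:chalk/-/chalk-5.3.0.tgz"
    );
}
```
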

@ -97,6 +97,7 @@ fn find_source_map_range(code: &[u8]) -> Option<Range<usize>> {
}
/// Converts an `Arc<str>` to an `Arc<[u8]>`.
#[allow(dead_code)]
pub fn arc_str_to_bytes(arc_str: Arc<str>) -> Arc<[u8]> {
let raw = Arc::into_raw(arc_str);
// SAFETY: This is safe because they have the same memory layout.


@ -14,16 +14,17 @@ use deno_core::v8;
use deno_core::CompiledWasmModuleStore;
use deno_core::Extension;
use deno_core::FeatureChecker;
use deno_core::ModuleId;
use deno_core::ModuleLoader;
use deno_core::PollEventLoopOptions;
use deno_core::SharedArrayBufferStore;
use deno_runtime::code_cache;
use deno_runtime::deno_broadcast_channel::InMemoryBroadcastChannel;
use deno_runtime::deno_fs;
use deno_runtime::deno_node;
use deno_runtime::deno_node::NodeExtInitServices;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_node::NodeRequireLoaderRc;
use deno_runtime::deno_node::NodeResolver;
use deno_runtime::deno_node::PackageJsonResolver;
use deno_runtime::deno_permissions::PermissionsContainer;
use deno_runtime::deno_tls::RootCertStoreProvider;
use deno_runtime::deno_web::BlobStore;
@ -42,7 +43,6 @@ use deno_runtime::WorkerExecutionMode;
use deno_runtime::WorkerLogLevel;
use deno_semver::npm::NpmPackageReqReference;
use deno_terminal::colors;
use node_resolver::NodeResolution;
use node_resolver::NodeResolutionMode;
use tokio::select;
@ -51,28 +51,27 @@ use crate::args::DenoSubcommand;
use crate::args::StorageKeyResolver;
use crate::errors;
use crate::npm::CliNpmResolver;
use crate::resolver::CjsResolutionStore;
use crate::util::checksum;
use crate::util::file_watcher::WatcherCommunicator;
use crate::util::file_watcher::WatcherRestartMode;
use crate::util::path::specifier_has_extension;
use crate::version;
pub struct ModuleLoaderAndSourceMapGetter {
pub struct CreateModuleLoaderResult {
pub module_loader: Rc<dyn ModuleLoader>,
pub node_require_loader: Rc<dyn NodeRequireLoader>,
}
pub trait ModuleLoaderFactory: Send + Sync {
fn create_for_main(
&self,
root_permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter;
) -> CreateModuleLoaderResult;
fn create_for_worker(
&self,
parent_permissions: PermissionsContainer,
permissions: PermissionsContainer,
) -> ModuleLoaderAndSourceMapGetter;
) -> CreateModuleLoaderResult;
}
#[async_trait::async_trait(?Send)]
@ -109,7 +108,6 @@ pub struct CliMainWorkerOptions {
pub inspect_wait: bool,
pub strace_ops: Option<Vec<String>>,
pub is_inspecting: bool,
pub is_npm_main: bool,
pub location: Option<Url>,
pub argv0: Option<String>,
pub node_debug: Option<String>,
@ -122,13 +120,11 @@ pub struct CliMainWorkerOptions {
pub node_ipc: Option<i64>,
pub serve_port: Option<u16>,
pub serve_host: Option<String>,
pub unstable_detect_cjs: bool,
}
struct SharedWorkerState {
blob_store: Arc<BlobStore>,
broadcast_channel: InMemoryBroadcastChannel,
cjs_resolution_store: Arc<CjsResolutionStore>,
code_cache: Option<Arc<dyn code_cache::CodeCache>>,
compiled_wasm_module_store: CompiledWasmModuleStore,
feature_checker: Arc<FeatureChecker>,
@ -139,6 +135,7 @@ struct SharedWorkerState {
module_loader_factory: Box<dyn ModuleLoaderFactory>,
node_resolver: Arc<NodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
pkg_json_resolver: Arc<PackageJsonResolver>,
root_cert_store_provider: Arc<dyn RootCertStoreProvider>,
root_permissions: PermissionsContainer,
shared_array_buffer_store: SharedArrayBufferStore,
@ -148,11 +145,15 @@ struct SharedWorkerState {
}
impl SharedWorkerState {
pub fn create_node_init_services(&self) -> NodeExtInitServices {
pub fn create_node_init_services(
&self,
node_require_loader: NodeRequireLoaderRc,
) -> NodeExtInitServices {
NodeExtInitServices {
node_require_resolver: self.npm_resolver.clone().into_require_resolver(),
node_require_loader,
node_resolver: self.node_resolver.clone(),
npm_resolver: self.npm_resolver.clone().into_npm_resolver(),
pkg_json_resolver: self.pkg_json_resolver.clone(),
}
}
@ -163,7 +164,6 @@ impl SharedWorkerState {
pub struct CliMainWorker {
main_module: ModuleSpecifier,
is_main_cjs: bool,
worker: MainWorker,
shared: Arc<SharedWorkerState>,
}
@ -185,17 +185,7 @@ impl CliMainWorker {
log::debug!("main_module {}", self.main_module);
if self.is_main_cjs {
deno_node::load_cjs_module(
&mut self.worker.js_runtime,
&self.main_module.to_file_path().unwrap().to_string_lossy(),
true,
self.shared.options.inspect_brk,
)?;
} else {
self.execute_main_module_possibly_with_npm().await?;
}
self.execute_main_module().await?;
self.worker.dispatch_load_event()?;
loop {
@ -283,22 +273,7 @@ impl CliMainWorker {
/// Execute the given main module emitting load and unload events before and after execution
/// respectively.
pub async fn execute(&mut self) -> Result<(), AnyError> {
if self.inner.is_main_cjs {
deno_node::load_cjs_module(
&mut self.inner.worker.js_runtime,
&self
.inner
.main_module
.to_file_path()
.unwrap()
.to_string_lossy(),
true,
self.inner.shared.options.inspect_brk,
)?;
} else {
self.inner.execute_main_module_possibly_with_npm().await?;
}
self.inner.execute_main_module().await?;
self.inner.worker.dispatch_load_event()?;
self.pending_unload = true;
@ -339,24 +314,13 @@ impl CliMainWorker {
executor.execute().await
}
pub async fn execute_main_module_possibly_with_npm(
&mut self,
) -> Result<(), AnyError> {
pub async fn execute_main_module(&mut self) -> Result<(), AnyError> {
let id = self.worker.preload_main_module(&self.main_module).await?;
self.evaluate_module_possibly_with_npm(id).await
self.worker.evaluate_module(id).await
}
pub async fn execute_side_module_possibly_with_npm(
&mut self,
) -> Result<(), AnyError> {
pub async fn execute_side_module(&mut self) -> Result<(), AnyError> {
let id = self.worker.preload_side_module(&self.main_module).await?;
self.evaluate_module_possibly_with_npm(id).await
}
async fn evaluate_module_possibly_with_npm(
&mut self,
id: ModuleId,
) -> Result<(), AnyError> {
self.worker.evaluate_module(id).await
}
@ -426,7 +390,6 @@ impl CliMainWorkerFactory {
#[allow(clippy::too_many_arguments)]
pub fn new(
blob_store: Arc<BlobStore>,
cjs_resolution_store: Arc<CjsResolutionStore>,
code_cache: Option<Arc<dyn code_cache::CodeCache>>,
feature_checker: Arc<FeatureChecker>,
fs: Arc<dyn deno_fs::FileSystem>,
@ -436,6 +399,7 @@ impl CliMainWorkerFactory {
module_loader_factory: Box<dyn ModuleLoaderFactory>,
node_resolver: Arc<NodeResolver>,
npm_resolver: Arc<dyn CliNpmResolver>,
pkg_json_resolver: Arc<PackageJsonResolver>,
root_cert_store_provider: Arc<dyn RootCertStoreProvider>,
root_permissions: PermissionsContainer,
storage_key_resolver: StorageKeyResolver,
@ -446,7 +410,6 @@ impl CliMainWorkerFactory {
shared: Arc::new(SharedWorkerState {
blob_store,
broadcast_channel: Default::default(),
cjs_resolution_store,
code_cache,
compiled_wasm_module_store: Default::default(),
feature_checker,
@ -457,6 +420,7 @@ impl CliMainWorkerFactory {
module_loader_factory,
node_resolver,
npm_resolver,
pkg_json_resolver,
root_cert_store_provider,
root_permissions,
shared_array_buffer_store: Default::default(),
@ -492,10 +456,13 @@ impl CliMainWorkerFactory {
stdio: deno_runtime::deno_io::Stdio,
) -> Result<CliMainWorker, AnyError> {
let shared = &self.shared;
let ModuleLoaderAndSourceMapGetter { module_loader } = shared
let CreateModuleLoaderResult {
module_loader,
node_require_loader,
} = shared
.module_loader_factory
.create_for_main(permissions.clone());
let (main_module, is_main_cjs) = if let Ok(package_ref) =
let main_module = if let Ok(package_ref) =
NpmPackageReqReference::from_specifier(&main_module)
{
if let Some(npm_resolver) = shared.npm_resolver.as_managed() {
@ -515,9 +482,8 @@ impl CliMainWorkerFactory {
package_ref.req(),
&referrer,
)?;
let node_resolution = self
let main_module = self
.resolve_binary_entrypoint(&package_folder, package_ref.sub_path())?;
let is_main_cjs = matches!(node_resolution, NodeResolution::CommonJs(_));
if let Some(lockfile) = &shared.maybe_lockfile {
// For npm binary commands, ensure that the lockfile gets updated
@ -526,36 +492,9 @@ impl CliMainWorkerFactory {
lockfile.write_if_changed()?;
}
(node_resolution.into_url(), is_main_cjs)
} else if shared.options.is_npm_main
|| shared.node_resolver.in_npm_package(&main_module)
{
let node_resolution =
shared.node_resolver.url_to_node_resolution(main_module)?;
let is_main_cjs = matches!(node_resolution, NodeResolution::CommonJs(_));
(node_resolution.into_url(), is_main_cjs)
main_module
} else {
let is_maybe_cjs_js_ext = self.shared.options.unstable_detect_cjs
&& specifier_has_extension(&main_module, "js")
&& self
.shared
.node_resolver
.get_closest_package_json(&main_module)
.ok()
.flatten()
.map(|pkg_json| pkg_json.typ == "commonjs")
.unwrap_or(false);
let is_cjs = if is_maybe_cjs_js_ext {
// fill the cjs resolution store by preparing the module load
module_loader
.prepare_load(&main_module, None, false)
.await?;
self.shared.cjs_resolution_store.is_known_cjs(&main_module)
} else {
main_module.scheme() == "file"
&& specifier_has_extension(&main_module, "cjs")
};
(main_module, is_cjs)
main_module
};
let maybe_inspector_server = shared.maybe_inspector_server.clone();
@ -597,7 +536,9 @@ impl CliMainWorkerFactory {
root_cert_store_provider: Some(shared.root_cert_store_provider.clone()),
module_loader,
fs: shared.fs.clone(),
node_services: Some(shared.create_node_init_services()),
node_services: Some(
shared.create_node_init_services(node_require_loader),
),
npm_process_state_provider: Some(shared.npm_process_state_provider()),
blob_store: shared.blob_store.clone(),
broadcast_channel: shared.broadcast_channel.clone(),
@ -682,7 +623,6 @@ impl CliMainWorkerFactory {
Ok(CliMainWorker {
main_module,
is_main_cjs,
worker,
shared: shared.clone(),
})
@ -692,19 +632,19 @@ impl CliMainWorkerFactory {
&self,
package_folder: &Path,
sub_path: Option<&str>,
) -> Result<NodeResolution, AnyError> {
) -> Result<ModuleSpecifier, AnyError> {
match self
.shared
.node_resolver
.resolve_binary_export(package_folder, sub_path)
{
Ok(node_resolution) => Ok(node_resolution),
Ok(specifier) => Ok(specifier),
Err(original_err) => {
// if the binary entrypoint was not found, fallback to regular node resolution
let result =
self.resolve_binary_entrypoint_fallback(package_folder, sub_path);
match result {
Ok(Some(resolution)) => Ok(resolution),
Ok(Some(specifier)) => Ok(specifier),
Ok(None) => Err(original_err.into()),
Err(fallback_err) => {
bail!("{:#}\n\nFallback failed: {:#}", original_err, fallback_err)
@ -719,7 +659,7 @@ impl CliMainWorkerFactory {
&self,
package_folder: &Path,
sub_path: Option<&str>,
) -> Result<Option<NodeResolution>, AnyError> {
) -> Result<Option<ModuleSpecifier>, AnyError> {
// only fallback if the user specified a sub path
if sub_path.is_none() {
// it's confusing to users if the package doesn't have any binary
@ -728,7 +668,7 @@ impl CliMainWorkerFactory {
return Ok(None);
}
let resolution = self
let specifier = self
.shared
.node_resolver
.resolve_package_subpath_from_deno_module(
@ -737,19 +677,14 @@ impl CliMainWorkerFactory {
/* referrer */ None,
NodeResolutionMode::Execution,
)?;
match &resolution {
NodeResolution::BuiltIn(_) => Ok(None),
NodeResolution::CommonJs(specifier) | NodeResolution::Esm(specifier) => {
if specifier
.to_file_path()
.map(|p| p.exists())
.unwrap_or(false)
{
Ok(Some(resolution))
} else {
bail!("Cannot find module '{}'", specifier)
}
}
if specifier
.to_file_path()
.map(|p| p.exists())
.unwrap_or(false)
{
Ok(Some(specifier))
} else {
bail!("Cannot find module '{}'", specifier)
}
}
}
@ -761,11 +696,13 @@ fn create_web_worker_callback(
Arc::new(move |args| {
let maybe_inspector_server = shared.maybe_inspector_server.clone();
let ModuleLoaderAndSourceMapGetter { module_loader } =
shared.module_loader_factory.create_for_worker(
args.parent_permissions.clone(),
args.permissions.clone(),
);
let CreateModuleLoaderResult {
module_loader,
node_require_loader,
} = shared.module_loader_factory.create_for_worker(
args.parent_permissions.clone(),
args.permissions.clone(),
);
let create_web_worker_cb =
create_web_worker_callback(shared.clone(), stdio.clone());
@ -795,7 +732,9 @@ fn create_web_worker_callback(
root_cert_store_provider: Some(shared.root_cert_store_provider.clone()),
module_loader,
fs: shared.fs.clone(),
node_services: Some(shared.create_node_init_services()),
node_services: Some(
shared.create_node_init_services(node_require_loader),
),
blob_store: shared.blob_store.clone(),
broadcast_channel: shared.broadcast_channel.clone(),
shared_array_buffer_store: Some(shared.shared_array_buffer_store.clone()),
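Since the +/- markers were stripped from this view, the new module loader factory contract is easier to read reassembled. A best-effort reconstruction from the hunks above, using the imports listed in the diff (a sketch, not the verbatim source):

use std::rc::Rc;

use deno_core::ModuleLoader;
use deno_runtime::deno_node::NodeRequireLoader;
use deno_runtime::deno_permissions::PermissionsContainer;

pub struct CreateModuleLoaderResult {
  pub module_loader: Rc<dyn ModuleLoader>,
  pub node_require_loader: Rc<dyn NodeRequireLoader>,
}

pub trait ModuleLoaderFactory: Send + Sync {
  // Main worker: one loader pair created from the root permissions.
  fn create_for_main(
    &self,
    root_permissions: PermissionsContainer,
  ) -> CreateModuleLoaderResult;

  // Web workers: a loader pair created from parent and child permissions.
  fn create_for_worker(
    &self,
    parent_permissions: PermissionsContainer,
    permissions: PermissionsContainer,
  ) -> CreateModuleLoaderResult;
}

The node_require_loader half of each result is what create_node_init_services now receives, as the worker-construction hunks above show.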


@ -2,7 +2,7 @@
[package]
name = "deno_broadcast_channel"
version = "0.168.0"
version = "0.171.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@ -2,7 +2,7 @@
[package]
name = "deno_cache"
version = "0.106.0"
version = "0.109.0"
authors.workspace = true
edition.workspace = true
license.workspace = true

ext/cache/lib.rs

@ -33,7 +33,9 @@ pub enum CacheError {
}
#[derive(Clone)]
pub struct CreateCache<C: Cache + 'static>(pub Arc<dyn Fn() -> C>);
pub struct CreateCache<C: Cache + 'static>(
pub Arc<dyn Fn() -> Result<C, CacheError>>,
);
deno_core::extension!(deno_cache,
deps = [ deno_webidl, deno_web, deno_url, deno_fetch ],
@ -231,7 +233,7 @@ where
if let Some(cache) = state.try_borrow::<CA>() {
Ok(cache.clone())
} else if let Some(create_cache) = state.try_borrow::<CreateCache<CA>>() {
let cache = create_cache.0();
let cache = create_cache.0()?;
state.put(cache);
Ok(state.borrow::<CA>().clone())
} else {
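With the CreateCache constructor closure now returning Result<C, CacheError>, the hunk above propagates failures with ? instead of relying on the backend to panic. A hedged wiring sketch, assuming the deno_cache types shown in these hunks are in scope (the function name and directory handling are illustrative):

use std::path::PathBuf;
use std::sync::Arc;

// Build the lazy constructor; any SQLite setup error surfaces as a CacheError
// the first time the cache is requested, instead of aborting the process.
fn create_cache_for(dir: PathBuf) -> CreateCache<SqliteBackedCache> {
  CreateCache(Arc::new(move || SqliteBackedCache::new(dir.clone())))
}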

ext/cache/sqlite.rs

@ -42,7 +42,7 @@ pub struct SqliteBackedCache {
}
impl SqliteBackedCache {
pub fn new(cache_storage_dir: PathBuf) -> Self {
pub fn new(cache_storage_dir: PathBuf) -> Result<Self, CacheError> {
{
std::fs::create_dir_all(&cache_storage_dir)
.expect("failed to create cache dir");
@ -57,18 +57,14 @@ impl SqliteBackedCache {
PRAGMA synchronous=NORMAL;
PRAGMA optimize;
";
connection
.execute_batch(initial_pragmas)
.expect("failed to execute pragmas");
connection
.execute(
"CREATE TABLE IF NOT EXISTS cache_storage (
connection.execute_batch(initial_pragmas)?;
connection.execute(
"CREATE TABLE IF NOT EXISTS cache_storage (
id INTEGER PRIMARY KEY,
cache_name TEXT NOT NULL UNIQUE
)",
(),
)
.expect("failed to create cache_storage table");
(),
)?;
connection
.execute(
"CREATE TABLE IF NOT EXISTS request_response_list (
@ -86,12 +82,11 @@ impl SqliteBackedCache {
UNIQUE (cache_id, request_url)
)",
(),
)
.expect("failed to create request_response_list table");
SqliteBackedCache {
)?;
Ok(SqliteBackedCache {
connection: Arc::new(Mutex::new(connection)),
cache_storage_dir,
}
})
}
}
}
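Correspondingly, SqliteBackedCache::new is now fallible, so direct callers handle a Result rather than trusting the removed expect calls. A minimal illustrative caller (the helper name is hypothetical):

use std::path::PathBuf;

// Initialization failures (the pragmas or CREATE TABLE statements above) are
// now values the embedder can surface or recover from.
fn open_cache(dir: PathBuf) -> Result<SqliteBackedCache, CacheError> {
  SqliteBackedCache::new(dir)
}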


@ -2,7 +2,7 @@
[package]
name = "deno_canvas"
version = "0.43.0"
version = "0.46.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@ -2,7 +2,7 @@
[package]
name = "deno_console"
version = "0.174.0"
version = "0.177.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@ -2,7 +2,7 @@
[package]
name = "deno_cron"
version = "0.54.0"
version = "0.57.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@ -2,7 +2,7 @@
[package]
name = "deno_crypto"
version = "0.188.0"
version = "0.191.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@ -269,12 +269,6 @@ class Request {
/** @type {AbortSignal} */
get [_signal]() {
const signal = this[_signalCache];
// This signal has not been created yet, and the request is still in progress
if (signal === undefined) {
const signal = newSignal();
this[_signalCache] = signal;
return signal;
}
// This signal has not been created yet, but the request has already completed
if (signal === false) {
const signal = newSignal();
@ -282,6 +276,18 @@ class Request {
signal[signalAbort](signalAbortError);
return signal;
}
// This signal has not been created yet, and the request is still in progress
if (signal === undefined) {
const signal = newSignal();
this[_signalCache] = signal;
this[_request].onCancel?.(() => {
signal[signalAbort](signalAbortError);
});
return signal;
}
return signal;
}
get [_mimeType]() {


@ -2,7 +2,7 @@
[package]
name = "deno_fetch"
version = "0.198.0"
version = "0.201.0"
authors.workspace = true
edition.workspace = true
license.workspace = true


@ -39,6 +39,7 @@ use deno_core::OpState;
use deno_core::RcRef;
use deno_core::Resource;
use deno_core::ResourceId;
use deno_permissions::PermissionCheckError;
use deno_tls::rustls::RootCertStore;
use deno_tls::Proxy;
use deno_tls::RootCertStoreProvider;
@ -149,7 +150,7 @@ pub enum FetchError {
#[error(transparent)]
Resource(deno_core::error::AnyError),
#[error(transparent)]
Permission(deno_core::error::AnyError),
Permission(#[from] PermissionCheckError),
#[error("NetworkError when attempting to fetch resource")]
NetworkError,
#[error("Fetching files only supports the GET method: received {0}")]
@ -346,13 +347,13 @@ pub trait FetchPermissions {
&mut self,
url: &Url,
api_name: &str,
) -> Result<(), deno_core::error::AnyError>;
) -> Result<(), PermissionCheckError>;
#[must_use = "the resolved return value to mitigate time-of-check to time-of-use issues"]
fn check_read<'a>(
&mut self,
p: &'a Path,
api_name: &str,
) -> Result<Cow<'a, Path>, deno_core::error::AnyError>;
) -> Result<Cow<'a, Path>, PermissionCheckError>;
}
impl FetchPermissions for deno_permissions::PermissionsContainer {
@ -361,7 +362,7 @@ impl FetchPermissions for deno_permissions::PermissionsContainer {
&mut self,
url: &Url,
api_name: &str,
) -> Result<(), deno_core::error::AnyError> {
) -> Result<(), PermissionCheckError> {
deno_permissions::PermissionsContainer::check_net_url(self, url, api_name)
}
@ -370,7 +371,7 @@ impl FetchPermissions for deno_permissions::PermissionsContainer {
&mut self,
path: &'a Path,
api_name: &str,
) -> Result<Cow<'a, Path>, deno_core::error::AnyError> {
) -> Result<Cow<'a, Path>, PermissionCheckError> {
deno_permissions::PermissionsContainer::check_read_path(
self,
path,
@ -414,9 +415,7 @@ where
"file" => {
let path = url.to_file_path().map_err(|_| FetchError::NetworkError)?;
let permissions = state.borrow_mut::<FP>();
let path = permissions
.check_read(&path, "fetch()")
.map_err(FetchError::Permission)?;
let path = permissions.check_read(&path, "fetch()")?;
let url = match path {
Cow::Owned(path) => Url::from_file_path(path).unwrap(),
Cow::Borrowed(_) => url,
@ -442,9 +441,7 @@ where
}
"http" | "https" => {
let permissions = state.borrow_mut::<FP>();
permissions
.check_net_url(&url, "fetch()")
.map_err(FetchError::Resource)?;
permissions.check_net_url(&url, "fetch()")?;
let maybe_authority = extract_authority(&mut url);
let uri = url
@ -863,9 +860,7 @@ where
if let Some(proxy) = args.proxy.clone() {
let permissions = state.borrow_mut::<FP>();
let url = Url::parse(&proxy.url)?;
permissions
.check_net_url(&url, "Deno.createHttpClient()")
.map_err(FetchError::Permission)?;
permissions.check_net_url(&url, "Deno.createHttpClient()")?;
}
let options = state.borrow::<Options>();
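The permissions hook for fetch changed shape as well; reassembled from the hunks above (method names taken from the call sites), the trait now reads roughly as follows. The Url import path is an assumption; the rest mirrors the diff:

use std::borrow::Cow;
use std::path::Path;

use deno_core::url::Url; // assumed re-export; not visible in the hunks
use deno_permissions::PermissionCheckError;

pub trait FetchPermissions {
  fn check_net_url(
    &mut self,
    url: &Url,
    api_name: &str,
  ) -> Result<(), PermissionCheckError>;

  #[must_use = "the resolved return value to mitigate time-of-check to time-of-use issues"]
  fn check_read<'a>(
    &mut self,
    p: &'a Path,
    api_name: &str,
  ) -> Result<Cow<'a, Path>, PermissionCheckError>;
}

With both methods returning PermissionCheckError, the call sites drop their .map_err(FetchError::Permission) and .map_err(FetchError::Resource) adapters and rely on the #[from] conversion on FetchError::Permission.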


@ -2,7 +2,7 @@
[package]
name = "deno_ffi"
version = "0.161.0"
version = "0.164.0"
authors.workspace = true
edition.workspace = true
license.workspace = true

Some files were not shown because too many files have changed in this diff.