Initialize
196
node_modules/cosmiconfig/CHANGELOG.md
generated
vendored
Normal file
@@ -0,0 +1,196 @@
# Changelog

## 6.0.0

- **Breaking change:** The package now has named exports. See examples below.
- **Breaking change:** Separate async and sync APIs, accessible from different named exports. If you used `explorer.searchSync()` or `explorer.loadSync()`, you'll now create a sync explorer with `cosmiconfigSync()`, then use `explorerSync.search()` and `explorerSync.load()`.

  ```js
  // OLD: cosmiconfig v5
  import cosmiconfig from 'cosmiconfig';

  const explorer = cosmiconfig('example');
  const searchAsyncResult = await explorer.search();
  const loadAsyncResult = await explorer.load('./file/to/load');
  const searchSyncResult = explorer.searchSync();
  const loadSyncResult = explorer.loadSync('./file/to/load');

  // NEW: cosmiconfig v6
  import { cosmiconfig, cosmiconfigSync } from 'cosmiconfig';

  const explorer = cosmiconfig('example');
  const searchAsyncResult = await explorer.search();
  const loadAsyncResult = await explorer.load('./file/to/load');

  const explorerSync = cosmiconfigSync('example');
  const searchSyncResult = explorerSync.search();
  const loadSyncResult = explorerSync.load('./file/to/load');
  ```

- **Breaking change:** Remove support for Node 4 and 6. Requires Node 8+.
- **Breaking change:** Use npm package [yaml](https://www.npmjs.com/package/yaml) to parse YAML instead of npm package [js-yaml](https://www.npmjs.com/package/js-yaml).
- **Breaking change:** Remove `cosmiconfig.loaders` and add named export `defaultLoaders` that exports the default loaders used for each extension.

  ```js
  import { defaultLoaders } from 'cosmiconfig';

  console.log(Object.entries(defaultLoaders))
  // [
  //   [ '.js', [Function: loadJs] ],
  //   [ '.json', [Function: loadJson] ],
  //   [ '.yaml', [Function: loadYaml] ],
  //   [ '.yml', [Function: loadYaml] ],
  //   [ 'noExt', [Function: loadYaml] ]
  // ]
  ```

- Migrate from Flowtype to TypeScript.
- Lazy-load all default loaders.

## 5.2.1

- Chore: Upgrade `js-yaml` to avoid an npm audit warning.

## 5.2.0

- Added: `packageProp` values can be arrays of strings, to allow for property names that include periods. (This was possible before, but not documented or deliberately supported.)
- Chore: Replaced the `lodash.get` dependency with a locally defined function.
- Chore: Upgrade `js-yaml` to avoid an npm audit warning.

## 5.1.0

- Added: `packageProp` values can include periods to describe paths to nested objects within `package.json`.

## 5.0.7

- Fixed: The JS loader bypasses Node's `require` cache, fixing a bug where updates to `.js` config files would not load even when cosmiconfig was told not to cache.

## 5.0.6

- Fixed: Better error message if the end user tries an extension cosmiconfig is not configured to understand.

## 5.0.5

- Fixed: `load` and `loadSync` work with paths relative to `process.cwd()`.

## 5.0.4

- Fixed: `rc` files with `.js` extensions included in default `searchPlaces`.

## 5.0.3

- Docs: Minor corrections to documentation. *Released to update package documentation on npm.*

## 5.0.2

- Fixed: Allow `searchSync` and `loadSync` to load JS configuration files whose export is a Promise.

## 5.0.1

The API has been completely revamped to increase clarity and enable a very wide range of new usage. **Please read the readme for all the details.**

While the defaults remain just as useful as before — and you can still pass no options at all — now you can also do all kinds of wild and crazy things.

- The `loaders` option allows you to specify custom functions to derive config objects from files. Your loader functions could parse ES2015 modules or TypeScript, JSON5, even INI or XML. Whatever suits you.
- The `searchPlaces` option allows you to specify exactly where cosmiconfig looks within each directory it searches.
- The combination of `loaders` and `searchPlaces` means that you should be able to load pretty much any kind of configuration file you want, from wherever you want it to look.

Additionally, the overloaded `load()` function has been split up into several clear and focused functions:

- `search()` now searches up the directory tree, and `load()` loads a configuration file that you don't need to search for.
- The `sync` option has been replaced with separate synchronous functions: `searchSync()` and `loadSync()`.
- `clearFileCache()` and `clearDirectoryCache()` have been renamed to `clearLoadCache()` and `clearSearchCache()`, respectively.

More details:

- The default JS loader uses `require`, instead of `require-from-string`. So you *could* use `require` hooks to control the loading of JS files (e.g. pass them through esm or Babel). In most cases it is probably preferable to use a custom loader.
- The options `rc`, `js`, and `rcExtensions` have all been removed. You can accomplish the same and more with `searchPlaces`.
- The default `searchPlaces` include `rc` files with extensions, e.g. `.thingrc.json`, `.thingrc.yaml`, `.thingrc.yml`. This is the equivalent of switching the default value of the old `rcExtensions` option to `true`.
- The option `rcStrictJson` has been removed. To get the same effect, you can specify `noExt: cosmiconfig.loadJson` in your `loaders` object.
- `packageProp` no longer accepts `false`. If you don't want to look in `package.json`, write a `searchPlaces` array that does not include it.
- By default, empty files are ignored by `search()`. The new option `ignoreEmptySearchPlaces` allows you to load them, instead, in case you want to do something with empty files.
- The option `configPath` has been removed. Just pass your filepaths directly to `load()`.
- Removed the `format` option. Formats are now all handled via the file extensions specified in `loaders`.

(If you're wondering what happened to 5.0.0 ... it was a silly publishing mistake.)

## 4.0.0

- Licensing improvement: updated `parse-json` from `3.0.0` to `4.0.0` (see [sindresorhus/parse-json#12][parse-json-pr-12]).
- Changed: error message format for JSON parse errors (see [#101][pr-101]). If you were relying on the format of JSON-parsing error messages, this will be a breaking change for you.
- Changed: set the default for `searchPath` to `process.cwd()` in `explorer.load`.

## 3.1.0

- Added: infer format based on `filePath`.

## 3.0.1

- Fixed: memory leak due to a bug in `require-from-string`.
- Added: for JSON files, append the position to the end of the error message.

## 3.0.0

- Removed: support for loading the config path using the `--config` flag. cosmiconfig will not parse command-line arguments. Your application can parse command-line arguments and pass them to cosmiconfig.
- Removed: `argv` config option.
- Removed: support for Node versions < 4.
- Added: `sync` option.
- Fixed: Throw a clear error on getting an empty config file.
- Fixed: when `options.configPath` is `package.json`, return the package prop, not the entire JSON file.

## 2.2.2

- Fixed: `options.configPath` and the `--config` flag are respected.

## 2.2.0, 2.2.1

- 2.2.0 included a number of improvements but somehow broke stylelint. The changes were reverted in 2.2.1, to be restored later.

## 2.1.3

- Licensing improvement: switched from `json-parse-helpfulerror` to `parse-json`.

## 2.1.2

- Fixed: bug where an `ENOENT` error would be thrown if `searchPath` referenced a non-existent file.
- Fixed: JSON parsing errors in Node v7.

## 2.1.1

- Fixed: swapped `graceful-fs` for regular `fs`, fixing a garbage collection problem.

## 2.1.0

- Added: Node 0.12 support.

## 2.0.2

- Fixed: Node version specified in `package.json`.

## 2.0.1

- Fixed: no more infinite loop on Windows.

## 2.0.0

- Changed: the module now creates cosmiconfig instances with `load` methods (see README).
- Added: caching (enabled by the change above).
- Removed: support for Node versions < 4.

## 1.1.0

- Add `rcExtensions` option.

## 1.0.2

- Fix handling of `require()` calls within JS module configs.

## 1.0.1

- Switch Promise implementation to pinkie-promise.

## 1.0.0

- Initial release.

[parse-json-pr-12]: https://github.com/sindresorhus/parse-json/pull/12

[pr-101]: https://github.com/davidtheclark/cosmiconfig/pull/101

22
node_modules/cosmiconfig/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,22 @@
The MIT License (MIT)

Copyright (c) 2015 David Clark

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

576
node_modules/cosmiconfig/README.md
generated
vendored
Normal file
@@ -0,0 +1,576 @@
# cosmiconfig

[](https://travis-ci.org/davidtheclark/cosmiconfig) [](https://ci.appveyor.com/project/davidtheclark/cosmiconfig/branch/master)
[](https://codecov.io/gh/davidtheclark/cosmiconfig)

Cosmiconfig searches for and loads configuration for your program.

It features smart defaults based on conventional expectations in the JavaScript ecosystem.
But it's also flexible enough to search wherever you'd like to search, and load whatever you'd like to load.

By default, Cosmiconfig will start where you tell it to start and search up the directory tree for the following:

- a `package.json` property
- a JSON or YAML, extensionless "rc file"
- an "rc file" with the extensions `.json`, `.yaml`, `.yml`, or `.js`
- a `.config.js` CommonJS module

For example, if your module's name is "myapp", cosmiconfig will search up the directory tree for configuration in the following places:

- a `myapp` property in `package.json`
- a `.myapprc` file in JSON or YAML format
- a `.myapprc.json` file
- a `.myapprc.yaml`, `.myapprc.yml`, or `.myapprc.js` file
- a `myapp.config.js` file exporting a JS object

Cosmiconfig continues to search up the directory tree, checking each of these places in each directory, until it finds some acceptable configuration (or hits the home directory).

👀 **Looking for the v5 docs?**
v6 involves slight changes to Cosmiconfig's API, clarifying the difference between synchronous and asynchronous usage.
If you have trouble switching from v5 to v6, please file an issue.
If you are still using v5, those v5 docs are available [in the `5.x.x` tagged code](https://github.com/davidtheclark/cosmiconfig/tree/5.2.1).

## Table of contents

- [Installation](#installation)
- [Usage](#usage)
- [Result](#result)
- [Asynchronous API](#asynchronous-api)
  - [cosmiconfig()](#cosmiconfig)
  - [explorer.search()](#explorersearch)
  - [explorer.load()](#explorerload)
  - [explorer.clearLoadCache()](#explorerclearloadcache)
  - [explorer.clearSearchCache()](#explorerclearsearchcache)
  - [explorer.clearCaches()](#explorerclearcaches)
- [Synchronous API](#synchronous-api)
  - [cosmiconfigSync()](#cosmiconfigsync)
  - [explorerSync.search()](#explorersyncsearch)
  - [explorerSync.load()](#explorersyncload)
  - [explorerSync.clearLoadCache()](#explorersyncclearloadcache)
  - [explorerSync.clearSearchCache()](#explorersyncclearsearchcache)
  - [explorerSync.clearCaches()](#explorersyncclearcaches)
- [cosmiconfigOptions](#cosmiconfigoptions)
  - [searchPlaces](#searchplaces)
  - [loaders](#loaders)
  - [packageProp](#packageprop)
  - [stopDir](#stopdir)
  - [cache](#cache)
  - [transform](#transform)
  - [ignoreEmptySearchPlaces](#ignoreemptysearchplaces)
- [Caching](#caching)
- [Differences from rc](#differences-from-rc)
- [Contributing & Development](#contributing--development)

## Installation

```
npm install cosmiconfig
```

Tested in Node 8+.

## Usage

Create a Cosmiconfig explorer, then either `search` for or directly `load` a configuration file.

```js
const { cosmiconfig, cosmiconfigSync } = require('cosmiconfig');
// ...
const explorer = cosmiconfig(moduleName);

// Search for a configuration by walking up directories.
// See documentation for search, below.
explorer.search()
  .then((result) => {
    // result.config is the parsed configuration object.
    // result.filepath is the path to the config file that was found.
    // result.isEmpty is true if there was nothing to parse in the config file.
  })
  .catch((error) => {
    // Do something constructive.
  });

// Load a configuration directly when you know where it should be.
// The result object is the same as for search.
// See documentation for load, below.
explorer.load(pathToConfig).then(..);

// You can also search and load synchronously.
const explorerSync = cosmiconfigSync(moduleName);

const searchedFor = explorerSync.search();
const loaded = explorerSync.load(pathToConfig);
```

## Result

The result object you get from `search` or `load` has the following properties:

- **config:** The parsed configuration object. `undefined` if the file is empty.
- **filepath:** The path to the configuration file that was found.
- **isEmpty:** `true` if the configuration file is empty. This property will not be present if the configuration file is not empty.

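As an illustrative sketch (the helper name is ours, not part of cosmiconfig's API), a caller might branch on these result shapes like this:

```js
// Hypothetical helper showing how the possible result shapes can be handled.
// `result` is what search() resolves with: null, or { config, filepath, isEmpty? }.
function describeResult(result) {
  if (result === null) return 'no configuration found';
  if (result.isEmpty) return `empty config file at ${result.filepath}`;
  return `loaded config from ${result.filepath}`;
}

console.log(describeResult(null));
console.log(describeResult({ config: undefined, filepath: '/app/.myapprc', isEmpty: true }));
console.log(describeResult({ config: { port: 3000 }, filepath: '/app/.myapprc.json' }));
```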
## Asynchronous API

### cosmiconfig()

```js
const { cosmiconfig } = require('cosmiconfig');
const explorer = cosmiconfig(moduleName[, cosmiconfigOptions])
```

Creates a cosmiconfig instance ("explorer") configured according to the arguments, and initializes its caches.

#### moduleName

Type: `string`. **Required.**

Your module name. This is used to create the default [`searchPlaces`] and [`packageProp`].

If your [`searchPlaces`] value will include files, as it does by default (e.g. `${moduleName}rc`), your `moduleName` must consist of characters allowed in filenames. That means you should not copy scoped package names, such as `@my-org/my-package`, directly into `moduleName`.

**[`cosmiconfigOptions`] are documented below.**
You may not need them, and should first read about the functions you'll use.

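For instance, one way to handle a scoped package name is to drop the scope before creating the explorer. This helper is our own illustration, not part of cosmiconfig:

```js
// Hypothetical: derive a filename-safe moduleName from a (possibly scoped)
// package name, e.g. '@my-org/my-package' -> 'my-package'.
function toModuleName(packageName) {
  return packageName.startsWith('@')
    ? packageName.split('/')[1]
    : packageName;
}

console.log(toModuleName('@my-org/my-package')); // 'my-package'
console.log(toModuleName('myapp')); // 'myapp'
```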
### explorer.search()

```js
explorer.search([searchFrom]).then(result => {..})
```

Searches for a configuration file. Returns a Promise that resolves with a [result] or with `null`, if no configuration file is found.

You can do the same thing synchronously with [`explorerSync.search()`].

Let's say your module name is `goldengrahams`, so you initialized with `const explorer = cosmiconfig('goldengrahams');`.
Here's how your default [`search()`] will work:

- Starting from `process.cwd()` (or some other directory defined by the `searchFrom` argument to [`search()`]), look for configuration objects in the following places:
  1. A `goldengrahams` property in a `package.json` file.
  2. A `.goldengrahamsrc` file with JSON or YAML syntax.
  3. A `.goldengrahamsrc.json` file.
  4. A `.goldengrahamsrc.yaml`, `.goldengrahamsrc.yml`, or `.goldengrahamsrc.js` file.
  5. A `goldengrahams.config.js` JS file exporting the object.
- If none of those searches reveal a configuration object, move up one directory level and try again.
  So the search continues in `./`, `../`, `../../`, `../../../`, etc., checking the same places in each directory.
- Continue searching until arriving at your home directory (or some other directory defined by the cosmiconfig option [`stopDir`]).
- If at any point a parsable configuration is found, the [`search()`] Promise resolves with its [result] \(or, with [`explorerSync.search()`], the [result] is returned).
- If no configuration object is found, the [`search()`] Promise resolves with `null` (or, with [`explorerSync.search()`], `null` is returned).
- If a configuration object is found *but is malformed* (causing a parsing error), the [`search()`] Promise rejects with that error (so you should `.catch()` it). (Or, with [`explorerSync.search()`], the error is thrown.)

**If you know exactly where your configuration file should be, you can use [`load()`], instead.**

**The search process is highly customizable.**
Use the cosmiconfig options [`searchPlaces`] and [`loaders`] to precisely define where you want to look for configurations and how you want to load them.

#### searchFrom

Type: `string`.
Default: `process.cwd()`.

A filename.
[`search()`] will start its search here.

If the value is a directory, that's where the search starts.
If it's a file, the search starts in that file's directory.

### explorer.load()

```js
explorer.load(loadPath).then(result => {..})
```

Loads a configuration file. Returns a Promise that resolves with a [result] or rejects with an error (if the file does not exist or cannot be loaded).

Use `load` if you already know where the configuration file is and you just need to load it.

```js
explorer.load('load/this/file.json'); // Tries to load load/this/file.json.
```

If you load a `package.json` file, the result will be derived from whatever property is specified as your [`packageProp`].

You can do the same thing synchronously with [`explorerSync.load()`].

### explorer.clearLoadCache()

Clears the cache used in [`load()`].

### explorer.clearSearchCache()

Clears the cache used in [`search()`].

### explorer.clearCaches()

Performs both [`clearLoadCache()`] and [`clearSearchCache()`].

## Synchronous API

### cosmiconfigSync()

```js
const { cosmiconfigSync } = require('cosmiconfig');
const explorerSync = cosmiconfigSync(moduleName[, cosmiconfigOptions])
```

Creates a *synchronous* cosmiconfig instance ("explorerSync") configured according to the arguments, and initializes its caches.

See [`cosmiconfig()`].

### explorerSync.search()

```js
const result = explorerSync.search([searchFrom]);
```

Synchronous version of [`explorer.search()`].

Returns a [result] or `null`.

### explorerSync.load()

```js
const result = explorerSync.load(loadPath);
```

Synchronous version of [`explorer.load()`].

Returns a [result].

### explorerSync.clearLoadCache()

Clears the cache used in [`load()`].

### explorerSync.clearSearchCache()

Clears the cache used in [`search()`].

### explorerSync.clearCaches()

Performs both [`clearLoadCache()`] and [`clearSearchCache()`].

## cosmiconfigOptions

Type: `Object`.

Possible options are documented below.

### searchPlaces

Type: `Array<string>`.
Default: See below.

An array of places that [`search()`] will check in each directory as it moves up the directory tree.
Each place is relative to the directory being searched, and the places are checked in the specified order.

**Default `searchPlaces`:**

```js
[
  'package.json',
  `.${moduleName}rc`,
  `.${moduleName}rc.json`,
  `.${moduleName}rc.yaml`,
  `.${moduleName}rc.yml`,
  `.${moduleName}rc.js`,
  `${moduleName}.config.js`,
]
```

Create your own array to search more, fewer, or altogether different places.

Every item in `searchPlaces` needs to have a loader in [`loaders`] that corresponds to its extension.
(Common extensions are covered by default loaders.)
Read more about [`loaders`] below.

`package.json` is a special value: When it is included in `searchPlaces`, Cosmiconfig will always parse it as JSON and load a property within it, not the whole file.
That property is defined with the [`packageProp`] option, and defaults to your module name.

Examples, with a module named `porgy`:

```js
// Disallow extensions on rc files:
[
  'package.json',
  '.porgyrc',
  'porgy.config.js'
]

// ESLint searches for configuration in these places:
[
  '.eslintrc.js',
  '.eslintrc.yaml',
  '.eslintrc.yml',
  '.eslintrc.json',
  '.eslintrc',
  'package.json'
]

// Babel looks in fewer places:
[
  'package.json',
  '.babelrc'
]

// Maybe you want to look for a wide variety of JS flavors:
[
  'porgy.config.js',
  'porgy.config.mjs',
  'porgy.config.ts',
  'porgy.config.coffee'
]
// ^^ You will need to designate custom loaders to tell
// Cosmiconfig how to handle these special JS flavors.

// Look within a .config/ subdirectory of every searched directory:
[
  'package.json',
  '.porgyrc',
  '.config/.porgyrc',
  '.porgyrc.json',
  '.config/.porgyrc.json'
]
```

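Since every default place is derived from the module name, the default array can be generated mechanically. A minimal sketch (our own helper, mirroring the documented defaults above):

```js
// Builds the documented default `searchPlaces` array for a given moduleName.
function defaultSearchPlaces(moduleName) {
  return [
    'package.json',
    `.${moduleName}rc`,
    `.${moduleName}rc.json`,
    `.${moduleName}rc.yaml`,
    `.${moduleName}rc.yml`,
    `.${moduleName}rc.js`,
    `${moduleName}.config.js`,
  ];
}

console.log(defaultSearchPlaces('porgy'));
```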
### loaders

Type: `Object`.
Default: See below.

An object that maps extensions to the loader functions responsible for loading and parsing files with those extensions.

Cosmiconfig exposes its default loaders on a named export `defaultLoaders`.

**Default `loaders`:**

```js
const { defaultLoaders } = require('cosmiconfig');

console.log(Object.entries(defaultLoaders))
// [
//   [ '.js', [Function: loadJs] ],
//   [ '.json', [Function: loadJson] ],
//   [ '.yaml', [Function: loadYaml] ],
//   [ '.yml', [Function: loadYaml] ],
//   [ 'noExt', [Function: loadYaml] ]
// ]
```

(YAML is a superset of JSON, which means YAML parsers can parse JSON; that is how extensionless files can be either YAML *or* JSON with only one parser.)

**If you provide a `loaders` object, your object will be *merged* with the defaults.**
So you can override one or two without having to override them all.

**Keys in `loaders`** are extensions (starting with a period), or `noExt` to specify the loader for files *without* extensions, like `.myapprc`.

**Values in `loaders`** are loader functions (described below).

**The most common use case for a custom `loaders` value is to load extensionless `rc` files as strict JSON**, instead of JSON *or* YAML (the default).
To accomplish that, provide the following `loaders` value:

```js
{
  noExt: defaultLoaders['.json']
}
```

If you want to load files that are not handled by the loader functions Cosmiconfig exposes, you can write a custom loader function or use one from npm if it exists.

**Third-party loaders:**

- [@endemolshinegroup/cosmiconfig-typescript-loader](https://github.com/EndemolShineGroup/cosmiconfig-typescript-loader)

**Use cases for a custom loader function:**

- Allow configuration syntaxes that aren't handled by Cosmiconfig's defaults, like JSON5, INI, or XML.
- Allow ES2015 modules from `.mjs` configuration files.
- Parse JS files with Babel before deriving the configuration.

**Custom loader functions** have the following signature:

```js
// Sync
(filepath: string, content: string) => Object | null

// Async
(filepath: string, content: string) => Object | null | Promise<Object | null>
```

Cosmiconfig reads the file when it checks whether the file exists, so it will provide you with both the file's path and its content.
Do whatever you need to, and return either a configuration object or `null` (or, for async-only loaders, a Promise that resolves with one of those).
`null` indicates that no real configuration was found and the search should continue.

A few things to note:

- If you use a custom loader, be aware of whether it's sync or async: you cannot use async custom loaders with the sync API ([`cosmiconfigSync()`]).
- **Special JS syntax can also be handled by using a `require` hook**, because `defaultLoaders['.js']` just uses `require`.
  Whether you use custom loaders or a `require` hook is up to you.

Examples:

```js
// Allow JSON5 syntax:
{
  '.json': json5Loader
}

// Allow a special configuration syntax of your own creation:
{
  '.special': specialLoader
}

// Allow many flavors of JS, using custom loaders:
{
  '.mjs': esmLoader,
  '.ts': typeScriptLoader,
  '.coffee': coffeeScriptLoader
}

// Allow many flavors of JS but rely on require hooks:
{
  '.mjs': defaultLoaders['.js'],
  '.ts': defaultLoaders['.js'],
  '.coffee': defaultLoaders['.js']
}
```

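As a concrete sketch of the sync loader signature, here is a hypothetical loader (not shipped with cosmiconfig) that strips `//` line comments before parsing JSON. It only removes whole-line comments, so values like `"http://example.com"` are left intact:

```js
// Hypothetical custom loader: JSON with // line comments.
// Matches the documented sync signature: (filepath, content) => Object | null.
function jsonWithCommentsLoader(filepath, content) {
  const stripped = content
    .split('\n')
    .filter((line) => !line.trim().startsWith('//'))
    .join('\n');
  if (stripped.trim() === '') return null; // nothing to parse: keep searching
  return JSON.parse(stripped);
}

const config = jsonWithCommentsLoader(
  '/fake/.myapprc.jsonc',
  '// dev settings\n{ "port": 3000 }'
);
console.log(config); // { port: 3000 }
```

You would pass it as, for example, `{ '.jsonc': jsonWithCommentsLoader }` in the `loaders` option.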
### packageProp

Type: `string | Array<string>`.
Default: `` `${moduleName}` ``.

Name of the property in `package.json` to look for.

Use a period-delimited string or an array of strings to describe a path to nested properties.

For example, the value `'configs.myPackage'` or `['configs', 'myPackage']` will get you the `"myPackage"` value in a `package.json` like this:

```json
{
  "configs": {
    "myPackage": {..}
  }
}
```

If nested property names within the path include periods, you need to use an array of strings. For example, the value `['configs', 'foo.bar', 'baz']` will get you the `"baz"` value in a `package.json` like this:

```json
{
  "configs": {
    "foo.bar": {
      "baz": {..}
    }
  }
}
```

If a string includes a period but corresponds to a top-level property name, it will not be interpreted as a period-delimited path. For example, the value `'one.two'` will get you the `"three"` value in a `package.json` like this:

```json
{
  "one.two": "three",
  "one": {
    "two": "four"
  }
}
```

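The lookup rules above can be sketched as a small resolver. This is our own illustration of the documented behavior, not cosmiconfig's actual implementation:

```js
// Resolves a packageProp-style path against a parsed package.json object.
// A string that matches a top-level key wins before being split on periods.
function getPackageProp(pkg, packageProp) {
  const path = Array.isArray(packageProp)
    ? packageProp
    : Object.prototype.hasOwnProperty.call(pkg, packageProp)
      ? [packageProp] // top-level name containing periods
      : packageProp.split('.'); // period-delimited path
  return path.reduce((obj, key) => (obj == null ? undefined : obj[key]), pkg);
}

const pkg = { 'one.two': 'three', one: { two: 'four' } };
console.log(getPackageProp(pkg, 'one.two')); // 'three' (top-level name wins)
console.log(getPackageProp(pkg, ['one', 'two'])); // 'four'
```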
### stopDir

Type: `string`.
Default: Absolute path to your home directory.

Directory where the search will stop.

### cache

Type: `boolean`.
Default: `true`.

If `false`, no caches will be used.
Read more about ["Caching"](#caching) below.

### transform

Type: `(Result) => Promise<Result> | Result`.

A function that transforms the parsed configuration. Receives the [result].

If using [`search()`] or [`load()`] \(which are async), the transform function can return the transformed result or return a Promise that resolves with the transformed result.
If using `cosmiconfigSync` (that is, [`explorerSync.search()`] or [`explorerSync.load()`]), the function must be synchronous and return the transformed result.

The reason you might use this option — instead of simply applying your transform function some other way — is that *the transformed result will be cached*. If your transformation involves additional filesystem I/O or other potentially slow processing, you can use this option to avoid repeating those steps every time a given configuration is searched or loaded.

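For example, a transform of our own invention that merges defaults into every loaded configuration, so the merged object is what gets cached:

```js
// Hypothetical transform: fill in defaults once, so the cached result
// already contains the merged configuration.
const defaults = { port: 3000, verbose: false };

function applyDefaults(result) {
  if (result === null) return null; // nothing found: pass through
  return {
    ...result,
    config: { ...defaults, ...result.config },
  };
}

// Would be passed as: cosmiconfig('myapp', { transform: applyDefaults })
console.log(applyDefaults({ config: { port: 8080 }, filepath: '/app/.myapprc' }));
```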
### ignoreEmptySearchPlaces

Type: `boolean`.
Default: `true`.

By default, if [`search()`] encounters an empty file (containing nothing but whitespace) in one of the [`searchPlaces`], it will ignore the empty file and move on.
If you'd like to load empty configuration files, instead, set this option to `false`.

Why might you want to load empty configuration files?
If you want to throw an error, or if an empty configuration file means something to your program.

## Caching

As of v2, cosmiconfig uses caching to reduce the need for repetitious reading of the filesystem or expensive transforms. Every new cosmiconfig instance (created with `cosmiconfig()`) has its own caches.

To avoid or work around caching, you can do the following:

- Set the `cosmiconfig` option [`cache`] to `false`.
- Use the cache-clearing methods [`clearLoadCache()`], [`clearSearchCache()`], and [`clearCaches()`].
- Create separate instances of cosmiconfig (separate "explorers").

## Differences from [rc](https://github.com/dominictarr/rc)

[rc](https://github.com/dominictarr/rc) serves its focused purpose well. cosmiconfig differs in a few key ways — making it more useful for some projects, less useful for others:

- Looks for configuration in some different places: in a `package.json` property, an rc file, a `.config.js` file, and rc files with extensions.
- Built-in support for JSON, YAML, and CommonJS formats.
- Stops at the first configuration found, instead of finding all that can be found up the directory tree and merging them automatically.
- Options.
- Asynchronous by default (though it can be run synchronously).

## Contributing & Development
|
||||
|
||||
Please note that this project is released with a [Contributor Code of Conduct](CODE_OF_CONDUCT.md). By participating in this project you agree to abide by its terms.
|
||||
|
||||
And please do participate!
|
||||
|
||||
[result]: #result
[`load()`]: #explorerload
[`search()`]: #explorersearch
[`clearloadcache()`]: #explorerclearloadcache
[`clearsearchcache()`]: #explorerclearsearchcache
[`cosmiconfig()`]: #cosmiconfig
[`cosmiconfigSync()`]: #cosmiconfigsync
[`clearcaches()`]: #explorerclearcaches
[`packageprop`]: #packageprop
[`cache`]: #cache
[`stopdir`]: #stopdir
[`searchplaces`]: #searchplaces
[`loaders`]: #loaders
[`cosmiconfigoptions`]: #cosmiconfigoptions
[`explorerSync.search()`]: #explorersyncsearch
[`explorerSync.load()`]: #explorersyncload
[`explorer.search()`]: #explorersearch
[`explorer.load()`]: #explorerload
14 node_modules/cosmiconfig/dist/Explorer.d.ts generated vendored Normal file
@@ -0,0 +1,14 @@
import { ExplorerBase } from './ExplorerBase';
import { CosmiconfigResult, ExplorerOptions } from './types';
declare class Explorer extends ExplorerBase<ExplorerOptions> {
    constructor(options: ExplorerOptions);
    search(searchFrom?: string): Promise<CosmiconfigResult>;
    private searchFromDirectory;
    private searchDirectory;
    private loadSearchPlace;
    private loadFileContent;
    private createCosmiconfigResult;
    load(filepath: string): Promise<CosmiconfigResult>;
}
export { Explorer };
//# sourceMappingURL=Explorer.d.ts.map

1 node_modules/cosmiconfig/dist/Explorer.d.ts.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Explorer.d.ts","sourceRoot":"","sources":["../src/Explorer.ts"],"names":[],"mappings":"AACA,OAAO,EAAE,YAAY,EAAE,MAAM,gBAAgB,CAAC;AAI9C,OAAO,EAAE,iBAAiB,EAAE,eAAe,EAAqB,MAAM,SAAS,CAAC;AAEhF,cAAM,QAAS,SAAQ,YAAY,CAAC,eAAe,CAAC;gBAC/B,OAAO,EAAE,eAAe;IAI9B,MAAM,CACjB,UAAU,GAAE,MAAsB,GACjC,OAAO,CAAC,iBAAiB,CAAC;YAOf,mBAAmB;YAuBnB,eAAe;YAaf,eAAe;YAYf,eAAe;YAef,uBAAuB;IAUxB,IAAI,CAAC,QAAQ,EAAE,MAAM,GAAG,OAAO,CAAC,iBAAiB,CAAC;CAyBhE;AAED,OAAO,EAAE,QAAQ,EAAE,CAAC"}
141 node_modules/cosmiconfig/dist/Explorer.js generated vendored Normal file
@@ -0,0 +1,141 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.Explorer = void 0;

var _path = _interopRequireDefault(require("path"));

var _ExplorerBase = require("./ExplorerBase");

var _readFile = require("./readFile");

var _cacheWrapper = require("./cacheWrapper");

var _getDirectory = require("./getDirectory");

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

function _asyncIterator(iterable) { var method; if (typeof Symbol !== "undefined") { if (Symbol.asyncIterator) { method = iterable[Symbol.asyncIterator]; if (method != null) return method.call(iterable); } if (Symbol.iterator) { method = iterable[Symbol.iterator]; if (method != null) return method.call(iterable); } } throw new TypeError("Object is not async iterable"); }

class Explorer extends _ExplorerBase.ExplorerBase {
  constructor(options) {
    super(options);
  }

  async search(searchFrom = process.cwd()) {
    const startDirectory = await (0, _getDirectory.getDirectory)(searchFrom);
    const result = await this.searchFromDirectory(startDirectory);
    return result;
  }

  async searchFromDirectory(dir) {
    const absoluteDir = _path.default.resolve(process.cwd(), dir);

    const run = async () => {
      const result = await this.searchDirectory(absoluteDir);
      const nextDir = this.nextDirectoryToSearch(absoluteDir, result);

      if (nextDir) {
        return this.searchFromDirectory(nextDir);
      }

      const transformResult = await this.config.transform(result);
      return transformResult;
    };

    if (this.searchCache) {
      return (0, _cacheWrapper.cacheWrapper)(this.searchCache, absoluteDir, run);
    }

    return run();
  }

  async searchDirectory(dir) {
    var _iteratorNormalCompletion = true;
    var _didIteratorError = false;

    var _iteratorError;

    try {
      for (var _iterator = _asyncIterator(this.config.searchPlaces), _step, _value; _step = await _iterator.next(), _iteratorNormalCompletion = _step.done, _value = await _step.value, !_iteratorNormalCompletion; _iteratorNormalCompletion = true) {
        const place = _value;
        const placeResult = await this.loadSearchPlace(dir, place);

        if (this.shouldSearchStopWithResult(placeResult) === true) {
          return placeResult;
        }
      } // config not found

    } catch (err) {
      _didIteratorError = true;
      _iteratorError = err;
    } finally {
      try {
        if (!_iteratorNormalCompletion && _iterator.return != null) {
          await _iterator.return();
        }
      } finally {
        if (_didIteratorError) {
          throw _iteratorError;
        }
      }
    }

    return null;
  }

  async loadSearchPlace(dir, place) {
    const filepath = _path.default.join(dir, place);

    const fileContents = await (0, _readFile.readFile)(filepath);
    const result = await this.createCosmiconfigResult(filepath, fileContents);
    return result;
  }

  async loadFileContent(filepath, content) {
    if (content === null) {
      return null;
    }

    if (content.trim() === '') {
      return undefined;
    }

    const loader = this.getLoaderEntryForFile(filepath);
    const loaderResult = await loader(filepath, content);
    return loaderResult;
  }

  async createCosmiconfigResult(filepath, content) {
    const fileContent = await this.loadFileContent(filepath, content);
    const result = this.loadedContentToCosmiconfigResult(filepath, fileContent);
    return result;
  }

  async load(filepath) {
    this.validateFilePath(filepath);

    const absoluteFilePath = _path.default.resolve(process.cwd(), filepath);

    const runLoad = async () => {
      const fileContents = await (0, _readFile.readFile)(absoluteFilePath, {
        throwNotFound: true
      });
      const result = await this.createCosmiconfigResult(absoluteFilePath, fileContents);
      const transformResult = await this.config.transform(result);
      return transformResult;
    };

    if (this.loadCache) {
      return (0, _cacheWrapper.cacheWrapper)(this.loadCache, absoluteFilePath, runLoad);
    }

    return runLoad();
  }

}

exports.Explorer = Explorer;
//# sourceMappingURL=Explorer.js.map
1 node_modules/cosmiconfig/dist/Explorer.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
21 node_modules/cosmiconfig/dist/ExplorerBase.d.ts generated vendored Normal file
@@ -0,0 +1,21 @@
import { CosmiconfigResult, ExplorerOptions, ExplorerOptionsSync, Cache, LoadedFileContent } from './types';
import { Loader } from './index';
declare class ExplorerBase<T extends ExplorerOptions | ExplorerOptionsSync> {
    protected readonly loadCache?: Cache;
    protected readonly searchCache?: Cache;
    protected readonly config: T;
    constructor(options: T);
    clearLoadCache(): void;
    clearSearchCache(): void;
    clearCaches(): void;
    private validateConfig;
    protected shouldSearchStopWithResult(result: CosmiconfigResult): boolean;
    protected nextDirectoryToSearch(currentDir: string, currentResult: CosmiconfigResult): string | null;
    private loadPackageProp;
    protected getLoaderEntryForFile(filepath: string): Loader;
    protected loadedContentToCosmiconfigResult(filepath: string, loadedContent: LoadedFileContent): CosmiconfigResult;
    protected validateFilePath(filepath: string): void;
}
declare function getExtensionDescription(filepath: string): string;
export { ExplorerBase, getExtensionDescription };
//# sourceMappingURL=ExplorerBase.d.ts.map

1 node_modules/cosmiconfig/dist/ExplorerBase.d.ts.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"ExplorerBase.d.ts","sourceRoot":"","sources":["../src/ExplorerBase.ts"],"names":[],"mappings":"AAGA,OAAO,EACL,iBAAiB,EACjB,eAAe,EACf,mBAAmB,EACnB,KAAK,EACL,iBAAiB,EAClB,MAAM,SAAS,CAAC;AACjB,OAAO,EAAE,MAAM,EAAE,MAAM,SAAS,CAAC;AAEjC,cAAM,YAAY,CAAC,CAAC,SAAS,eAAe,GAAG,mBAAmB;IAChE,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,KAAK,CAAC;IACrC,SAAS,CAAC,QAAQ,CAAC,WAAW,CAAC,EAAE,KAAK,CAAC;IACvC,SAAS,CAAC,QAAQ,CAAC,MAAM,EAAE,CAAC,CAAC;gBAEV,OAAO,EAAE,CAAC;IAUtB,cAAc,IAAI,IAAI;IAMtB,gBAAgB,IAAI,IAAI;IAMxB,WAAW,IAAI,IAAI;IAK1B,OAAO,CAAC,cAAc;IAwBtB,SAAS,CAAC,0BAA0B,CAAC,MAAM,EAAE,iBAAiB,GAAG,OAAO;IAMxE,SAAS,CAAC,qBAAqB,CAC7B,UAAU,EAAE,MAAM,EAClB,aAAa,EAAE,iBAAiB,GAC/B,MAAM,GAAG,IAAI;IAWhB,OAAO,CAAC,eAAe;IASvB,SAAS,CAAC,qBAAqB,CAAC,QAAQ,EAAE,MAAM,GAAG,MAAM;IAmBzD,SAAS,CAAC,gCAAgC,CACxC,QAAQ,EAAE,MAAM,EAChB,aAAa,EAAE,iBAAiB,GAC/B,iBAAiB;IAUpB,SAAS,CAAC,gBAAgB,CAAC,QAAQ,EAAE,MAAM,GAAG,IAAI;CAKnD;AAMD,iBAAS,uBAAuB,CAAC,QAAQ,EAAE,MAAM,GAAG,MAAM,CAGzD;AAED,OAAO,EAAE,YAAY,EAAE,uBAAuB,EAAE,CAAC"}
142 node_modules/cosmiconfig/dist/ExplorerBase.js generated vendored Normal file
@@ -0,0 +1,142 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.getExtensionDescription = getExtensionDescription;
exports.ExplorerBase = void 0;

var _path = _interopRequireDefault(require("path"));

var _loaders = require("./loaders");

var _getPropertyByPath = require("./getPropertyByPath");

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

class ExplorerBase {
  constructor(options) {
    if (options.cache === true) {
      this.loadCache = new Map();
      this.searchCache = new Map();
    }

    this.config = options;
    this.validateConfig();
  }

  clearLoadCache() {
    if (this.loadCache) {
      this.loadCache.clear();
    }
  }

  clearSearchCache() {
    if (this.searchCache) {
      this.searchCache.clear();
    }
  }

  clearCaches() {
    this.clearLoadCache();
    this.clearSearchCache();
  }

  validateConfig() {
    const config = this.config;
    config.searchPlaces.forEach(place => {
      const loaderKey = _path.default.extname(place) || 'noExt';
      const loader = config.loaders[loaderKey];

      if (!loader) {
        throw new Error(`No loader specified for ${getExtensionDescription(place)}, so searchPlaces item "${place}" is invalid`);
      }

      if (typeof loader !== 'function') {
        throw new Error(`loader for ${getExtensionDescription(place)} is not a function (type provided: "${typeof loader}"), so searchPlaces item "${place}" is invalid`);
      }
    });
  }

  shouldSearchStopWithResult(result) {
    if (result === null) return false;
    if (result.isEmpty && this.config.ignoreEmptySearchPlaces) return false;
    return true;
  }

  nextDirectoryToSearch(currentDir, currentResult) {
    if (this.shouldSearchStopWithResult(currentResult)) {
      return null;
    }

    const nextDir = nextDirUp(currentDir);

    if (nextDir === currentDir || currentDir === this.config.stopDir) {
      return null;
    }

    return nextDir;
  }

  loadPackageProp(filepath, content) {
    const parsedContent = _loaders.loaders.loadJson(filepath, content);

    const packagePropValue = (0, _getPropertyByPath.getPropertyByPath)(parsedContent, this.config.packageProp);
    return packagePropValue || null;
  }

  getLoaderEntryForFile(filepath) {
    if (_path.default.basename(filepath) === 'package.json') {
      const loader = this.loadPackageProp.bind(this);
      return loader;
    }

    const loaderKey = _path.default.extname(filepath) || 'noExt';
    const loader = this.config.loaders[loaderKey];

    if (!loader) {
      throw new Error(`No loader specified for ${getExtensionDescription(filepath)}`);
    }

    return loader;
  }

  loadedContentToCosmiconfigResult(filepath, loadedContent) {
    if (loadedContent === null) {
      return null;
    }

    if (loadedContent === undefined) {
      return {
        filepath,
        config: undefined,
        isEmpty: true
      };
    }

    return {
      config: loadedContent,
      filepath
    };
  }

  validateFilePath(filepath) {
    if (!filepath) {
      throw new Error('load must pass a non-empty string');
    }
  }

}

exports.ExplorerBase = ExplorerBase;

function nextDirUp(dir) {
  return _path.default.dirname(dir);
}

function getExtensionDescription(filepath) {
  const ext = _path.default.extname(filepath);

  return ext ? `extension "${ext}"` : 'files without extensions';
}
//# sourceMappingURL=ExplorerBase.js.map
1 node_modules/cosmiconfig/dist/ExplorerBase.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
14 node_modules/cosmiconfig/dist/ExplorerSync.d.ts generated vendored Normal file
@@ -0,0 +1,14 @@
import { ExplorerBase } from './ExplorerBase';
import { CosmiconfigResult, ExplorerOptionsSync } from './types';
declare class ExplorerSync extends ExplorerBase<ExplorerOptionsSync> {
    constructor(options: ExplorerOptionsSync);
    searchSync(searchFrom?: string): CosmiconfigResult;
    private searchFromDirectorySync;
    private searchDirectorySync;
    private loadSearchPlaceSync;
    private loadFileContentSync;
    private createCosmiconfigResultSync;
    loadSync(filepath: string): CosmiconfigResult;
}
export { ExplorerSync };
//# sourceMappingURL=ExplorerSync.d.ts.map

1 node_modules/cosmiconfig/dist/ExplorerSync.d.ts.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"ExplorerSync.d.ts","sourceRoot":"","sources":["../src/ExplorerSync.ts"],"names":[],"mappings":"AACA,OAAO,EAAE,YAAY,EAAE,MAAM,gBAAgB,CAAC;AAI9C,OAAO,EACL,iBAAiB,EACjB,mBAAmB,EAEpB,MAAM,SAAS,CAAC;AAEjB,cAAM,YAAa,SAAQ,YAAY,CAAC,mBAAmB,CAAC;gBACvC,OAAO,EAAE,mBAAmB;IAIxC,UAAU,CAAC,UAAU,GAAE,MAAsB,GAAG,iBAAiB;IAOxE,OAAO,CAAC,uBAAuB;IAuB/B,OAAO,CAAC,mBAAmB;IAa3B,OAAO,CAAC,mBAAmB;IAS3B,OAAO,CAAC,mBAAmB;IAgB3B,OAAO,CAAC,2BAA2B;IAU5B,QAAQ,CAAC,QAAQ,EAAE,MAAM,GAAG,iBAAiB;CAsBrD;AAED,OAAO,EAAE,YAAY,EAAE,CAAC"}
118 node_modules/cosmiconfig/dist/ExplorerSync.js generated vendored Normal file
@@ -0,0 +1,118 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.ExplorerSync = void 0;

var _path = _interopRequireDefault(require("path"));

var _ExplorerBase = require("./ExplorerBase");

var _readFile = require("./readFile");

var _cacheWrapper = require("./cacheWrapper");

var _getDirectory = require("./getDirectory");

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

class ExplorerSync extends _ExplorerBase.ExplorerBase {
  constructor(options) {
    super(options);
  }

  searchSync(searchFrom = process.cwd()) {
    const startDirectory = (0, _getDirectory.getDirectorySync)(searchFrom);
    const result = this.searchFromDirectorySync(startDirectory);
    return result;
  }

  searchFromDirectorySync(dir) {
    const absoluteDir = _path.default.resolve(process.cwd(), dir);

    const run = () => {
      const result = this.searchDirectorySync(absoluteDir);
      const nextDir = this.nextDirectoryToSearch(absoluteDir, result);

      if (nextDir) {
        return this.searchFromDirectorySync(nextDir);
      }

      const transformResult = this.config.transform(result);
      return transformResult;
    };

    if (this.searchCache) {
      return (0, _cacheWrapper.cacheWrapperSync)(this.searchCache, absoluteDir, run);
    }

    return run();
  }

  searchDirectorySync(dir) {
    for (const place of this.config.searchPlaces) {
      const placeResult = this.loadSearchPlaceSync(dir, place);

      if (this.shouldSearchStopWithResult(placeResult) === true) {
        return placeResult;
      }
    } // config not found


    return null;
  }

  loadSearchPlaceSync(dir, place) {
    const filepath = _path.default.join(dir, place);

    const content = (0, _readFile.readFileSync)(filepath);
    const result = this.createCosmiconfigResultSync(filepath, content);
    return result;
  }

  loadFileContentSync(filepath, content) {
    if (content === null) {
      return null;
    }

    if (content.trim() === '') {
      return undefined;
    }

    const loader = this.getLoaderEntryForFile(filepath);
    const loaderResult = loader(filepath, content);
    return loaderResult;
  }

  createCosmiconfigResultSync(filepath, content) {
    const fileContent = this.loadFileContentSync(filepath, content);
    const result = this.loadedContentToCosmiconfigResult(filepath, fileContent);
    return result;
  }

  loadSync(filepath) {
    this.validateFilePath(filepath);

    const absoluteFilePath = _path.default.resolve(process.cwd(), filepath);

    const runLoadSync = () => {
      const content = (0, _readFile.readFileSync)(absoluteFilePath, {
        throwNotFound: true
      });
      const cosmiconfigResult = this.createCosmiconfigResultSync(absoluteFilePath, content);
      const transformResult = this.config.transform(cosmiconfigResult);
      return transformResult;
    };

    if (this.loadCache) {
      return (0, _cacheWrapper.cacheWrapperSync)(this.loadCache, absoluteFilePath, runLoadSync);
    }

    return runLoadSync();
  }

}

exports.ExplorerSync = ExplorerSync;
//# sourceMappingURL=ExplorerSync.js.map
1 node_modules/cosmiconfig/dist/ExplorerSync.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
5 node_modules/cosmiconfig/dist/cacheWrapper.d.ts generated vendored Normal file
@@ -0,0 +1,5 @@
import { Cache, CosmiconfigResult } from './types';
declare function cacheWrapper(cache: Cache, key: string, fn: () => Promise<CosmiconfigResult>): Promise<CosmiconfigResult>;
declare function cacheWrapperSync(cache: Cache, key: string, fn: () => CosmiconfigResult): CosmiconfigResult;
export { cacheWrapper, cacheWrapperSync };
//# sourceMappingURL=cacheWrapper.d.ts.map

1 node_modules/cosmiconfig/dist/cacheWrapper.d.ts.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"cacheWrapper.d.ts","sourceRoot":"","sources":["../src/cacheWrapper.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,KAAK,EAAE,iBAAiB,EAAE,MAAM,SAAS,CAAC;AAEnD,iBAAe,YAAY,CACzB,KAAK,EAAE,KAAK,EACZ,GAAG,EAAE,MAAM,EACX,EAAE,EAAE,MAAM,OAAO,CAAC,iBAAiB,CAAC,GACnC,OAAO,CAAC,iBAAiB,CAAC,CAS5B;AAED,iBAAS,gBAAgB,CACvB,KAAK,EAAE,KAAK,EACZ,GAAG,EAAE,MAAM,EACX,EAAE,EAAE,MAAM,iBAAiB,GAC1B,iBAAiB,CASnB;AAED,OAAO,EAAE,YAAY,EAAE,gBAAgB,EAAE,CAAC"}
32 node_modules/cosmiconfig/dist/cacheWrapper.js generated vendored Normal file
@@ -0,0 +1,32 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.cacheWrapper = cacheWrapper;
exports.cacheWrapperSync = cacheWrapperSync;

async function cacheWrapper(cache, key, fn) {
  const cached = cache.get(key);

  if (cached !== undefined) {
    return cached;
  }

  const result = await fn();
  cache.set(key, result);
  return result;
}

function cacheWrapperSync(cache, key, fn) {
  const cached = cache.get(key);

  if (cached !== undefined) {
    return cached;
  }

  const result = fn();
  cache.set(key, result);
  return result;
}
//# sourceMappingURL=cacheWrapper.js.map
1 node_modules/cosmiconfig/dist/cacheWrapper.js.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"sources":["../src/cacheWrapper.ts"],"names":["cacheWrapper","cache","key","fn","cached","get","undefined","result","set","cacheWrapperSync"],"mappings":";;;;;;;;AAEA,eAAeA,YAAf,CACEC,KADF,EAEEC,GAFF,EAGEC,EAHF,EAI8B;AAC5B,QAAMC,MAAM,GAAGH,KAAK,CAACI,GAAN,CAAUH,GAAV,CAAf;;AACA,MAAIE,MAAM,KAAKE,SAAf,EAA0B;AACxB,WAAOF,MAAP;AACD;;AAED,QAAMG,MAAM,GAAG,MAAMJ,EAAE,EAAvB;AACAF,EAAAA,KAAK,CAACO,GAAN,CAAUN,GAAV,EAAeK,MAAf;AACA,SAAOA,MAAP;AACD;;AAED,SAASE,gBAAT,CACER,KADF,EAEEC,GAFF,EAGEC,EAHF,EAIqB;AACnB,QAAMC,MAAM,GAAGH,KAAK,CAACI,GAAN,CAAUH,GAAV,CAAf;;AACA,MAAIE,MAAM,KAAKE,SAAf,EAA0B;AACxB,WAAOF,MAAP;AACD;;AAED,QAAMG,MAAM,GAAGJ,EAAE,EAAjB;AACAF,EAAAA,KAAK,CAACO,GAAN,CAAUN,GAAV,EAAeK,MAAf;AACA,SAAOA,MAAP;AACD","sourcesContent":["import { Cache, CosmiconfigResult } from './types';\n\nasync function cacheWrapper(\n cache: Cache,\n key: string,\n fn: () => Promise<CosmiconfigResult>,\n): Promise<CosmiconfigResult> {\n const cached = cache.get(key);\n if (cached !== undefined) {\n return cached;\n }\n\n const result = await fn();\n cache.set(key, result);\n return result;\n}\n\nfunction cacheWrapperSync(\n cache: Cache,\n key: string,\n fn: () => CosmiconfigResult,\n): CosmiconfigResult {\n const cached = cache.get(key);\n if (cached !== undefined) {\n return cached;\n }\n\n const result = fn();\n cache.set(key, result);\n return result;\n}\n\nexport { cacheWrapper, cacheWrapperSync };\n"],"file":"cacheWrapper.js"}
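The `cacheWrapperSync` pattern in the file above is small enough to exercise standalone. A sketch of its memoization behavior, using the same logic with an illustrative key (the path is hypothetical):

```javascript
// cacheWrapperSync, as in the vendored file: compute once per key,
// remember the result in the Map, and serve later calls from it.
function cacheWrapperSync(cache, key, fn) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached;
  }
  const result = fn();
  cache.set(key, result);
  return result;
}

// Usage sketch: fn runs only on the first call for a given key.
const cache = new Map();
let calls = 0;
const compute = () => {
  calls += 1;
  return { filepath: '/hypothetical/.examplerc', config: {} };
};

const first = cacheWrapperSync(cache, '/hypothetical/.examplerc', compute);
const second = cacheWrapperSync(cache, '/hypothetical/.examplerc', compute);
console.log(calls, first === second);
```

Note that the `!== undefined` check means a cached `null` result (config not found) is also served from the cache, which is exactly what the search cache relies on.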
4 node_modules/cosmiconfig/dist/getDirectory.d.ts generated vendored Normal file
@@ -0,0 +1,4 @@
declare function getDirectory(filepath: string): Promise<string>;
declare function getDirectorySync(filepath: string): string;
export { getDirectory, getDirectorySync };
//# sourceMappingURL=getDirectory.d.ts.map

1 node_modules/cosmiconfig/dist/getDirectory.d.ts.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"getDirectory.d.ts","sourceRoot":"","sources":["../src/getDirectory.ts"],"names":[],"mappings":"AAGA,iBAAe,YAAY,CAAC,QAAQ,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC,CAU7D;AAED,iBAAS,gBAAgB,CAAC,QAAQ,EAAE,MAAM,GAAG,MAAM,CAUlD;AAED,OAAO,EAAE,YAAY,EAAE,gBAAgB,EAAE,CAAC"}
38 node_modules/cosmiconfig/dist/getDirectory.js generated vendored Normal file
@@ -0,0 +1,38 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.getDirectory = getDirectory;
exports.getDirectorySync = getDirectorySync;

var _path = _interopRequireDefault(require("path"));

var _pathType = require("path-type");

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

async function getDirectory(filepath) {
  const filePathIsDirectory = await (0, _pathType.isDirectory)(filepath);

  if (filePathIsDirectory === true) {
    return filepath;
  }

  const directory = _path.default.dirname(filepath);

  return directory;
}

function getDirectorySync(filepath) {
  const filePathIsDirectory = (0, _pathType.isDirectorySync)(filepath);

  if (filePathIsDirectory === true) {
    return filepath;
  }

  const directory = _path.default.dirname(filepath);

  return directory;
}
//# sourceMappingURL=getDirectory.js.map
1 node_modules/cosmiconfig/dist/getDirectory.js.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"sources":["../src/getDirectory.ts"],"names":["getDirectory","filepath","filePathIsDirectory","directory","path","dirname","getDirectorySync"],"mappings":";;;;;;;;AAAA;;AACA;;;;AAEA,eAAeA,YAAf,CAA4BC,QAA5B,EAA+D;AAC7D,QAAMC,mBAAmB,GAAG,MAAM,2BAAYD,QAAZ,CAAlC;;AAEA,MAAIC,mBAAmB,KAAK,IAA5B,EAAkC;AAChC,WAAOD,QAAP;AACD;;AAED,QAAME,SAAS,GAAGC,cAAKC,OAAL,CAAaJ,QAAb,CAAlB;;AAEA,SAAOE,SAAP;AACD;;AAED,SAASG,gBAAT,CAA0BL,QAA1B,EAAoD;AAClD,QAAMC,mBAAmB,GAAG,+BAAgBD,QAAhB,CAA5B;;AAEA,MAAIC,mBAAmB,KAAK,IAA5B,EAAkC;AAChC,WAAOD,QAAP;AACD;;AAED,QAAME,SAAS,GAAGC,cAAKC,OAAL,CAAaJ,QAAb,CAAlB;;AAEA,SAAOE,SAAP;AACD","sourcesContent":["import path from 'path';\nimport { isDirectory, isDirectorySync } from 'path-type';\n\nasync function getDirectory(filepath: string): Promise<string> {\n const filePathIsDirectory = await isDirectory(filepath);\n\n if (filePathIsDirectory === true) {\n return filepath;\n }\n\n const directory = path.dirname(filepath);\n\n return directory;\n}\n\nfunction getDirectorySync(filepath: string): string {\n const filePathIsDirectory = isDirectorySync(filepath);\n\n if (filePathIsDirectory === true) {\n return filepath;\n }\n\n const directory = path.dirname(filepath);\n\n return directory;\n}\n\nexport { getDirectory, getDirectorySync };\n"],"file":"getDirectory.js"}
5 node_modules/cosmiconfig/dist/getPropertyByPath.d.ts generated vendored Normal file
@@ -0,0 +1,5 @@
declare function getPropertyByPath(source: {
    [key: string]: unknown;
}, path: string | Array<string>): unknown;
export { getPropertyByPath };
//# sourceMappingURL=getPropertyByPath.d.ts.map

1 node_modules/cosmiconfig/dist/getPropertyByPath.d.ts.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"file":"getPropertyByPath.d.ts","sourceRoot":"","sources":["../src/getPropertyByPath.ts"],"names":[],"mappings":"AAKA,iBAAS,iBAAiB,CACxB,MAAM,EAAE;IAAE,CAAC,GAAG,EAAE,MAAM,GAAG,OAAO,CAAA;CAAE,EAClC,IAAI,EAAE,MAAM,GAAG,KAAK,CAAC,MAAM,CAAC,GAC3B,OAAO,CAgBT;AAED,OAAO,EAAE,iBAAiB,EAAE,CAAC"}
28 node_modules/cosmiconfig/dist/getPropertyByPath.js generated vendored Normal file
@@ -0,0 +1,28 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.getPropertyByPath = getPropertyByPath;

// Resolves property names or property paths defined with period-delimited
// strings or arrays of strings. Property names that are found on the source
// object are used directly (even if they include a period).
// Nested property names that include periods, within a path, are only
// understood in array paths.
function getPropertyByPath(source, path) {
  if (typeof path === 'string' && Object.prototype.hasOwnProperty.call(source, path)) {
    return source[path];
  }

  const parsedPath = typeof path === 'string' ? path.split('.') : path; // eslint-disable-next-line @typescript-eslint/no-explicit-any

  return parsedPath.reduce((previous, key) => {
    if (previous === undefined) {
      return previous;
    }

    return previous[key];
  }, source);
}
//# sourceMappingURL=getPropertyByPath.js.map

1 node_modules/cosmiconfig/dist/getPropertyByPath.js.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"sources":["../src/getPropertyByPath.ts"],"names":["getPropertyByPath","source","path","Object","prototype","hasOwnProperty","call","parsedPath","split","reduce","previous","key","undefined"],"mappings":";;;;;;;AAAA;AACA;AACA;AACA;AACA;AACA,SAASA,iBAAT,CACEC,MADF,EAEEC,IAFF,EAGW;AACT,MACE,OAAOA,IAAP,KAAgB,QAAhB,IACAC,MAAM,CAACC,SAAP,CAAiBC,cAAjB,CAAgCC,IAAhC,CAAqCL,MAArC,EAA6CC,IAA7C,CAFF,EAGE;AACA,WAAOD,MAAM,CAACC,IAAD,CAAb;AACD;;AAED,QAAMK,UAAU,GAAG,OAAOL,IAAP,KAAgB,QAAhB,GAA2BA,IAAI,CAACM,KAAL,CAAW,GAAX,CAA3B,GAA6CN,IAAhE,CARS,CAST;;AACA,SAAOK,UAAU,CAACE,MAAX,CAAkB,CAACC,QAAD,EAAgBC,GAAhB,KAAiC;AACxD,QAAID,QAAQ,KAAKE,SAAjB,EAA4B;AAC1B,aAAOF,QAAP;AACD;;AACD,WAAOA,QAAQ,CAACC,GAAD,CAAf;AACD,GALM,EAKJV,MALI,CAAP;AAMD","sourcesContent":["// Resolves property names or property paths defined with period-delimited\n// strings or arrays of strings. Property names that are found on the source\n// object are used directly (even if they include a period).\n// Nested property names that include periods, within a path, are only\n// understood in array paths.\nfunction getPropertyByPath(\n source: { [key: string]: unknown },\n path: string | Array<string>,\n): unknown {\n if (\n typeof path === 'string' &&\n Object.prototype.hasOwnProperty.call(source, path)\n ) {\n return source[path];\n }\n\n const parsedPath = typeof path === 'string' ? path.split('.') : path;\n // eslint-disable-next-line @typescript-eslint/no-explicit-any\n return parsedPath.reduce((previous: any, key): unknown => {\n if (previous === undefined) {\n return previous;\n }\n return previous[key];\n }, source);\n}\n\nexport { getPropertyByPath };\n"],"file":"getPropertyByPath.js"}
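The resolution rules described in `getPropertyByPath`'s comment can be demonstrated directly. This sketch copies the function's logic; the `lint.rules` key names are hypothetical examples:

```javascript
// Copy of getPropertyByPath's logic, as in the vendored file above.
function getPropertyByPath(source, path) {
  if (typeof path === 'string' && Object.prototype.hasOwnProperty.call(source, path)) {
    return source[path];
  }
  const parsedPath = typeof path === 'string' ? path.split('.') : path;
  return parsedPath.reduce(
    (previous, key) => (previous === undefined ? previous : previous[key]),
    source,
  );
}

const source = { 'lint.rules': 'top-level', lint: { rules: 'nested' } };

// A top-level key containing a period wins over traversal for string paths:
console.log(getPropertyByPath(source, 'lint.rules'));

// An array path always traverses, so it reaches the nested value:
console.log(getPropertyByPath(source, ['lint', 'rules']));
```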
44
node_modules/cosmiconfig/dist/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { Config, CosmiconfigResult, Loaders, LoadersSync } from './types';
declare type LoaderResult = Config | null;
export declare type Loader = ((filepath: string, content: string) => Promise<LoaderResult>) | LoaderSync;
export declare type LoaderSync = (filepath: string, content: string) => LoaderResult;
export declare type Transform = ((CosmiconfigResult: CosmiconfigResult) => Promise<CosmiconfigResult>) | TransformSync;
export declare type TransformSync = (CosmiconfigResult: CosmiconfigResult) => CosmiconfigResult;
interface OptionsBase {
    packageProp?: string;
    searchPlaces?: Array<string>;
    ignoreEmptySearchPlaces?: boolean;
    stopDir?: string;
    cache?: boolean;
}
export interface Options extends OptionsBase {
    loaders?: Loaders;
    transform?: Transform;
}
export interface OptionsSync extends OptionsBase {
    loaders?: LoadersSync;
    transform?: TransformSync;
}
declare function cosmiconfig(moduleName: string, options?: Options): {
    readonly search: (searchFrom?: string) => Promise<CosmiconfigResult>;
    readonly load: (filepath: string) => Promise<CosmiconfigResult>;
    readonly clearLoadCache: () => void;
    readonly clearSearchCache: () => void;
    readonly clearCaches: () => void;
};
declare function cosmiconfigSync(moduleName: string, options?: OptionsSync): {
    readonly search: (searchFrom?: string) => CosmiconfigResult;
    readonly load: (filepath: string) => CosmiconfigResult;
    readonly clearLoadCache: () => void;
    readonly clearSearchCache: () => void;
    readonly clearCaches: () => void;
};
declare const defaultLoaders: Readonly<{
    readonly '.js': LoaderSync;
    readonly '.json': LoaderSync;
    readonly '.yaml': LoaderSync;
    readonly '.yml': LoaderSync;
    readonly noExt: LoaderSync;
}>;
export { cosmiconfig, cosmiconfigSync, defaultLoaders };
//# sourceMappingURL=index.d.ts.map
1
node_modules/cosmiconfig/dist/index.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":"AAIA,OAAO,EACL,MAAM,EACN,iBAAiB,EAGjB,OAAO,EACP,WAAW,EACZ,MAAM,SAAS,CAAC;AAEjB,aAAK,YAAY,GAAG,MAAM,GAAG,IAAI,CAAC;AAClC,oBAAY,MAAM,GACd,CAAC,CAAC,QAAQ,EAAE,MAAM,EAAE,OAAO,EAAE,MAAM,KAAK,OAAO,CAAC,YAAY,CAAC,CAAC,GAC9D,UAAU,CAAC;AACf,oBAAY,UAAU,GAAG,CAAC,QAAQ,EAAE,MAAM,EAAE,OAAO,EAAE,MAAM,KAAK,YAAY,CAAC;AAE7E,oBAAY,SAAS,GACjB,CAAC,CAAC,iBAAiB,EAAE,iBAAiB,KAAK,OAAO,CAAC,iBAAiB,CAAC,CAAC,GACtE,aAAa,CAAC;AAElB,oBAAY,aAAa,GAAG,CAC1B,iBAAiB,EAAE,iBAAiB,KACjC,iBAAiB,CAAC;AAEvB,UAAU,WAAW;IACnB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,YAAY,CAAC,EAAE,KAAK,CAAC,MAAM,CAAC,CAAC;IAC7B,uBAAuB,CAAC,EAAE,OAAO,CAAC;IAClC,OAAO,CAAC,EAAE,MAAM,CAAC;IACjB,KAAK,CAAC,EAAE,OAAO,CAAC;CACjB;AAED,MAAM,WAAW,OAAQ,SAAQ,WAAW;IAC1C,OAAO,CAAC,EAAE,OAAO,CAAC;IAClB,SAAS,CAAC,EAAE,SAAS,CAAC;CACvB;AAED,MAAM,WAAW,WAAY,SAAQ,WAAW;IAC9C,OAAO,CAAC,EAAE,WAAW,CAAC;IACtB,SAAS,CAAC,EAAE,aAAa,CAAC;CAC3B;AAGD,iBAAS,WAAW,CAAC,UAAU,EAAE,MAAM,EAAE,OAAO,GAAE,OAAY;;;;;;EAe7D;AAGD,iBAAS,eAAe,CAAC,UAAU,EAAE,MAAM,EAAE,OAAO,GAAE,WAAgB;;;;;;EAerE;AAGD,QAAA,MAAM,cAAc;;;;;;EAMT,CAAC;AAgDZ,OAAO,EAAE,WAAW,EAAE,eAAe,EAAE,cAAc,EAAE,CAAC"}
80
node_modules/cosmiconfig/dist/index.js
generated
vendored
Normal file
@@ -0,0 +1,80 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.cosmiconfig = cosmiconfig;
exports.cosmiconfigSync = cosmiconfigSync;
exports.defaultLoaders = void 0;

var _os = _interopRequireDefault(require("os"));

var _Explorer = require("./Explorer");

var _ExplorerSync = require("./ExplorerSync");

var _loaders = require("./loaders");

var _types = require("./types");

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

// eslint-disable-next-line @typescript-eslint/explicit-function-return-type
function cosmiconfig(moduleName, options = {}) {
  const normalizedOptions = normalizeOptions(moduleName, options);
  const explorer = new _Explorer.Explorer(normalizedOptions);
  return {
    search: explorer.search.bind(explorer),
    load: explorer.load.bind(explorer),
    clearLoadCache: explorer.clearLoadCache.bind(explorer),
    clearSearchCache: explorer.clearSearchCache.bind(explorer),
    clearCaches: explorer.clearCaches.bind(explorer)
  };
} // eslint-disable-next-line @typescript-eslint/explicit-function-return-type


function cosmiconfigSync(moduleName, options = {}) {
  const normalizedOptions = normalizeOptions(moduleName, options);
  const explorerSync = new _ExplorerSync.ExplorerSync(normalizedOptions);
  return {
    search: explorerSync.searchSync.bind(explorerSync),
    load: explorerSync.loadSync.bind(explorerSync),
    clearLoadCache: explorerSync.clearLoadCache.bind(explorerSync),
    clearSearchCache: explorerSync.clearSearchCache.bind(explorerSync),
    clearCaches: explorerSync.clearCaches.bind(explorerSync)
  };
} // do not allow mutation of default loaders. Make sure it is set inside options


const defaultLoaders = Object.freeze({
  '.js': _loaders.loaders.loadJs,
  '.json': _loaders.loaders.loadJson,
  '.yaml': _loaders.loaders.loadYaml,
  '.yml': _loaders.loaders.loadYaml,
  noExt: _loaders.loaders.loadYaml
});
exports.defaultLoaders = defaultLoaders;

function normalizeOptions(moduleName, options) {
  const defaults = {
    packageProp: moduleName,
    searchPlaces: ['package.json', `.${moduleName}rc`, `.${moduleName}rc.json`, `.${moduleName}rc.yaml`, `.${moduleName}rc.yml`, `.${moduleName}rc.js`, `${moduleName}.config.js`],
    ignoreEmptySearchPlaces: true,
    stopDir: _os.default.homedir(),
    cache: true,
    transform: identity,
    loaders: defaultLoaders
  };
  const normalizedOptions = { ...defaults,
    ...options,
    loaders: { ...defaults.loaders,
      ...options.loaders
    }
  };
  return normalizedOptions;
}

const identity = function identity(x) {
  return x;
};
//# sourceMappingURL=index.js.map
1
node_modules/cosmiconfig/dist/index.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
4
node_modules/cosmiconfig/dist/loaders.d.ts
generated
vendored
Normal file
@@ -0,0 +1,4 @@
import { LoadersSync } from './types';
declare const loaders: LoadersSync;
export { loaders };
//# sourceMappingURL=loaders.d.ts.map
1
node_modules/cosmiconfig/dist/loaders.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"loaders.d.ts","sourceRoot":"","sources":["../src/loaders.ts"],"names":[],"mappings":"AAMA,OAAO,EAAE,WAAW,EAAE,MAAM,SAAS,CAAC;AA0CtC,QAAA,MAAM,OAAO,EAAE,WAA4C,CAAC;AAE5D,OAAO,EAAE,OAAO,EAAE,CAAC"}
60
node_modules/cosmiconfig/dist/loaders.js
generated
vendored
Normal file
@@ -0,0 +1,60 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.loaders = void 0;

/* eslint-disable @typescript-eslint/no-require-imports */
let importFresh;

const loadJs = function loadJs(filepath) {
  if (importFresh === undefined) {
    importFresh = require('import-fresh');
  }

  const result = importFresh(filepath);
  return result;
};

let parseJson;

const loadJson = function loadJson(filepath, content) {
  if (parseJson === undefined) {
    parseJson = require('parse-json');
  }

  try {
    const result = parseJson(content);
    return result;
  } catch (error) {
    error.message = `JSON Error in ${filepath}:\n${error.message}`;
    throw error;
  }
};

let yaml;

const loadYaml = function loadYaml(filepath, content) {
  if (yaml === undefined) {
    yaml = require('yaml');
  }

  try {
    const result = yaml.parse(content, {
      prettyErrors: true
    });
    return result;
  } catch (error) {
    error.message = `YAML Error in ${filepath}:\n${error.message}`;
    throw error;
  }
};

const loaders = {
  loadJs,
  loadJson,
  loadYaml
};
exports.loaders = loaders;
//# sourceMappingURL=loaders.js.map
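Each loader above defers `require()` of its parser until the first file of that type is actually loaded, and rewrites parse errors to name the offending file. A minimal sketch of that pattern, with the stdlib `JSON.parse` standing in for the `parse-json` dependency and hypothetical paths:

```javascript
// Lazy-require + error-annotation pattern from the loaders above.
let jsonParser; // stays undefined until a JSON file is actually loaded

function loadJsonSketch(filepath, content) {
  if (jsonParser === undefined) {
    jsonParser = JSON.parse; // in the real loader: require('parse-json')
  }

  try {
    return jsonParser(content);
  } catch (error) {
    // Prefix the parse error with the file it came from before rethrowing.
    error.message = `JSON Error in ${filepath}:\n${error.message}`;
    throw error;
  }
}

console.log(loadJsonSketch('/tmp/good.json', '{"a": 1}')); // { a: 1 }
try {
  loadJsonSketch('/tmp/bad.json', '{oops');
} catch (error) {
  console.log(error.message.startsWith('JSON Error in /tmp/bad.json:')); // true
}
```

The lazy require keeps `require('cosmiconfig')` cheap for consumers that never touch a given file type.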
1
node_modules/cosmiconfig/dist/loaders.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"sources":["../src/loaders.ts"],"names":["importFresh","loadJs","filepath","undefined","require","result","parseJson","loadJson","content","error","message","yaml","loadYaml","parse","prettyErrors","loaders"],"mappings":";;;;;;;AAAA;AAQA,IAAIA,WAAJ;;AACA,MAAMC,MAAkB,GAAG,SAASA,MAAT,CAAgBC,QAAhB,EAA0B;AACnD,MAAIF,WAAW,KAAKG,SAApB,EAA+B;AAC7BH,IAAAA,WAAW,GAAGI,OAAO,CAAC,cAAD,CAArB;AACD;;AAED,QAAMC,MAAM,GAAGL,WAAW,CAACE,QAAD,CAA1B;AACA,SAAOG,MAAP;AACD,CAPD;;AASA,IAAIC,SAAJ;;AACA,MAAMC,QAAoB,GAAG,SAASA,QAAT,CAAkBL,QAAlB,EAA4BM,OAA5B,EAAqC;AAChE,MAAIF,SAAS,KAAKH,SAAlB,EAA6B;AAC3BG,IAAAA,SAAS,GAAGF,OAAO,CAAC,YAAD,CAAnB;AACD;;AAED,MAAI;AACF,UAAMC,MAAM,GAAGC,SAAS,CAACE,OAAD,CAAxB;AACA,WAAOH,MAAP;AACD,GAHD,CAGE,OAAOI,KAAP,EAAc;AACdA,IAAAA,KAAK,CAACC,OAAN,GAAiB,iBAAgBR,QAAS,MAAKO,KAAK,CAACC,OAAQ,EAA7D;AACA,UAAMD,KAAN;AACD;AACF,CAZD;;AAcA,IAAIE,IAAJ;;AACA,MAAMC,QAAoB,GAAG,SAASA,QAAT,CAAkBV,QAAlB,EAA4BM,OAA5B,EAAqC;AAChE,MAAIG,IAAI,KAAKR,SAAb,EAAwB;AACtBQ,IAAAA,IAAI,GAAGP,OAAO,CAAC,MAAD,CAAd;AACD;;AAED,MAAI;AACF,UAAMC,MAAM,GAAGM,IAAI,CAACE,KAAL,CAAWL,OAAX,EAAoB;AAAEM,MAAAA,YAAY,EAAE;AAAhB,KAApB,CAAf;AACA,WAAOT,MAAP;AACD,GAHD,CAGE,OAAOI,KAAP,EAAc;AACdA,IAAAA,KAAK,CAACC,OAAN,GAAiB,iBAAgBR,QAAS,MAAKO,KAAK,CAACC,OAAQ,EAA7D;AACA,UAAMD,KAAN;AACD;AACF,CAZD;;AAcA,MAAMM,OAAoB,GAAG;AAAEd,EAAAA,MAAF;AAAUM,EAAAA,QAAV;AAAoBK,EAAAA;AAApB,CAA7B","sourcesContent":["/* eslint-disable @typescript-eslint/no-require-imports */\n\nimport parseJsonType from 'parse-json';\nimport yamlType from 'yaml';\nimport importFreshType from 'import-fresh';\nimport { LoaderSync } from './index';\nimport { LoadersSync } from './types';\n\nlet importFresh: typeof importFreshType;\nconst loadJs: LoaderSync = function loadJs(filepath) {\n  if (importFresh === undefined) {\n    importFresh = require('import-fresh');\n  }\n\n  const result = importFresh(filepath);\n  return result;\n};\n\nlet parseJson: typeof parseJsonType;\nconst loadJson: LoaderSync = function loadJson(filepath, content) {\n  if (parseJson === undefined) {\n    parseJson = require('parse-json');\n  }\n\n  try {\n    const result = parseJson(content);\n    return result;\n  } catch (error) {\n    error.message = `JSON Error in ${filepath}:\n${error.message}`;\n    throw error;\n  }\n};\n\nlet yaml: typeof yamlType;\nconst loadYaml: LoaderSync = function loadYaml(filepath, content) {\n  if (yaml === undefined) {\n    yaml = require('yaml');\n  }\n\n  try {\n    const result = yaml.parse(content, { prettyErrors: true });\n    return result;\n  } catch (error) {\n    error.message = `YAML Error in ${filepath}:\n${error.message}`;\n    throw error;\n  }\n};\n\nconst loaders: LoadersSync = { loadJs, loadJson, loadYaml };\n\nexport { loaders };\n"],"file":"loaders.js"}
7
node_modules/cosmiconfig/dist/readFile.d.ts
generated
vendored
Normal file
@@ -0,0 +1,7 @@
interface Options {
    throwNotFound?: boolean;
}
declare function readFile(filepath: string, options?: Options): Promise<string | null>;
declare function readFileSync(filepath: string, options?: Options): string | null;
export { readFile, readFileSync };
//# sourceMappingURL=readFile.d.ts.map
1
node_modules/cosmiconfig/dist/readFile.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"readFile.d.ts","sourceRoot":"","sources":["../src/readFile.ts"],"names":[],"mappings":"AAkBA,UAAU,OAAO;IACf,aAAa,CAAC,EAAE,OAAO,CAAC;CACzB;AAED,iBAAe,QAAQ,CACrB,QAAQ,EAAE,MAAM,EAChB,OAAO,GAAE,OAAY,GACpB,OAAO,CAAC,MAAM,GAAG,IAAI,CAAC,CAcxB;AAED,iBAAS,YAAY,CAAC,QAAQ,EAAE,MAAM,EAAE,OAAO,GAAE,OAAY,GAAG,MAAM,GAAG,IAAI,CAc5E;AAED,OAAO,EAAE,QAAQ,EAAE,YAAY,EAAE,CAAC"}
56
node_modules/cosmiconfig/dist/readFile.js
generated
vendored
Normal file
@@ -0,0 +1,56 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.readFile = readFile;
exports.readFileSync = readFileSync;

var _fs = _interopRequireDefault(require("fs"));

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

async function fsReadFileAsync(pathname, encoding) {
  return new Promise((resolve, reject) => {
    _fs.default.readFile(pathname, encoding, (error, contents) => {
      if (error) {
        reject(error);
        return;
      }

      resolve(contents);
    });
  });
}

async function readFile(filepath, options = {}) {
  const throwNotFound = options.throwNotFound === true;

  try {
    const content = await fsReadFileAsync(filepath, 'utf8');
    return content;
  } catch (error) {
    if (throwNotFound === false && error.code === 'ENOENT') {
      return null;
    }

    throw error;
  }
}

function readFileSync(filepath, options = {}) {
  const throwNotFound = options.throwNotFound === true;

  try {
    const content = _fs.default.readFileSync(filepath, 'utf8');

    return content;
  } catch (error) {
    if (throwNotFound === false && error.code === 'ENOENT') {
      return null;
    }

    throw error;
  }
}
//# sourceMappingURL=readFile.js.map
1
node_modules/cosmiconfig/dist/readFile.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"sources":["../src/readFile.ts"],"names":["fsReadFileAsync","pathname","encoding","Promise","resolve","reject","fs","readFile","error","contents","filepath","options","throwNotFound","content","code","readFileSync"],"mappings":";;;;;;;;AAAA;;;;AAEA,eAAeA,eAAf,CACEC,QADF,EAEEC,QAFF,EAGmB;AACjB,SAAO,IAAIC,OAAJ,CAAY,CAACC,OAAD,EAAUC,MAAV,KAA2B;AAC5CC,gBAAGC,QAAH,CAAYN,QAAZ,EAAsBC,QAAtB,EAAgC,CAACM,KAAD,EAAQC,QAAR,KAA2B;AACzD,UAAID,KAAJ,EAAW;AACTH,QAAAA,MAAM,CAACG,KAAD,CAAN;AACA;AACD;;AAEDJ,MAAAA,OAAO,CAACK,QAAD,CAAP;AACD,KAPD;AAQD,GATM,CAAP;AAUD;;AAMD,eAAeF,QAAf,CACEG,QADF,EAEEC,OAAgB,GAAG,EAFrB,EAG0B;AACxB,QAAMC,aAAa,GAAGD,OAAO,CAACC,aAAR,KAA0B,IAAhD;;AAEA,MAAI;AACF,UAAMC,OAAO,GAAG,MAAMb,eAAe,CAACU,QAAD,EAAW,MAAX,CAArC;AAEA,WAAOG,OAAP;AACD,GAJD,CAIE,OAAOL,KAAP,EAAc;AACd,QAAII,aAAa,KAAK,KAAlB,IAA2BJ,KAAK,CAACM,IAAN,KAAe,QAA9C,EAAwD;AACtD,aAAO,IAAP;AACD;;AAED,UAAMN,KAAN;AACD;AACF;;AAED,SAASO,YAAT,CAAsBL,QAAtB,EAAwCC,OAAgB,GAAG,EAA3D,EAA8E;AAC5E,QAAMC,aAAa,GAAGD,OAAO,CAACC,aAAR,KAA0B,IAAhD;;AAEA,MAAI;AACF,UAAMC,OAAO,GAAGP,YAAGS,YAAH,CAAgBL,QAAhB,EAA0B,MAA1B,CAAhB;;AAEA,WAAOG,OAAP;AACD,GAJD,CAIE,OAAOL,KAAP,EAAc;AACd,QAAII,aAAa,KAAK,KAAlB,IAA2BJ,KAAK,CAACM,IAAN,KAAe,QAA9C,EAAwD;AACtD,aAAO,IAAP;AACD;;AAED,UAAMN,KAAN;AACD;AACF","sourcesContent":["import fs from 'fs';\n\nasync function fsReadFileAsync(\n  pathname: string,\n  encoding: string,\n): Promise<string> {\n  return new Promise((resolve, reject): void => {\n    fs.readFile(pathname, encoding, (error, contents): void => {\n      if (error) {\n        reject(error);\n        return;\n      }\n\n      resolve(contents);\n    });\n  });\n}\n\ninterface Options {\n  throwNotFound?: boolean;\n}\n\nasync function readFile(\n  filepath: string,\n  options: Options = {},\n): Promise<string | null> {\n  const throwNotFound = options.throwNotFound === true;\n\n  try {\n    const content = await fsReadFileAsync(filepath, 'utf8');\n\n    return content;\n  } catch (error) {\n    if (throwNotFound === false && error.code === 'ENOENT') {\n      return null;\n    }\n\n    throw error;\n  }\n}\n\nfunction readFileSync(filepath: string, options: Options = {}): string | null {\n  const throwNotFound = options.throwNotFound === true;\n\n  try {\n    const content = fs.readFileSync(filepath, 'utf8');\n\n    return content;\n  } catch (error) {\n    if (throwNotFound === false && error.code === 'ENOENT') {\n      return null;\n    }\n\n    throw error;\n  }\n}\n\nexport { readFile, readFileSync };\n"],"file":"readFile.js"}
20
node_modules/cosmiconfig/dist/types.d.ts
generated
vendored
Normal file
@@ -0,0 +1,20 @@
import { Loader, LoaderSync, Options, OptionsSync } from './index';
export declare type Config = any;
export declare type CosmiconfigResult = {
    config: Config;
    filepath: string;
    isEmpty?: boolean;
} | null;
export interface ExplorerOptions extends Required<Options> {
}
export interface ExplorerOptionsSync extends Required<OptionsSync> {
}
export declare type Cache = Map<string, CosmiconfigResult>;
export declare type LoadedFileContent = Config | null | undefined;
export interface Loaders {
    [key: string]: Loader;
}
export interface LoadersSync {
    [key: string]: LoaderSync;
}
//# sourceMappingURL=types.d.ts.map
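The `CosmiconfigResult` declaration above is the shape every `search()`/`load()` call resolves to: either `null` (nothing found) or an object carrying the parsed config and its origin. A hypothetical successful result, with illustrative values:

```javascript
// Hypothetical CosmiconfigResult for a found config file; the values are
// illustrative, only the shape comes from the declaration above.
const result = {
  config: { port: 8080 },       // Config is declared as `any`
  filepath: '/project/.demorc', // absolute path the config was loaded from
  isEmpty: false                // set when the file existed but held nothing
};

const notFound = null; // search() resolves to null when no search place matches

console.log(result.config.port, notFound); // 8080 null
```

Checking `result === null` before touching `result.config` is therefore mandatory for callers.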
1
node_modules/cosmiconfig/dist/types.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"types.d.ts","sourceRoot":"","sources":["../src/types.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,WAAW,EAAE,MAAM,SAAS,CAAC;AAGnE,oBAAY,MAAM,GAAG,GAAG,CAAC;AAEzB,oBAAY,iBAAiB,GAAG;IAC9B,MAAM,EAAE,MAAM,CAAC;IACf,QAAQ,EAAE,MAAM,CAAC;IACjB,OAAO,CAAC,EAAE,OAAO,CAAC;CACnB,GAAG,IAAI,CAAC;AAIT,MAAM,WAAW,eAAgB,SAAQ,QAAQ,CAAC,OAAO,CAAC;CAAG;AAC7D,MAAM,WAAW,mBAAoB,SAAQ,QAAQ,CAAC,WAAW,CAAC;CAAG;AAGrE,oBAAY,KAAK,GAAG,GAAG,CAAC,MAAM,EAAE,iBAAiB,CAAC,CAAC;AAMnD,oBAAY,iBAAiB,GAAG,MAAM,GAAG,IAAI,GAAG,SAAS,CAAC;AAE1D,MAAM,WAAW,OAAO;IACtB,CAAC,GAAG,EAAE,MAAM,GAAG,MAAM,CAAC;CACvB;AAED,MAAM,WAAW,WAAW;IAC1B,CAAC,GAAG,EAAE,MAAM,GAAG,UAAU,CAAC;CAC3B"}
2
node_modules/cosmiconfig/dist/types.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
"use strict";
//# sourceMappingURL=types.js.map
1
node_modules/cosmiconfig/dist/types.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"sources":[],"names":[],"mappings":"","sourcesContent":[],"file":"types.js"}
13
node_modules/cosmiconfig/node_modules/yaml/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,13 @@
Copyright 2018 Eemeli Aro <eemeli@gmail.com>

Permission to use, copy, modify, and/or distribute this software for any purpose
with or without fee is hereby granted, provided that the above copyright notice
and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS
OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER
TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF
THIS SOFTWARE.
127
node_modules/cosmiconfig/node_modules/yaml/README.md
generated
vendored
Normal file
@@ -0,0 +1,127 @@
# YAML <a href="https://www.npmjs.com/package/yaml"><img align="right" src="https://badge.fury.io/js/yaml.svg" title="npm package" /></a>

`yaml` is a JavaScript parser and stringifier for [YAML](http://yaml.org/), a human friendly data serialization standard. It supports both parsing and stringifying data using all versions of YAML, along with all common data schemas. As a particularly distinguishing feature, `yaml` fully supports reading and writing comments and blank lines in YAML documents.

The library is released under the ISC open source license, and the code is [available on GitHub](https://github.com/eemeli/yaml/). It has no external dependencies and runs on Node.js 6 and later, and in browsers from IE 11 upwards.

For the purposes of versioning, any changes that break any of the endpoints or APIs documented here will be considered semver-major breaking changes. Undocumented library internals may change between minor versions, and previous APIs may be deprecated (but not removed).

For more information, see the project's documentation site: [**eemeli.org/yaml/v1**](https://eemeli.org/yaml/v1/)

To install:

```sh
npm install yaml
```

**Note:** This is `yaml@1`. You may also be interested in the next version, currently available as [`yaml@next`](https://www.npmjs.com/package/yaml/v/next).

## API Overview

The API provided by `yaml` has three layers, depending on how deep you need to go: [Parse & Stringify](https://eemeli.org/yaml/v1/#parse-amp-stringify), [Documents](https://eemeli.org/yaml/#documents), and the [CST Parser](https://eemeli.org/yaml/#cst-parser). The first has the simplest API and "just works", the second gets you all the bells and whistles supported by the library along with a decent [AST](https://eemeli.org/yaml/#content-nodes), and the third is the closest to YAML source, making it fast, raw, and crude.

```js
import YAML from 'yaml'
// or
const YAML = require('yaml')
```

### Parse & Stringify

- [`YAML.parse(str, options): value`](https://eemeli.org/yaml/v1/#yaml-parse)
- [`YAML.stringify(value, options): string`](https://eemeli.org/yaml/v1/#yaml-stringify)

### YAML Documents

- [`YAML.createNode(value, wrapScalars, tag): Node`](https://eemeli.org/yaml/v1/#creating-nodes)
- [`YAML.defaultOptions`](https://eemeli.org/yaml/v1/#options)
- [`YAML.Document`](https://eemeli.org/yaml/v1/#yaml-documents)
  - [`constructor(options)`](https://eemeli.org/yaml/v1/#creating-documents)
  - [`defaults`](https://eemeli.org/yaml/v1/#options)
  - [`#anchors`](https://eemeli.org/yaml/v1/#working-with-anchors)
  - [`#contents`](https://eemeli.org/yaml/v1/#content-nodes)
  - [`#errors`](https://eemeli.org/yaml/v1/#errors)
- [`YAML.parseAllDocuments(str, options): YAML.Document[]`](https://eemeli.org/yaml/v1/#parsing-documents)
- [`YAML.parseDocument(str, options): YAML.Document`](https://eemeli.org/yaml/v1/#parsing-documents)

```js
import { Pair, YAMLMap, YAMLSeq } from 'yaml/types'
```

- [`new Pair(key, value)`](https://eemeli.org/yaml/v1/#creating-nodes)
- [`new YAMLMap()`](https://eemeli.org/yaml/v1/#creating-nodes)
- [`new YAMLSeq()`](https://eemeli.org/yaml/v1/#creating-nodes)

### CST Parser

```js
import parseCST from 'yaml/parse-cst'
```

- [`parseCST(str): CSTDocument[]`](https://eemeli.org/yaml/v1/#parsecst)
- [`YAML.parseCST(str): CSTDocument[]`](https://eemeli.org/yaml/v1/#parsecst)

## YAML.parse

```yaml
# file.yml
YAML:
  - A human-readable data serialization language
  - https://en.wikipedia.org/wiki/YAML
yaml:
  - A complete JavaScript implementation
  - https://www.npmjs.com/package/yaml
```

```js
import fs from 'fs'
import YAML from 'yaml'

YAML.parse('3.14159')
// 3.14159

YAML.parse('[ true, false, maybe, null ]\n')
// [ true, false, 'maybe', null ]

const file = fs.readFileSync('./file.yml', 'utf8')
YAML.parse(file)
// { YAML:
//    [ 'A human-readable data serialization language',
//      'https://en.wikipedia.org/wiki/YAML' ],
//   yaml:
//    [ 'A complete JavaScript implementation',
//      'https://www.npmjs.com/package/yaml' ] }
```

## YAML.stringify

```js
import YAML from 'yaml'

YAML.stringify(3.14159)
// '3.14159\n'

YAML.stringify([true, false, 'maybe', null])
// `- true
// - false
// - maybe
// - null
// `

YAML.stringify({ number: 3, plain: 'string', block: 'two\nlines\n' })
// `number: 3
// plain: string
// block: >
//   two
//
//   lines
// `
```

---

Browser testing provided by:

<a href="https://www.browserstack.com/open-source">
  <img width=200 src="https://eemeli.org/yaml/images/browserstack.svg" />
</a>
751
node_modules/cosmiconfig/node_modules/yaml/browser/dist/PlainValue-183afbad.js
generated
vendored
Normal file
@@ -0,0 +1,751 @@
const Char = {
  ANCHOR: '&',
  COMMENT: '#',
  TAG: '!',
  DIRECTIVES_END: '-',
  DOCUMENT_END: '.'
};
const Type = {
  ALIAS: 'ALIAS',
  BLANK_LINE: 'BLANK_LINE',
  BLOCK_FOLDED: 'BLOCK_FOLDED',
  BLOCK_LITERAL: 'BLOCK_LITERAL',
  COMMENT: 'COMMENT',
  DIRECTIVE: 'DIRECTIVE',
  DOCUMENT: 'DOCUMENT',
  FLOW_MAP: 'FLOW_MAP',
  FLOW_SEQ: 'FLOW_SEQ',
  MAP: 'MAP',
  MAP_KEY: 'MAP_KEY',
  MAP_VALUE: 'MAP_VALUE',
  PLAIN: 'PLAIN',
  QUOTE_DOUBLE: 'QUOTE_DOUBLE',
  QUOTE_SINGLE: 'QUOTE_SINGLE',
  SEQ: 'SEQ',
  SEQ_ITEM: 'SEQ_ITEM'
};
const defaultTagPrefix = 'tag:yaml.org,2002:';
const defaultTags = {
  MAP: 'tag:yaml.org,2002:map',
  SEQ: 'tag:yaml.org,2002:seq',
  STR: 'tag:yaml.org,2002:str'
};

function findLineStarts(src) {
  const ls = [0];
  let offset = src.indexOf('\n');
  while (offset !== -1) {
    offset += 1;
    ls.push(offset);
    offset = src.indexOf('\n', offset);
  }
  return ls;
}
function getSrcInfo(cst) {
  let lineStarts, src;
  if (typeof cst === 'string') {
    lineStarts = findLineStarts(cst);
    src = cst;
  } else {
    if (Array.isArray(cst)) cst = cst[0];
    if (cst && cst.context) {
      if (!cst.lineStarts) cst.lineStarts = findLineStarts(cst.context.src);
      lineStarts = cst.lineStarts;
      src = cst.context.src;
    }
  }
  return {
    lineStarts,
    src
  };
}

/**
 * @typedef {Object} LinePos - One-indexed position in the source
 * @property {number} line
 * @property {number} col
 */

/**
 * Determine the line/col position matching a character offset.
 *
 * Accepts a source string or a CST document as the second parameter. With
 * the latter, starting indices for lines are cached in the document as
 * `lineStarts: number[]`.
 *
 * Returns a one-indexed `{ line, col }` location if found, or
 * `undefined` otherwise.
 *
 * @param {number} offset
 * @param {string|Document|Document[]} cst
 * @returns {?LinePos}
 */
function getLinePos(offset, cst) {
  if (typeof offset !== 'number' || offset < 0) return null;
  const {
    lineStarts,
    src
  } = getSrcInfo(cst);
  if (!lineStarts || !src || offset > src.length) return null;
  for (let i = 0; i < lineStarts.length; ++i) {
    const start = lineStarts[i];
    if (offset < start) {
      return {
        line: i,
        col: offset - lineStarts[i - 1] + 1
      };
    }
    if (offset === start) return {
      line: i + 1,
      col: 1
    };
  }
  const line = lineStarts.length;
  return {
    line,
    col: offset - lineStarts[line - 1] + 1
  };
}
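For a plain string source, the two helpers above reduce to: index the position after every `'\n'`, then scan the offset into a one-indexed `{ line, col }`. A standalone sketch of that reduction (not the vendored module itself, which also accepts CST documents):

```javascript
// Record the start offset of every line: 0, plus the position after each '\n'.
function findLineStarts(src) {
  const ls = [0];
  let offset = src.indexOf('\n');
  while (offset !== -1) {
    offset += 1;
    ls.push(offset);
    offset = src.indexOf('\n', offset);
  }
  return ls;
}

// Map a character offset to a one-indexed { line, col }, or null if invalid.
function getLinePos(offset, src) {
  if (typeof offset !== 'number' || offset < 0) return null;
  const lineStarts = findLineStarts(src);
  if (offset > src.length) return null;
  for (let i = 0; i < lineStarts.length; ++i) {
    const start = lineStarts[i];
    if (offset < start) {
      return { line: i, col: offset - lineStarts[i - 1] + 1 };
    }
    if (offset === start) return { line: i + 1, col: 1 };
  }
  const line = lineStarts.length;
  return { line, col: offset - lineStarts[line - 1] + 1 };
}

const src = 'foo: 1\nbar: 2\n';
console.log(getLinePos(0, src)); // { line: 1, col: 1 }
console.log(getLinePos(8, src)); // offset 8 is the 'a' of "bar" → { line: 2, col: 2 }
```

Caching `lineStarts` on the document, as the real module does, makes repeated error-position lookups O(lines) instead of re-scanning the source each time.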
|
||||
|
||||
/**
|
||||
* Get a specified line from the source.
|
||||
*
|
||||
* Accepts a source string or a CST document as the second parameter. With
|
||||
* the latter, starting indices for lines are cached in the document as
|
||||
* `lineStarts: number[]`.
|
||||
*
|
||||
* Returns the line as a string if found, or `null` otherwise.
|
||||
*
|
||||
* @param {number} line One-indexed line number
|
||||
* @param {string|Document|Document[]} cst
|
||||
* @returns {?string}
|
||||
*/
|
||||
function getLine(line, cst) {
|
||||
const {
|
||||
lineStarts,
|
||||
src
|
||||
} = getSrcInfo(cst);
|
||||
if (!lineStarts || !(line >= 1) || line > lineStarts.length) return null;
|
||||
const start = lineStarts[line - 1];
|
||||
let end = lineStarts[line]; // undefined for last line; that's ok for slice()
|
||||
while (end && end > start && src[end - 1] === '\n') --end;
|
||||
return src.slice(start, end);
|
||||
}
|
||||
|
||||
/**
 * Pretty-print the starting line from the source indicated by the range `pos`
 *
 * Trims output to `maxWidth` chars while keeping the starting column visible,
 * using `…` at either end to indicate dropped characters.
 *
 * Returns a two-line string (or `null`) with `\n` as separator; the second line
 * will hold appropriately indented `^` marks indicating the column range.
 *
 * @param {Object} pos
 * @param {LinePos} pos.start
 * @param {LinePos} [pos.end]
 * @param {string|Document|Document[]} cst
 * @param {number} [maxWidth=80]
 * @returns {?string}
 */
function getPrettyContext({ start, end }, cst, maxWidth = 80) {
  let src = getLine(start.line, cst);
  if (!src) return null;
  let { col } = start;
  if (src.length > maxWidth) {
    if (col <= maxWidth - 10) {
      src = src.substr(0, maxWidth - 1) + '…';
    } else {
      const halfWidth = Math.round(maxWidth / 2);
      if (src.length > col + halfWidth) src = src.substr(0, col + halfWidth - 1) + '…';
      col -= src.length - maxWidth;
      src = '…' + src.substr(1 - maxWidth);
    }
  }
  let errLen = 1;
  let errEnd = '';
  if (end) {
    if (end.line === start.line && col + (end.col - start.col) <= maxWidth + 1) {
      errLen = end.col - start.col;
    } else {
      errLen = Math.min(src.length + 1, maxWidth) - col;
      errEnd = '…';
    }
  }
  const offset = col > 1 ? ' '.repeat(col - 1) : '';
  const err = '^'.repeat(errLen);
  return `${src}\n${offset}${err}${errEnd}`;
}

class Range {
  static copy(orig) {
    return new Range(orig.start, orig.end);
  }

  constructor(start, end) {
    this.start = start;
    this.end = end || start;
  }

  isEmpty() {
    return typeof this.start !== 'number' || !this.end || this.end <= this.start;
  }

  /**
   * Set `origStart` and `origEnd` to point to the original source range for
   * this node, which may differ due to dropped CR characters.
   *
   * @param {number[]} cr - Positions of dropped CR characters
   * @param {number} offset - Starting index of `cr` from the last call
   * @returns {number} - The next offset, matching the one found for `origStart`
   */
  setOrigRange(cr, offset) {
    const { start, end } = this;
    if (cr.length === 0 || end <= cr[0]) {
      this.origStart = start;
      this.origEnd = end;
      return offset;
    }
    let i = offset;
    while (i < cr.length) {
      if (cr[i] > start) break;
      else ++i;
    }
    this.origStart = start + i;
    const nextOffset = i;
    while (i < cr.length) {
      // if end was at \n, it should now be at \r
      if (cr[i] >= end) break;
      else ++i;
    }
    this.origEnd = end + i;
    return nextOffset;
  }
}

/** Root class of all nodes */
class Node {
  static addStringTerminator(src, offset, str) {
    if (str[str.length - 1] === '\n') return str;
    const next = Node.endOfWhiteSpace(src, offset);
    return next >= src.length || src[next] === '\n' ? str + '\n' : str;
  }

  // ^(---|...)
  static atDocumentBoundary(src, offset, sep) {
    const ch0 = src[offset];
    if (!ch0) return true;
    const prev = src[offset - 1];
    if (prev && prev !== '\n') return false;
    if (sep) {
      if (ch0 !== sep) return false;
    } else {
      if (ch0 !== Char.DIRECTIVES_END && ch0 !== Char.DOCUMENT_END) return false;
    }
    const ch1 = src[offset + 1];
    const ch2 = src[offset + 2];
    if (ch1 !== ch0 || ch2 !== ch0) return false;
    const ch3 = src[offset + 3];
    return !ch3 || ch3 === '\n' || ch3 === '\t' || ch3 === ' ';
  }

  static endOfIdentifier(src, offset) {
    let ch = src[offset];
    const isVerbatim = ch === '<';
    const notOk = isVerbatim ? ['\n', '\t', ' ', '>'] : ['\n', '\t', ' ', '[', ']', '{', '}', ','];
    while (ch && notOk.indexOf(ch) === -1) ch = src[offset += 1];
    if (isVerbatim && ch === '>') offset += 1;
    return offset;
  }

  static endOfIndent(src, offset) {
    let ch = src[offset];
    while (ch === ' ') ch = src[offset += 1];
    return offset;
  }

  static endOfLine(src, offset) {
    let ch = src[offset];
    while (ch && ch !== '\n') ch = src[offset += 1];
    return offset;
  }

  static endOfWhiteSpace(src, offset) {
    let ch = src[offset];
    while (ch === '\t' || ch === ' ') ch = src[offset += 1];
    return offset;
  }

  static startOfLine(src, offset) {
    let ch = src[offset - 1];
    if (ch === '\n') return offset;
    while (ch && ch !== '\n') ch = src[offset -= 1];
    return offset + 1;
  }

  /**
   * End of indentation, or null if the line's indent level is not more
   * than `indent`
   *
   * @param {string} src
   * @param {number} indent
   * @param {number} lineStart
   * @returns {?number}
   */
  static endOfBlockIndent(src, indent, lineStart) {
    const inEnd = Node.endOfIndent(src, lineStart);
    if (inEnd > lineStart + indent) {
      return inEnd;
    } else {
      const wsEnd = Node.endOfWhiteSpace(src, inEnd);
      const ch = src[wsEnd];
      if (!ch || ch === '\n') return wsEnd;
    }
    return null;
  }

  static atBlank(src, offset, endAsBlank) {
    const ch = src[offset];
    return ch === '\n' || ch === '\t' || ch === ' ' || endAsBlank && !ch;
  }

  static nextNodeIsIndented(ch, indentDiff, indicatorAsIndent) {
    if (!ch || indentDiff < 0) return false;
    if (indentDiff > 0) return true;
    return indicatorAsIndent && ch === '-';
  }

  // should be at line or string end, or at next non-whitespace char
  static normalizeOffset(src, offset) {
    const ch = src[offset];
    return !ch ? offset : ch !== '\n' && src[offset - 1] === '\n' ? offset - 1 : Node.endOfWhiteSpace(src, offset);
  }

  // fold single newline into space, multiple newlines to N - 1 newlines
  // presumes src[offset] === '\n'
  static foldNewline(src, offset, indent) {
    let inCount = 0;
    let error = false;
    let fold = '';
    let ch = src[offset + 1];
    while (ch === ' ' || ch === '\t' || ch === '\n') {
      switch (ch) {
        case '\n':
          inCount = 0;
          offset += 1;
          fold += '\n';
          break;
        case '\t':
          if (inCount <= indent) error = true;
          offset = Node.endOfWhiteSpace(src, offset + 2) - 1;
          break;
        case ' ':
          inCount += 1;
          offset += 1;
          break;
      }
      ch = src[offset + 1];
    }
    if (!fold) fold = ' ';
    if (ch && inCount <= indent) error = true;
    return { fold, offset, error };
  }

  constructor(type, props, context) {
    Object.defineProperty(this, 'context', {
      value: context || null,
      writable: true
    });
    this.error = null;
    this.range = null;
    this.valueRange = null;
    this.props = props || [];
    this.type = type;
    this.value = null;
  }

  getPropValue(idx, key, skipKey) {
    if (!this.context) return null;
    const { src } = this.context;
    const prop = this.props[idx];
    return prop && src[prop.start] === key ? src.slice(prop.start + (skipKey ? 1 : 0), prop.end) : null;
  }

  get anchor() {
    for (let i = 0; i < this.props.length; ++i) {
      const anchor = this.getPropValue(i, Char.ANCHOR, true);
      if (anchor != null) return anchor;
    }
    return null;
  }

  get comment() {
    const comments = [];
    for (let i = 0; i < this.props.length; ++i) {
      const comment = this.getPropValue(i, Char.COMMENT, true);
      if (comment != null) comments.push(comment);
    }
    return comments.length > 0 ? comments.join('\n') : null;
  }

  commentHasRequiredWhitespace(start) {
    const { src } = this.context;
    if (this.header && start === this.header.end) return false;
    if (!this.valueRange) return false;
    const { end } = this.valueRange;
    return start !== end || Node.atBlank(src, end - 1);
  }

  get hasComment() {
    if (this.context) {
      const { src } = this.context;
      for (let i = 0; i < this.props.length; ++i) {
        if (src[this.props[i].start] === Char.COMMENT) return true;
      }
    }
    return false;
  }

  get hasProps() {
    if (this.context) {
      const { src } = this.context;
      for (let i = 0; i < this.props.length; ++i) {
        if (src[this.props[i].start] !== Char.COMMENT) return true;
      }
    }
    return false;
  }

  get includesTrailingLines() {
    return false;
  }

  get jsonLike() {
    const jsonLikeTypes = [Type.FLOW_MAP, Type.FLOW_SEQ, Type.QUOTE_DOUBLE, Type.QUOTE_SINGLE];
    return jsonLikeTypes.indexOf(this.type) !== -1;
  }

  get rangeAsLinePos() {
    if (!this.range || !this.context) return undefined;
    const start = getLinePos(this.range.start, this.context.root);
    if (!start) return undefined;
    const end = getLinePos(this.range.end, this.context.root);
    return { start, end };
  }

  get rawValue() {
    if (!this.valueRange || !this.context) return null;
    const { start, end } = this.valueRange;
    return this.context.src.slice(start, end);
  }

  get tag() {
    for (let i = 0; i < this.props.length; ++i) {
      const tag = this.getPropValue(i, Char.TAG, false);
      if (tag != null) {
        if (tag[1] === '<') {
          return { verbatim: tag.slice(2, -1) };
        } else {
          // eslint-disable-next-line no-unused-vars
          const [_, handle, suffix] = tag.match(/^(.*!)([^!]*)$/);
          return { handle, suffix };
        }
      }
    }
    return null;
  }

  get valueRangeContainsNewline() {
    if (!this.valueRange || !this.context) return false;
    const { start, end } = this.valueRange;
    const { src } = this.context;
    for (let i = start; i < end; ++i) {
      if (src[i] === '\n') return true;
    }
    return false;
  }

  parseComment(start) {
    const { src } = this.context;
    if (src[start] === Char.COMMENT) {
      const end = Node.endOfLine(src, start + 1);
      const commentRange = new Range(start, end);
      this.props.push(commentRange);
      return end;
    }
    return start;
  }

  /**
   * Populates the `origStart` and `origEnd` values of all ranges for this
   * node. Extended by child classes to handle descendant nodes.
   *
   * @param {number[]} cr - Positions of dropped CR characters
   * @param {number} offset - Starting index of `cr` from the last call
   * @returns {number} - The next offset, matching the one found for `origStart`
   */
  setOrigRanges(cr, offset) {
    if (this.range) offset = this.range.setOrigRange(cr, offset);
    if (this.valueRange) this.valueRange.setOrigRange(cr, offset);
    this.props.forEach(prop => prop.setOrigRange(cr, offset));
    return offset;
  }

  toString() {
    const { context: { src }, range, value } = this;
    if (value != null) return value;
    const str = src.slice(range.start, range.end);
    return Node.addStringTerminator(src, range.end, str);
  }
}

class YAMLError extends Error {
  constructor(name, source, message) {
    if (!message || !(source instanceof Node)) throw new Error(`Invalid arguments for new ${name}`);
    super();
    this.name = name;
    this.message = message;
    this.source = source;
  }

  makePretty() {
    if (!this.source) return;
    this.nodeType = this.source.type;
    const cst = this.source.context && this.source.context.root;
    if (typeof this.offset === 'number') {
      this.range = new Range(this.offset, this.offset + 1);
      const start = cst && getLinePos(this.offset, cst);
      if (start) {
        const end = { line: start.line, col: start.col + 1 };
        this.linePos = { start, end };
      }
      delete this.offset;
    } else {
      this.range = this.source.range;
      this.linePos = this.source.rangeAsLinePos;
    }
    if (this.linePos) {
      const { line, col } = this.linePos.start;
      this.message += ` at line ${line}, column ${col}`;
      const ctx = cst && getPrettyContext(this.linePos, cst);
      if (ctx) this.message += `:\n\n${ctx}\n`;
    }
    delete this.source;
  }
}

class YAMLReferenceError extends YAMLError {
  constructor(source, message) {
    super('YAMLReferenceError', source, message);
  }
}

class YAMLSemanticError extends YAMLError {
  constructor(source, message) {
    super('YAMLSemanticError', source, message);
  }
}

class YAMLSyntaxError extends YAMLError {
  constructor(source, message) {
    super('YAMLSyntaxError', source, message);
  }
}

class YAMLWarning extends YAMLError {
  constructor(source, message) {
    super('YAMLWarning', source, message);
  }
}

function _defineProperty(e, r, t) {
  return (r = _toPropertyKey(r)) in e ? Object.defineProperty(e, r, {
    value: t,
    enumerable: !0,
    configurable: !0,
    writable: !0
  }) : e[r] = t, e;
}

function _toPrimitive(t, r) {
  if ("object" != typeof t || !t) return t;
  var e = t[Symbol.toPrimitive];
  if (void 0 !== e) {
    var i = e.call(t, r || "default");
    if ("object" != typeof i) return i;
    throw new TypeError("@@toPrimitive must return a primitive value.");
  }
  return ("string" === r ? String : Number)(t);
}

function _toPropertyKey(t) {
  var i = _toPrimitive(t, "string");
  return "symbol" == typeof i ? i : i + "";
}

class PlainValue extends Node {
  static endOfLine(src, start, inFlow) {
    let ch = src[start];
    let offset = start;
    while (ch && ch !== '\n') {
      if (inFlow && (ch === '[' || ch === ']' || ch === '{' || ch === '}' || ch === ',')) break;
      const next = src[offset + 1];
      if (ch === ':' && (!next || next === '\n' || next === '\t' || next === ' ' || inFlow && next === ',')) break;
      if ((ch === ' ' || ch === '\t') && next === '#') break;
      offset += 1;
      ch = next;
    }
    return offset;
  }

  get strValue() {
    if (!this.valueRange || !this.context) return null;
    let { start, end } = this.valueRange;
    const { src } = this.context;
    let ch = src[end - 1];
    while (start < end && (ch === '\n' || ch === '\t' || ch === ' ')) ch = src[--end - 1];
    let str = '';
    for (let i = start; i < end; ++i) {
      const ch = src[i];
      if (ch === '\n') {
        const { fold, offset } = Node.foldNewline(src, i, -1);
        str += fold;
        i = offset;
      } else if (ch === ' ' || ch === '\t') {
        // trim trailing whitespace
        const wsStart = i;
        let next = src[i + 1];
        while (i < end && (next === ' ' || next === '\t')) {
          i += 1;
          next = src[i + 1];
        }
        if (next !== '\n') str += i > wsStart ? src.slice(wsStart, i + 1) : ch;
      } else {
        str += ch;
      }
    }
    const ch0 = src[start];
    switch (ch0) {
      case '\t': {
        const msg = 'Plain value cannot start with a tab character';
        const errors = [new YAMLSemanticError(this, msg)];
        return { errors, str };
      }
      case '@':
      case '`': {
        const msg = `Plain value cannot start with reserved character ${ch0}`;
        const errors = [new YAMLSemanticError(this, msg)];
        return { errors, str };
      }
      default:
        return str;
    }
  }

  parseBlockValue(start) {
    const { indent, inFlow, src } = this.context;
    let offset = start;
    let valueEnd = start;
    for (let ch = src[offset]; ch === '\n'; ch = src[offset]) {
      if (Node.atDocumentBoundary(src, offset + 1)) break;
      const end = Node.endOfBlockIndent(src, indent, offset + 1);
      if (end === null || src[end] === '#') break;
      if (src[end] === '\n') {
        offset = end;
      } else {
        valueEnd = PlainValue.endOfLine(src, end, inFlow);
        offset = valueEnd;
      }
    }
    if (this.valueRange.isEmpty()) this.valueRange.start = start;
    this.valueRange.end = valueEnd;
    return valueEnd;
  }

  /**
   * Parses a plain value from the source
   *
   * Accepted forms are:
   * ```
   * #comment
   *
   * first line
   *
   * first line #comment
   *
   * first line
   * block
   * lines
   *
   * #comment
   * block
   * lines
   * ```
   * where block lines are empty or have an indent level greater than `indent`.
   *
   * @param {ParseContext} context
   * @param {number} start - Index of first character
   * @returns {number} - Index of the character after this scalar, may be `\n`
   */
  parse(context, start) {
    this.context = context;
    const { inFlow, src } = context;
    let offset = start;
    const ch = src[offset];
    if (ch && ch !== '#' && ch !== '\n') {
      offset = PlainValue.endOfLine(src, start, inFlow);
    }
    this.valueRange = new Range(start, offset);
    offset = Node.endOfWhiteSpace(src, offset);
    offset = this.parseComment(offset);
    if (!this.hasComment || this.valueRange.isEmpty()) {
      offset = this.parseBlockValue(offset);
    }
    return offset;
  }
}

export { Char as C, Node as N, PlainValue as P, Range as R, Type as T, YAMLSyntaxError as Y, _defineProperty as _, YAMLWarning as a, YAMLSemanticError as b, YAMLError as c, defaultTagPrefix as d, defaultTags as e, YAMLReferenceError as f };
467
node_modules/cosmiconfig/node_modules/yaml/browser/dist/Schema-9530c078.js
generated
vendored
Normal file
@@ -0,0 +1,467 @@
import { _ as _defineProperty, d as defaultTagPrefix, e as defaultTags } from './PlainValue-183afbad.js';
import { d as YAMLMap, g as resolveMap, Y as YAMLSeq, h as resolveSeq, j as resolveString, c as stringifyString, s as strOptions, S as Scalar, n as nullOptions, a as boolOptions, i as intOptions, k as stringifyNumber, N as Node, A as Alias, P as Pair } from './resolveSeq-67caf78a.js';
import { b as binary, o as omap, p as pairs, s as set, i as intTime, f as floatTime, t as timestamp, a as warnOptionDeprecation } from './warnings-5e4358fe.js';

function createMap(schema, obj, ctx) {
  const map = new YAMLMap(schema);
  if (obj instanceof Map) {
    for (const [key, value] of obj) map.items.push(schema.createPair(key, value, ctx));
  } else if (obj && typeof obj === 'object') {
    for (const key of Object.keys(obj)) map.items.push(schema.createPair(key, obj[key], ctx));
  }
  if (typeof schema.sortMapEntries === 'function') {
    map.items.sort(schema.sortMapEntries);
  }
  return map;
}

const map = {
  createNode: createMap,
  default: true,
  nodeClass: YAMLMap,
  tag: 'tag:yaml.org,2002:map',
  resolve: resolveMap
};

function createSeq(schema, obj, ctx) {
  const seq = new YAMLSeq(schema);
  if (obj && obj[Symbol.iterator]) {
    for (const it of obj) {
      const v = schema.createNode(it, ctx.wrapScalars, null, ctx);
      seq.items.push(v);
    }
  }
  return seq;
}

const seq = {
  createNode: createSeq,
  default: true,
  nodeClass: YAMLSeq,
  tag: 'tag:yaml.org,2002:seq',
  resolve: resolveSeq
};

const string = {
  identify: value => typeof value === 'string',
  default: true,
  tag: 'tag:yaml.org,2002:str',
  resolve: resolveString,
  stringify(item, ctx, onComment, onChompKeep) {
    ctx = Object.assign({ actualString: true }, ctx);
    return stringifyString(item, ctx, onComment, onChompKeep);
  },
  options: strOptions
};

const failsafe = [map, seq, string];

/* global BigInt */
const intIdentify$2 = value => typeof value === 'bigint' || Number.isInteger(value);
const intResolve$1 = (src, part, radix) => intOptions.asBigInt ? BigInt(src) : parseInt(part, radix);

function intStringify$1(node, radix, prefix) {
  const { value } = node;
  if (intIdentify$2(value) && value >= 0) return prefix + value.toString(radix);
  return stringifyNumber(node);
}

const nullObj = {
  identify: value => value == null,
  createNode: (schema, value, ctx) => ctx.wrapScalars ? new Scalar(null) : null,
  default: true,
  tag: 'tag:yaml.org,2002:null',
  test: /^(?:~|[Nn]ull|NULL)?$/,
  resolve: () => null,
  options: nullOptions,
  stringify: () => nullOptions.nullStr
};
const boolObj = {
  identify: value => typeof value === 'boolean',
  default: true,
  tag: 'tag:yaml.org,2002:bool',
  test: /^(?:[Tt]rue|TRUE|[Ff]alse|FALSE)$/,
  resolve: str => str[0] === 't' || str[0] === 'T',
  options: boolOptions,
  stringify: ({ value }) => value ? boolOptions.trueStr : boolOptions.falseStr
};
const octObj = {
  identify: value => intIdentify$2(value) && value >= 0,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'OCT',
  test: /^0o([0-7]+)$/,
  resolve: (str, oct) => intResolve$1(str, oct, 8),
  options: intOptions,
  stringify: node => intStringify$1(node, 8, '0o')
};
const intObj = {
  identify: intIdentify$2,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  test: /^[-+]?[0-9]+$/,
  resolve: str => intResolve$1(str, str, 10),
  options: intOptions,
  stringify: stringifyNumber
};
const hexObj = {
  identify: value => intIdentify$2(value) && value >= 0,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'HEX',
  test: /^0x([0-9a-fA-F]+)$/,
  resolve: (str, hex) => intResolve$1(str, hex, 16),
  options: intOptions,
  stringify: node => intStringify$1(node, 16, '0x')
};
const nanObj = {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  test: /^(?:[-+]?\.inf|(\.nan))$/i,
  resolve: (str, nan) => nan ? NaN : str[0] === '-' ? Number.NEGATIVE_INFINITY : Number.POSITIVE_INFINITY,
  stringify: stringifyNumber
};
const expObj = {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  format: 'EXP',
  test: /^[-+]?(?:\.[0-9]+|[0-9]+(?:\.[0-9]*)?)[eE][-+]?[0-9]+$/,
  resolve: str => parseFloat(str),
  stringify: ({ value }) => Number(value).toExponential()
};
const floatObj = {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  test: /^[-+]?(?:\.([0-9]+)|[0-9]+\.([0-9]*))$/,
  resolve(str, frac1, frac2) {
    const frac = frac1 || frac2;
    const node = new Scalar(parseFloat(str));
    if (frac && frac[frac.length - 1] === '0') node.minFractionDigits = frac.length;
    return node;
  },
  stringify: stringifyNumber
};
const core = failsafe.concat([nullObj, boolObj, octObj, intObj, hexObj, nanObj, expObj, floatObj]);

/* global BigInt */
const intIdentify$1 = value => typeof value === 'bigint' || Number.isInteger(value);
const stringifyJSON = ({ value }) => JSON.stringify(value);
const json = [map, seq, {
  identify: value => typeof value === 'string',
  default: true,
  tag: 'tag:yaml.org,2002:str',
  resolve: resolveString,
  stringify: stringifyJSON
}, {
  identify: value => value == null,
  createNode: (schema, value, ctx) => ctx.wrapScalars ? new Scalar(null) : null,
  default: true,
  tag: 'tag:yaml.org,2002:null',
  test: /^null$/,
  resolve: () => null,
  stringify: stringifyJSON
}, {
  identify: value => typeof value === 'boolean',
  default: true,
  tag: 'tag:yaml.org,2002:bool',
  test: /^true|false$/,
  resolve: str => str === 'true',
  stringify: stringifyJSON
}, {
  identify: intIdentify$1,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  test: /^-?(?:0|[1-9][0-9]*)$/,
  resolve: str => intOptions.asBigInt ? BigInt(str) : parseInt(str, 10),
  stringify: ({ value }) => intIdentify$1(value) ? value.toString() : JSON.stringify(value)
}, {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  test: /^-?(?:0|[1-9][0-9]*)(?:\.[0-9]*)?(?:[eE][-+]?[0-9]+)?$/,
  resolve: str => parseFloat(str),
  stringify: stringifyJSON
}];
json.scalarFallback = str => {
  throw new SyntaxError(`Unresolved plain scalar ${JSON.stringify(str)}`);
};

/* global BigInt */
const boolStringify = ({ value }) => value ? boolOptions.trueStr : boolOptions.falseStr;
const intIdentify = value => typeof value === 'bigint' || Number.isInteger(value);

function intResolve(sign, src, radix) {
  let str = src.replace(/_/g, '');
  if (intOptions.asBigInt) {
    switch (radix) {
      case 2:
        str = `0b${str}`;
        break;
      case 8:
        str = `0o${str}`;
        break;
      case 16:
        str = `0x${str}`;
        break;
    }
    const n = BigInt(str);
    return sign === '-' ? BigInt(-1) * n : n;
  }
  const n = parseInt(str, radix);
  return sign === '-' ? -1 * n : n;
}

function intStringify(node, radix, prefix) {
  const { value } = node;
  if (intIdentify(value)) {
    const str = value.toString(radix);
    return value < 0 ? '-' + prefix + str.substr(1) : prefix + str;
  }
  return stringifyNumber(node);
}

const yaml11 = failsafe.concat([{
  identify: value => value == null,
  createNode: (schema, value, ctx) => ctx.wrapScalars ? new Scalar(null) : null,
  default: true,
  tag: 'tag:yaml.org,2002:null',
  test: /^(?:~|[Nn]ull|NULL)?$/,
  resolve: () => null,
  options: nullOptions,
  stringify: () => nullOptions.nullStr
}, {
  identify: value => typeof value === 'boolean',
  default: true,
  tag: 'tag:yaml.org,2002:bool',
  test: /^(?:Y|y|[Yy]es|YES|[Tt]rue|TRUE|[Oo]n|ON)$/,
  resolve: () => true,
  options: boolOptions,
  stringify: boolStringify
}, {
  identify: value => typeof value === 'boolean',
  default: true,
  tag: 'tag:yaml.org,2002:bool',
  test: /^(?:N|n|[Nn]o|NO|[Ff]alse|FALSE|[Oo]ff|OFF)$/i,
  resolve: () => false,
  options: boolOptions,
  stringify: boolStringify
}, {
  identify: intIdentify,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'BIN',
  test: /^([-+]?)0b([0-1_]+)$/,
  resolve: (str, sign, bin) => intResolve(sign, bin, 2),
  stringify: node => intStringify(node, 2, '0b')
}, {
  identify: intIdentify,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'OCT',
  test: /^([-+]?)0([0-7_]+)$/,
  resolve: (str, sign, oct) => intResolve(sign, oct, 8),
  stringify: node => intStringify(node, 8, '0')
}, {
  identify: intIdentify,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  test: /^([-+]?)([0-9][0-9_]*)$/,
  resolve: (str, sign, abs) => intResolve(sign, abs, 10),
  stringify: stringifyNumber
}, {
  identify: intIdentify,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'HEX',
  test: /^([-+]?)0x([0-9a-fA-F_]+)$/,
  resolve: (str, sign, hex) => intResolve(sign, hex, 16),
  stringify: node => intStringify(node, 16, '0x')
}, {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  test: /^(?:[-+]?\.inf|(\.nan))$/i,
  resolve: (str, nan) => nan ? NaN : str[0] === '-' ? Number.NEGATIVE_INFINITY : Number.POSITIVE_INFINITY,
  stringify: stringifyNumber
}, {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  format: 'EXP',
  test: /^[-+]?([0-9][0-9_]*)?(\.[0-9_]*)?[eE][-+]?[0-9]+$/,
  resolve: str => parseFloat(str.replace(/_/g, '')),
  stringify: ({ value }) => Number(value).toExponential()
}, {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  test: /^[-+]?(?:[0-9][0-9_]*)?\.([0-9_]*)$/,
  resolve(str, frac) {
    const node = new Scalar(parseFloat(str.replace(/_/g, '')));
    if (frac) {
      const f = frac.replace(/_/g, '');
      if (f[f.length - 1] === '0') node.minFractionDigits = f.length;
    }
    return node;
  },
  stringify: stringifyNumber
}], binary, omap, pairs, set, intTime, floatTime, timestamp);

const schemas = {
  core,
  failsafe,
  json,
  yaml11
};
const tags = {
  binary,
  bool: boolObj,
  float: floatObj,
  floatExp: expObj,
  floatNaN: nanObj,
  floatTime,
  int: intObj,
  intHex: hexObj,
  intOct: octObj,
  intTime,
  map,
  null: nullObj,
  omap,
  pairs,
  seq,
  set,
  timestamp
};

function findTagObject(value, tagName, tags) {
  if (tagName) {
    const match = tags.filter(t => t.tag === tagName);
    const tagObj = match.find(t => !t.format) || match[0];
    if (!tagObj) throw new Error(`Tag ${tagName} not found`);
    return tagObj;
  }

  // TODO: deprecate/remove class check
  return tags.find(t => (t.identify && t.identify(value) || t.class && value instanceof t.class) && !t.format);
}

function createNode(value, tagName, ctx) {
  if (value instanceof Node) return value;
  const { defaultPrefix, onTagObj, prevObjects, schema, wrapScalars } = ctx;
  if (tagName && tagName.startsWith('!!')) tagName = defaultPrefix + tagName.slice(2);
  let tagObj = findTagObject(value, tagName, schema.tags);
  if (!tagObj) {
    if (typeof value.toJSON === 'function') value = value.toJSON();
    if (!value || typeof value !== 'object') return wrapScalars ? new Scalar(value) : value;
    tagObj = value instanceof Map ? map : value[Symbol.iterator] ? seq : map;
  }
  if (onTagObj) {
    onTagObj(tagObj);
    delete ctx.onTagObj;
  }

  // Detect duplicate references to the same object & use Alias nodes for all
  // after first. The `obj` wrapper allows for circular references to resolve.
  const obj = {
    value: undefined,
    node: undefined
  };
  if (value && typeof value === 'object' && prevObjects) {
    const prev = prevObjects.get(value);
    if (prev) {
      const alias = new Alias(prev); // leaves source dirty; must be cleaned by caller
      ctx.aliasNodes.push(alias); // defined along with prevObjects
      return alias;
    }
    obj.value = value;
    prevObjects.set(value, obj);
  }
  obj.node = tagObj.createNode ? tagObj.createNode(ctx.schema, value, ctx) : wrapScalars ? new Scalar(value) : value;
  if (tagName && obj.node instanceof Node) obj.node.tag = tagName;
  return obj.node;
}

function getSchemaTags(schemas, knownTags, customTags, schemaId) {
  let tags = schemas[schemaId.replace(/\W/g, '')]; // 'yaml-1.1' -> 'yaml11'
  if (!tags) {
    const keys = Object.keys(schemas).map(key => JSON.stringify(key)).join(', ');
    throw new Error(`Unknown schema "${schemaId}"; use one of ${keys}`);
  }
  if (Array.isArray(customTags)) {
    for (const tag of customTags) tags = tags.concat(tag);
  } else if (typeof customTags === 'function') {
    tags = customTags(tags.slice());
  }
  for (let i = 0; i < tags.length; ++i) {
    const tag = tags[i];
    if (typeof tag === 'string') {
      const tagObj = knownTags[tag];
      if (!tagObj) {
        const keys = Object.keys(knownTags).map(key => JSON.stringify(key)).join(', ');
        throw new Error(`Unknown custom tag "${tag}"; use one of ${keys}`);
      }
      tags[i] = tagObj;
    }
  }
  return tags;
}

const sortMapEntriesByKey = (a, b) => a.key < b.key ? -1 : a.key > b.key ? 1 : 0;
class Schema {
  // TODO: remove in v2

  constructor({
    customTags,
    merge,
    schema,
    sortMapEntries,
    tags: deprecatedCustomTags
  }) {
    this.merge = !!merge;
    this.name = schema;
    this.sortMapEntries = sortMapEntries === true ? sortMapEntriesByKey : sortMapEntries || null;
    if (!customTags && deprecatedCustomTags) warnOptionDeprecation('tags', 'customTags');
    this.tags = getSchemaTags(schemas, tags, customTags || deprecatedCustomTags, schema);
  }

  createNode(value, wrapScalars, tagName, ctx) {
    const baseCtx = {
      defaultPrefix: Schema.defaultPrefix,
      schema: this,
      wrapScalars
    };
    const createCtx = ctx ? Object.assign(ctx, baseCtx) : baseCtx;
    return createNode(value, tagName, createCtx);
|
||||
}
|
||||
createPair(key, value, ctx) {
|
||||
if (!ctx) ctx = {
|
||||
wrapScalars: true
|
||||
};
|
||||
const k = this.createNode(key, ctx.wrapScalars, null, ctx);
|
||||
const v = this.createNode(value, ctx.wrapScalars, null, ctx);
|
||||
return new Pair(k, v);
|
||||
}
|
||||
}
|
||||
_defineProperty(Schema, "defaultPrefix", defaultTagPrefix);
|
||||
// TODO: remove in v2
|
||||
_defineProperty(Schema, "defaultTags", defaultTags);
|
||||
|
||||
export { Schema as S };
|
||||
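The `sortMapEntries: true` option above resolves to the `sortMapEntriesByKey` comparator. As a standalone illustration (the `entries` data is invented for the example, not part of the library), it orders map entries by their `key` property:

```javascript
// Same comparator shape as sortMapEntriesByKey above: compare entries by `key`.
const sortMapEntriesByKey = (a, b) => a.key < b.key ? -1 : a.key > b.key ? 1 : 0;

const entries = [{ key: 'b' }, { key: 'a' }, { key: 'c' }];
entries.sort(sortMapEntriesByKey);
console.log(entries.map(e => e.key)); // → [ 'a', 'b', 'c' ]
```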
692
node_modules/cosmiconfig/node_modules/yaml/browser/dist/index.js
generated
vendored
Normal file
@@ -0,0 +1,692 @@
import { parse as parse$1 } from './parse-cst.js';
import { d as defaultTagPrefix, _ as _defineProperty, T as Type, Y as YAMLSyntaxError, a as YAMLWarning, b as YAMLSemanticError, c as YAMLError } from './PlainValue-183afbad.js';
import { b as binaryOptions, a as boolOptions, i as intOptions, n as nullOptions, s as strOptions, N as Node, P as Pair, S as Scalar, c as stringifyString, A as Alias, Y as YAMLSeq, d as YAMLMap, M as Merge, C as Collection, r as resolveNode, e as isEmptyPath, t as toJSON, f as addComment } from './resolveSeq-67caf78a.js';
import { S as Schema } from './Schema-9530c078.js';
import { w as warn } from './warnings-5e4358fe.js';

const defaultOptions = {
  anchorPrefix: 'a',
  customTags: null,
  indent: 2,
  indentSeq: true,
  keepCstNodes: false,
  keepNodeTypes: true,
  keepBlobsInJSON: true,
  mapAsMap: false,
  maxAliasCount: 100,
  prettyErrors: false,
  // TODO Set true in v2
  simpleKeys: false,
  version: '1.2'
};
const scalarOptions = {
  get binary() {
    return binaryOptions;
  },

  set binary(opt) {
    Object.assign(binaryOptions, opt);
  },

  get bool() {
    return boolOptions;
  },

  set bool(opt) {
    Object.assign(boolOptions, opt);
  },

  get int() {
    return intOptions;
  },

  set int(opt) {
    Object.assign(intOptions, opt);
  },

  get null() {
    return nullOptions;
  },

  set null(opt) {
    Object.assign(nullOptions, opt);
  },

  get str() {
    return strOptions;
  },

  set str(opt) {
    Object.assign(strOptions, opt);
  }

};
const documentOptions = {
  '1.0': {
    schema: 'yaml-1.1',
    merge: true,
    tagPrefixes: [{
      handle: '!',
      prefix: defaultTagPrefix
    }, {
      handle: '!!',
      prefix: 'tag:private.yaml.org,2002:'
    }]
  },
  1.1: {
    schema: 'yaml-1.1',
    merge: true,
    tagPrefixes: [{
      handle: '!',
      prefix: '!'
    }, {
      handle: '!!',
      prefix: defaultTagPrefix
    }]
  },
  1.2: {
    schema: 'core',
    merge: false,
    tagPrefixes: [{
      handle: '!',
      prefix: '!'
    }, {
      handle: '!!',
      prefix: defaultTagPrefix
    }]
  }
};

function stringifyTag(doc, tag) {
  if ((doc.version || doc.options.version) === '1.0') {
    const priv = tag.match(/^tag:private\.yaml\.org,2002:([^:/]+)$/);
    if (priv) return '!' + priv[1];
    const vocab = tag.match(/^tag:([a-zA-Z0-9-]+)\.yaml\.org,2002:(.*)/);
    return vocab ? `!${vocab[1]}/${vocab[2]}` : `!${tag.replace(/^tag:/, '')}`;
  }

  let p = doc.tagPrefixes.find(p => tag.indexOf(p.prefix) === 0);

  if (!p) {
    const dtp = doc.getDefaults().tagPrefixes;
    p = dtp && dtp.find(p => tag.indexOf(p.prefix) === 0);
  }

  if (!p) return tag[0] === '!' ? tag : `!<${tag}>`;
  const suffix = tag.substr(p.prefix.length).replace(/[!,[\]{}]/g, ch => ({
    '!': '%21',
    ',': '%2C',
    '[': '%5B',
    ']': '%5D',
    '{': '%7B',
    '}': '%7D'
  })[ch]);
  return p.handle + suffix;
}
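The final branch of `stringifyTag` above percent-encodes characters that are special in YAML tag shorthand. A standalone sketch of just that escaping step (`escapeTagSuffix` is an illustrative name, not a function of the library):

```javascript
// Same replacement table as in stringifyTag above: percent-encode the
// characters that may not appear verbatim in a tag shorthand suffix.
const escapeTagSuffix = suffix => suffix.replace(/[!,[\]{}]/g, ch => ({
  '!': '%21',
  ',': '%2C',
  '[': '%5B',
  ']': '%5D',
  '{': '%7B',
  '}': '%7D'
})[ch]);

console.log(escapeTagSuffix('foo,bar')); // → foo%2Cbar
console.log(escapeTagSuffix('a[b]'));    // → a%5Bb%5D
```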

function getTagObject(tags, item) {
  if (item instanceof Alias) return Alias;

  if (item.tag) {
    const match = tags.filter(t => t.tag === item.tag);
    if (match.length > 0) return match.find(t => t.format === item.format) || match[0];
  }

  let tagObj, obj;

  if (item instanceof Scalar) {
    obj = item.value;
    // TODO: deprecate/remove class check
    const match = tags.filter(t => t.identify && t.identify(obj) || t.class && obj instanceof t.class);
    tagObj = match.find(t => t.format === item.format) || match.find(t => !t.format);
  } else {
    obj = item;
    tagObj = tags.find(t => t.nodeClass && obj instanceof t.nodeClass);
  }

  if (!tagObj) {
    const name = obj && obj.constructor ? obj.constructor.name : typeof obj;
    throw new Error(`Tag not resolved for ${name} value`);
  }

  return tagObj;
}

// needs to be called before value stringifier to allow for circular anchor refs
function stringifyProps(node, tagObj, {
  anchors,
  doc
}) {
  const props = [];
  const anchor = doc.anchors.getName(node);

  if (anchor) {
    anchors[anchor] = node;
    props.push(`&${anchor}`);
  }

  if (node.tag) {
    props.push(stringifyTag(doc, node.tag));
  } else if (!tagObj.default) {
    props.push(stringifyTag(doc, tagObj.tag));
  }

  return props.join(' ');
}

function stringify$1(item, ctx, onComment, onChompKeep) {
  const {
    anchors,
    schema
  } = ctx.doc;
  let tagObj;

  if (!(item instanceof Node)) {
    const createCtx = {
      aliasNodes: [],
      onTagObj: o => tagObj = o,
      prevObjects: new Map()
    };
    item = schema.createNode(item, true, null, createCtx);

    for (const alias of createCtx.aliasNodes) {
      alias.source = alias.source.node;
      let name = anchors.getName(alias.source);

      if (!name) {
        name = anchors.newName();
        anchors.map[name] = alias.source;
      }
    }
  }

  if (item instanceof Pair) return item.toString(ctx, onComment, onChompKeep);
  if (!tagObj) tagObj = getTagObject(schema.tags, item);
  const props = stringifyProps(item, tagObj, ctx);
  if (props.length > 0) ctx.indentAtStart = (ctx.indentAtStart || 0) + props.length + 1;
  const str = typeof tagObj.stringify === 'function' ? tagObj.stringify(item, ctx, onComment, onChompKeep) : item instanceof Scalar ? stringifyString(item, ctx, onComment, onChompKeep) : item.toString(ctx, onComment, onChompKeep);
  if (!props) return str;
  return item instanceof Scalar || str[0] === '{' || str[0] === '[' ? `${props} ${str}` : `${props}\n${ctx.indent}${str}`;
}

class Anchors {
  static validAnchorNode(node) {
    return node instanceof Scalar || node instanceof YAMLSeq || node instanceof YAMLMap;
  }

  constructor(prefix) {
    _defineProperty(this, "map", Object.create(null));

    this.prefix = prefix;
  }

  createAlias(node, name) {
    this.setAnchor(node, name);
    return new Alias(node);
  }

  createMergePair(...sources) {
    const merge = new Merge();
    merge.value.items = sources.map(s => {
      if (s instanceof Alias) {
        if (s.source instanceof YAMLMap) return s;
      } else if (s instanceof YAMLMap) {
        return this.createAlias(s);
      }

      throw new Error('Merge sources must be Map nodes or their Aliases');
    });
    return merge;
  }

  getName(node) {
    const {
      map
    } = this;
    return Object.keys(map).find(a => map[a] === node);
  }

  getNames() {
    return Object.keys(this.map);
  }

  getNode(name) {
    return this.map[name];
  }

  newName(prefix) {
    if (!prefix) prefix = this.prefix;
    const names = Object.keys(this.map);

    for (let i = 1; true; ++i) {
      const name = `${prefix}${i}`;
      if (!names.includes(name)) return name;
    }
  }

  // During parsing, map & aliases contain CST nodes
  resolveNodes() {
    const {
      map,
      _cstAliases
    } = this;
    Object.keys(map).forEach(a => {
      map[a] = map[a].resolved;
    });

    _cstAliases.forEach(a => {
      a.source = a.source.resolved;
    });

    delete this._cstAliases;
  }

  setAnchor(node, name) {
    if (node != null && !Anchors.validAnchorNode(node)) {
      throw new Error('Anchors may only be set for Scalar, Seq and Map nodes');
    }

    if (name && /[\x00-\x19\s,[\]{}]/.test(name)) {
      throw new Error('Anchor names must not contain whitespace or control characters');
    }

    const {
      map
    } = this;
    const prev = node && Object.keys(map).find(a => map[a] === node);

    if (prev) {
      if (!name) {
        return prev;
      } else if (prev !== name) {
        delete map[prev];
        map[name] = node;
      }
    } else {
      if (!name) {
        if (!node) return null;
        name = this.newName();
      }

      map[name] = node;
    }

    return name;
  }

}
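`Anchors.newName` above generates a fresh anchor name by appending an increasing counter to the prefix until the candidate is unused. A standalone sketch of that loop (the free function and `taken` array are illustrative, not library API):

```javascript
// Same search as Anchors.newName above: try prefix1, prefix2, ... until
// a name not already in use is found.
function newName(prefix, taken) {
  for (let i = 1; true; ++i) {
    const name = `${prefix}${i}`;
    if (!taken.includes(name)) return name;
  }
}

console.log(newName('a', ['a1', 'a2'])); // → a3
console.log(newName('a', []));           // → a1
```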

const visit = (node, tags) => {
  if (node && typeof node === 'object') {
    const {
      tag
    } = node;

    if (node instanceof Collection) {
      if (tag) tags[tag] = true;
      node.items.forEach(n => visit(n, tags));
    } else if (node instanceof Pair) {
      visit(node.key, tags);
      visit(node.value, tags);
    } else if (node instanceof Scalar) {
      if (tag) tags[tag] = true;
    }
  }

  return tags;
};

const listTagNames = node => Object.keys(visit(node, {}));

function parseContents(doc, contents) {
  const comments = {
    before: [],
    after: []
  };
  let body = undefined;
  let spaceBefore = false;

  for (const node of contents) {
    if (node.valueRange) {
      if (body !== undefined) {
        const msg = 'Document contains trailing content not separated by a ... or --- line';
        doc.errors.push(new YAMLSyntaxError(node, msg));
        break;
      }

      const res = resolveNode(doc, node);

      if (spaceBefore) {
        res.spaceBefore = true;
        spaceBefore = false;
      }

      body = res;
    } else if (node.comment !== null) {
      const cc = body === undefined ? comments.before : comments.after;
      cc.push(node.comment);
    } else if (node.type === Type.BLANK_LINE) {
      spaceBefore = true;

      if (body === undefined && comments.before.length > 0 && !doc.commentBefore) {
        // space-separated comments at start are parsed as document comments
        doc.commentBefore = comments.before.join('\n');
        comments.before = [];
      }
    }
  }

  doc.contents = body || null;

  if (!body) {
    doc.comment = comments.before.concat(comments.after).join('\n') || null;
  } else {
    const cb = comments.before.join('\n');

    if (cb) {
      const cbNode = body instanceof Collection && body.items[0] ? body.items[0] : body;
      cbNode.commentBefore = cbNode.commentBefore ? `${cb}\n${cbNode.commentBefore}` : cb;
    }

    doc.comment = comments.after.join('\n') || null;
  }
}

function resolveTagDirective({
  tagPrefixes
}, directive) {
  const [handle, prefix] = directive.parameters;

  if (!handle || !prefix) {
    const msg = 'Insufficient parameters given for %TAG directive';
    throw new YAMLSemanticError(directive, msg);
  }

  if (tagPrefixes.some(p => p.handle === handle)) {
    const msg = 'The %TAG directive must only be given at most once per handle in the same document.';
    throw new YAMLSemanticError(directive, msg);
  }

  return {
    handle,
    prefix
  };
}

function resolveYamlDirective(doc, directive) {
  let [version] = directive.parameters;
  if (directive.name === 'YAML:1.0') version = '1.0';

  if (!version) {
    const msg = 'Insufficient parameters given for %YAML directive';
    throw new YAMLSemanticError(directive, msg);
  }

  if (!documentOptions[version]) {
    const v0 = doc.version || doc.options.version;
    const msg = `Document will be parsed as YAML ${v0} rather than YAML ${version}`;
    doc.warnings.push(new YAMLWarning(directive, msg));
  }

  return version;
}

function parseDirectives(doc, directives, prevDoc) {
  const directiveComments = [];
  let hasDirectives = false;

  for (const directive of directives) {
    const {
      comment,
      name
    } = directive;

    switch (name) {
      case 'TAG':
        try {
          doc.tagPrefixes.push(resolveTagDirective(doc, directive));
        } catch (error) {
          doc.errors.push(error);
        }

        hasDirectives = true;
        break;

      case 'YAML':
      case 'YAML:1.0':
        if (doc.version) {
          const msg = 'The %YAML directive must only be given at most once per document.';
          doc.errors.push(new YAMLSemanticError(directive, msg));
        }

        try {
          doc.version = resolveYamlDirective(doc, directive);
        } catch (error) {
          doc.errors.push(error);
        }

        hasDirectives = true;
        break;

      default:
        if (name) {
          const msg = `YAML only supports %TAG and %YAML directives, and not %${name}`;
          doc.warnings.push(new YAMLWarning(directive, msg));
        }

    }

    if (comment) directiveComments.push(comment);
  }

  if (prevDoc && !hasDirectives && '1.1' === (doc.version || prevDoc.version || doc.options.version)) {
    const copyTagPrefix = ({
      handle,
      prefix
    }) => ({
      handle,
      prefix
    });

    doc.tagPrefixes = prevDoc.tagPrefixes.map(copyTagPrefix);
    doc.version = prevDoc.version;
  }

  doc.commentBefore = directiveComments.join('\n') || null;
}

function assertCollection(contents) {
  if (contents instanceof Collection) return true;
  throw new Error('Expected a YAML collection as document contents');
}

class Document$1 {
  constructor(options) {
    this.anchors = new Anchors(options.anchorPrefix);
    this.commentBefore = null;
    this.comment = null;
    this.contents = null;
    this.directivesEndMarker = null;
    this.errors = [];
    this.options = options;
    this.schema = null;
    this.tagPrefixes = [];
    this.version = null;
    this.warnings = [];
  }

  add(value) {
    assertCollection(this.contents);
    return this.contents.add(value);
  }

  addIn(path, value) {
    assertCollection(this.contents);
    this.contents.addIn(path, value);
  }

  delete(key) {
    assertCollection(this.contents);
    return this.contents.delete(key);
  }

  deleteIn(path) {
    if (isEmptyPath(path)) {
      if (this.contents == null) return false;
      this.contents = null;
      return true;
    }

    assertCollection(this.contents);
    return this.contents.deleteIn(path);
  }

  getDefaults() {
    return Document$1.defaults[this.version] || Document$1.defaults[this.options.version] || {};
  }

  get(key, keepScalar) {
    return this.contents instanceof Collection ? this.contents.get(key, keepScalar) : undefined;
  }

  getIn(path, keepScalar) {
    if (isEmptyPath(path)) return !keepScalar && this.contents instanceof Scalar ? this.contents.value : this.contents;
    return this.contents instanceof Collection ? this.contents.getIn(path, keepScalar) : undefined;
  }

  has(key) {
    return this.contents instanceof Collection ? this.contents.has(key) : false;
  }

  hasIn(path) {
    if (isEmptyPath(path)) return this.contents !== undefined;
    return this.contents instanceof Collection ? this.contents.hasIn(path) : false;
  }

  set(key, value) {
    assertCollection(this.contents);
    this.contents.set(key, value);
  }

  setIn(path, value) {
    if (isEmptyPath(path)) this.contents = value;else {
      assertCollection(this.contents);
      this.contents.setIn(path, value);
    }
  }

  setSchema(id, customTags) {
    if (!id && !customTags && this.schema) return;
    if (typeof id === 'number') id = id.toFixed(1);

    if (id === '1.0' || id === '1.1' || id === '1.2') {
      if (this.version) this.version = id;else this.options.version = id;
      delete this.options.schema;
    } else if (id && typeof id === 'string') {
      this.options.schema = id;
    }

    if (Array.isArray(customTags)) this.options.customTags = customTags;
    const opt = Object.assign({}, this.getDefaults(), this.options);
    this.schema = new Schema(opt);
  }

  parse(node, prevDoc) {
    if (this.options.keepCstNodes) this.cstNode = node;
    if (this.options.keepNodeTypes) this.type = 'DOCUMENT';
    const {
      directives = [],
      contents = [],
      directivesEndMarker,
      error,
      valueRange
    } = node;

    if (error) {
      if (!error.source) error.source = this;
      this.errors.push(error);
    }

    parseDirectives(this, directives, prevDoc);
    if (directivesEndMarker) this.directivesEndMarker = true;
    this.range = valueRange ? [valueRange.start, valueRange.end] : null;
    this.setSchema();
    this.anchors._cstAliases = [];
    parseContents(this, contents);
    this.anchors.resolveNodes();

    if (this.options.prettyErrors) {
      for (const error of this.errors) if (error instanceof YAMLError) error.makePretty();

      for (const warn of this.warnings) if (warn instanceof YAMLError) warn.makePretty();
    }

    return this;
  }

  listNonDefaultTags() {
    return listTagNames(this.contents).filter(t => t.indexOf(Schema.defaultPrefix) !== 0);
  }

  setTagPrefix(handle, prefix) {
    if (handle[0] !== '!' || handle[handle.length - 1] !== '!') throw new Error('Handle must start and end with !');

    if (prefix) {
      const prev = this.tagPrefixes.find(p => p.handle === handle);
      if (prev) prev.prefix = prefix;else this.tagPrefixes.push({
        handle,
        prefix
      });
    } else {
      this.tagPrefixes = this.tagPrefixes.filter(p => p.handle !== handle);
    }
  }

  toJSON(arg, onAnchor) {
    const {
      keepBlobsInJSON,
      mapAsMap,
      maxAliasCount
    } = this.options;
    const keep = keepBlobsInJSON && (typeof arg !== 'string' || !(this.contents instanceof Scalar));
    const ctx = {
      doc: this,
      indentStep: ' ',
      keep,
      mapAsMap: keep && !!mapAsMap,
      maxAliasCount,
      stringify: stringify$1 // Requiring directly in Pair would create circular dependencies

    };
    const anchorNames = Object.keys(this.anchors.map);
    if (anchorNames.length > 0) ctx.anchors = new Map(anchorNames.map(name => [this.anchors.map[name], {
      alias: [],
      aliasCount: 0,
      count: 1
    }]));
    const res = toJSON(this.contents, arg, ctx);
    if (typeof onAnchor === 'function' && ctx.anchors) for (const {
      count,
      res
    } of ctx.anchors.values()) onAnchor(res, count);
    return res;
  }

  toString() {
    if (this.errors.length > 0) throw new Error('Document with errors cannot be stringified');
    const indentSize = this.options.indent;

    if (!Number.isInteger(indentSize) || indentSize <= 0) {
      const s = JSON.stringify(indentSize);
      throw new Error(`"indent" option must be a positive integer, not ${s}`);
    }

    this.setSchema();
    const lines = [];
    let hasDirectives = false;

    if (this.version) {
      let vd = '%YAML 1.2';

      if (this.schema.name === 'yaml-1.1') {
        if (this.version === '1.0') vd = '%YAML:1.0';else if (this.version === '1.1') vd = '%YAML 1.1';
      }

      lines.push(vd);
      hasDirectives = true;
    }

    const tagNames = this.listNonDefaultTags();
    this.tagPrefixes.forEach(({
      handle,
      prefix
    }) => {
      if (tagNames.some(t => t.indexOf(prefix) === 0)) {
        lines.push(`%TAG ${handle} ${prefix}`);
        hasDirectives = true;
      }
    });
    if (hasDirectives || this.directivesEndMarker) lines.push('---');

    if (this.commentBefore) {
      if (hasDirectives || !this.directivesEndMarker) lines.unshift('');
      lines.unshift(this.commentBefore.replace(/^/gm, '#'));
    }

    const ctx = {
      anchors: Object.create(null),
      doc: this,
      indent: '',
      indentStep: ' '.repeat(indentSize),
      stringify: stringify$1 // Requiring directly in nodes would create circular dependencies

    };
    let chompKeep = false;
    let contentComment = null;

    if (this.contents) {
      if (this.contents instanceof Node) {
        if (this.contents.spaceBefore && (hasDirectives || this.directivesEndMarker)) lines.push('');
        if (this.contents.commentBefore) lines.push(this.contents.commentBefore.replace(/^/gm, '#'));
        // top-level block scalars need to be indented if followed by a comment
        ctx.forceBlockIndent = !!this.comment;
        contentComment = this.contents.comment;
      }

      const onChompKeep = contentComment ? null : () => chompKeep = true;
      const body = stringify$1(this.contents, ctx, () => contentComment = null, onChompKeep);
      lines.push(addComment(body, '', contentComment));
    } else if (this.contents !== undefined) {
      lines.push(stringify$1(this.contents, ctx));
    }

    if (this.comment) {
      if ((!chompKeep || contentComment) && lines[lines.length - 1] !== '') lines.push('');
      lines.push(this.comment.replace(/^/gm, '#'));
    }

    return lines.join('\n') + '\n';
  }

}

_defineProperty(Document$1, "defaults", documentOptions);

function createNode(value, wrapScalars = true, tag) {
  if (tag === undefined && typeof wrapScalars === 'string') {
    tag = wrapScalars;
    wrapScalars = true;
  }

  const options = Object.assign({}, Document$1.defaults[defaultOptions.version], defaultOptions);
  const schema = new Schema(options);
  return schema.createNode(value, wrapScalars, tag);
}
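The guard at the top of `createNode` above lets `createNode(value, tag)` stand in for `createNode(value, true, tag)`. A standalone sketch of just that argument normalization (`normalizeArgs` is an illustrative name, not part of the library):

```javascript
// Same overload handling as createNode above: if the second positional
// argument is a string, treat it as the tag and default wrapScalars to true.
function normalizeArgs(wrapScalars = true, tag) {
  if (tag === undefined && typeof wrapScalars === 'string') {
    tag = wrapScalars;
    wrapScalars = true;
  }

  return { wrapScalars, tag };
}

console.log(normalizeArgs('!foo')); // → { wrapScalars: true, tag: '!foo' }
console.log(normalizeArgs(false));  // → { wrapScalars: false, tag: undefined }
```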
class Document extends Document$1 {
  constructor(options) {
    super(Object.assign({}, defaultOptions, options));
  }

}

function parseAllDocuments(src, options) {
  const stream = [];
  let prev;

  for (const cstDoc of parse$1(src)) {
    const doc = new Document(options);
    doc.parse(cstDoc, prev);
    stream.push(doc);
    prev = doc;
  }

  return stream;
}

function parseDocument(src, options) {
  const cst = parse$1(src);
  const doc = new Document(options).parse(cst[0]);

  if (cst.length > 1) {
    const errMsg = 'Source contains multiple documents; please use YAML.parseAllDocuments()';
    doc.errors.unshift(new YAMLSemanticError(cst[1], errMsg));
  }

  return doc;
}

function parse(src, options) {
  const doc = parseDocument(src, options);
  doc.warnings.forEach(warning => warn(warning));
  if (doc.errors.length > 0) throw doc.errors[0];
  return doc.toJSON();
}

function stringify(value, options) {
  const doc = new Document(options);
  doc.contents = value;
  return String(doc);
}

const YAML = {
  createNode,
  defaultOptions,
  Document,
  parse,
  parseAllDocuments,
  parseCST: parse$1,
  parseDocument,
  scalarOptions,
  stringify
};

export { YAML };
3
node_modules/cosmiconfig/node_modules/yaml/browser/dist/legacy-exports.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
export { b as binary, f as floatTime, i as intTime, o as omap, p as pairs, s as set, t as timestamp, c as warnFileDeprecation } from './warnings-5e4358fe.js';
import './PlainValue-183afbad.js';
import './resolveSeq-67caf78a.js';
1
node_modules/cosmiconfig/node_modules/yaml/browser/dist/package.json
generated
vendored
Normal file
@@ -0,0 +1 @@
{ "type": "module" }
1505
node_modules/cosmiconfig/node_modules/yaml/browser/dist/parse-cst.js
generated
vendored
Normal file
File diff suppressed because it is too large
1835
node_modules/cosmiconfig/node_modules/yaml/browser/dist/resolveSeq-67caf78a.js
generated
vendored
Normal file
File diff suppressed because it is too large
4
node_modules/cosmiconfig/node_modules/yaml/browser/dist/types.js
generated
vendored
Normal file
@@ -0,0 +1,4 @@
export { A as Alias, C as Collection, M as Merge, N as Node, P as Pair, S as Scalar, d as YAMLMap, Y as YAMLSeq, b as binaryOptions, a as boolOptions, i as intOptions, n as nullOptions, s as strOptions } from './resolveSeq-67caf78a.js';
export { S as Schema } from './Schema-9530c078.js';
import './PlainValue-183afbad.js';
import './warnings-5e4358fe.js';
2
node_modules/cosmiconfig/node_modules/yaml/browser/dist/util.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
export { l as findPair, g as parseMap, h as parseSeq, k as stringifyNumber, c as stringifyString, t as toJSON } from './resolveSeq-67caf78a.js';
export { T as Type, c as YAMLError, f as YAMLReferenceError, b as YAMLSemanticError, Y as YAMLSyntaxError, a as YAMLWarning } from './PlainValue-183afbad.js';
348
node_modules/cosmiconfig/node_modules/yaml/browser/dist/warnings-5e4358fe.js
generated
vendored
Normal file
@@ -0,0 +1,348 @@
import { f as YAMLReferenceError, T as Type, b as YAMLSemanticError, _ as _defineProperty } from './PlainValue-183afbad.js';
import { j as resolveString, b as binaryOptions, c as stringifyString, h as resolveSeq, P as Pair, d as YAMLMap, Y as YAMLSeq, t as toJSON, S as Scalar, l as findPair, g as resolveMap, k as stringifyNumber } from './resolveSeq-67caf78a.js';

/* global atob, btoa, Buffer */
const binary = {
  identify: value => value instanceof Uint8Array,
  // Buffer inherits from Uint8Array
  default: false,
  tag: 'tag:yaml.org,2002:binary',

  /**
   * Returns a Buffer in node and an Uint8Array in browsers
   *
   * To use the resulting buffer as an image, you'll want to do something like:
   *
   *   const blob = new Blob([buffer], { type: 'image/jpeg' })
   *   document.querySelector('#photo').src = URL.createObjectURL(blob)
   */
  resolve: (doc, node) => {
    const src = resolveString(doc, node);

    if (typeof Buffer === 'function') {
      return Buffer.from(src, 'base64');
    } else if (typeof atob === 'function') {
      // On IE 11, atob() can't handle newlines
      const str = atob(src.replace(/[\n\r]/g, ''));
      const buffer = new Uint8Array(str.length);

      for (let i = 0; i < str.length; ++i) buffer[i] = str.charCodeAt(i);

      return buffer;
    } else {
      const msg = 'This environment does not support reading binary tags; either Buffer or atob is required';
      doc.errors.push(new YAMLReferenceError(node, msg));
      return null;
    }
  },
  options: binaryOptions,
  stringify: ({
    comment,
    type,
    value
  }, ctx, onComment, onChompKeep) => {
    let src;

    if (typeof Buffer === 'function') {
      src = value instanceof Buffer ? value.toString('base64') : Buffer.from(value.buffer).toString('base64');
    } else if (typeof btoa === 'function') {
      let s = '';

      for (let i = 0; i < value.length; ++i) s += String.fromCharCode(value[i]);

      src = btoa(s);
    } else {
      throw new Error('This environment does not support writing binary tags; either Buffer or btoa is required');
    }

    if (!type) type = binaryOptions.defaultType;

    if (type === Type.QUOTE_DOUBLE) {
      value = src;
    } else {
      const {
        lineWidth
      } = binaryOptions;
      const n = Math.ceil(src.length / lineWidth);
      const lines = new Array(n);

      for (let i = 0, o = 0; i < n; ++i, o += lineWidth) {
        lines[i] = src.substr(o, lineWidth);
      }

      value = lines.join(type === Type.BLOCK_LITERAL ? '\n' : ' ');
    }

    return stringifyString({
      comment,
      type,
      value
    }, ctx, onComment, onChompKeep);
  }
};
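When the binary stringifier above emits a block scalar, it splits the base64 source into fixed-width lines. A standalone sketch of that wrapping step (`wrapBase64` is an illustrative name, not a library export):

```javascript
// Same chunking as the binary stringifier above: cut the base64 source
// into lines of at most lineWidth characters.
function wrapBase64(src, lineWidth) {
  const n = Math.ceil(src.length / lineWidth);
  const lines = new Array(n);

  for (let i = 0, o = 0; i < n; ++i, o += lineWidth) {
    lines[i] = src.substr(o, lineWidth);
  }

  return lines;
}

console.log(wrapBase64('abcdefgh', 3)); // → [ 'abc', 'def', 'gh' ]
```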
|
||||
function parsePairs(doc, cst) {
  const seq = resolveSeq(doc, cst);
  for (let i = 0; i < seq.items.length; ++i) {
    let item = seq.items[i];
    if (item instanceof Pair) continue;
    else if (item instanceof YAMLMap) {
      if (item.items.length > 1) {
        const msg = 'Each pair must have its own sequence indicator';
        throw new YAMLSemanticError(cst, msg);
      }
      const pair = item.items[0] || new Pair();
      if (item.commentBefore) pair.commentBefore = pair.commentBefore ? `${item.commentBefore}\n${pair.commentBefore}` : item.commentBefore;
      if (item.comment) pair.comment = pair.comment ? `${item.comment}\n${pair.comment}` : item.comment;
      item = pair;
    }
    seq.items[i] = item instanceof Pair ? item : new Pair(item);
  }
  return seq;
}

function createPairs(schema, iterable, ctx) {
  const pairs = new YAMLSeq(schema);
  pairs.tag = 'tag:yaml.org,2002:pairs';
  for (const it of iterable) {
    let key, value;
    if (Array.isArray(it)) {
      if (it.length === 2) {
        key = it[0];
        value = it[1];
      } else throw new TypeError(`Expected [key, value] tuple: ${it}`);
    } else if (it && it instanceof Object) {
      const keys = Object.keys(it);
      if (keys.length === 1) {
        key = keys[0];
        value = it[key];
      } else throw new TypeError(`Expected { key: value } tuple: ${it}`);
    } else {
      key = it;
    }
    const pair = schema.createPair(key, value, ctx);
    pairs.items.push(pair);
  }
  return pairs;
}
const pairs = {
  default: false,
  tag: 'tag:yaml.org,2002:pairs',
  resolve: parsePairs,
  createNode: createPairs
};

class YAMLOMap extends YAMLSeq {
  constructor() {
    super();
    _defineProperty(this, "add", YAMLMap.prototype.add.bind(this));
    _defineProperty(this, "delete", YAMLMap.prototype.delete.bind(this));
    _defineProperty(this, "get", YAMLMap.prototype.get.bind(this));
    _defineProperty(this, "has", YAMLMap.prototype.has.bind(this));
    _defineProperty(this, "set", YAMLMap.prototype.set.bind(this));
    this.tag = YAMLOMap.tag;
  }
  toJSON(_, ctx) {
    const map = new Map();
    if (ctx && ctx.onCreate) ctx.onCreate(map);
    for (const pair of this.items) {
      let key, value;
      if (pair instanceof Pair) {
        key = toJSON(pair.key, '', ctx);
        value = toJSON(pair.value, key, ctx);
      } else {
        key = toJSON(pair, '', ctx);
      }
      if (map.has(key)) throw new Error('Ordered maps must not include duplicate keys');
      map.set(key, value);
    }
    return map;
  }
}
_defineProperty(YAMLOMap, "tag", 'tag:yaml.org,2002:omap');
function parseOMap(doc, cst) {
  const pairs = parsePairs(doc, cst);
  const seenKeys = [];
  for (const { key } of pairs.items) {
    if (key instanceof Scalar) {
      if (seenKeys.includes(key.value)) {
        const msg = 'Ordered maps must not include duplicate keys';
        throw new YAMLSemanticError(cst, msg);
      } else {
        seenKeys.push(key.value);
      }
    }
  }
  return Object.assign(new YAMLOMap(), pairs);
}
function createOMap(schema, iterable, ctx) {
  const pairs = createPairs(schema, iterable, ctx);
  const omap = new YAMLOMap();
  omap.items = pairs.items;
  return omap;
}
const omap = {
  identify: value => value instanceof Map,
  nodeClass: YAMLOMap,
  default: false,
  tag: 'tag:yaml.org,2002:omap',
  resolve: parseOMap,
  createNode: createOMap
};

class YAMLSet extends YAMLMap {
  constructor() {
    super();
    this.tag = YAMLSet.tag;
  }
  add(key) {
    const pair = key instanceof Pair ? key : new Pair(key);
    const prev = findPair(this.items, pair.key);
    if (!prev) this.items.push(pair);
  }
  get(key, keepPair) {
    const pair = findPair(this.items, key);
    return !keepPair && pair instanceof Pair
      ? (pair.key instanceof Scalar ? pair.key.value : pair.key)
      : pair;
  }
  set(key, value) {
    if (typeof value !== 'boolean') throw new Error(`Expected boolean value for set(key, value) in a YAML set, not ${typeof value}`);
    const prev = findPair(this.items, key);
    if (prev && !value) {
      this.items.splice(this.items.indexOf(prev), 1);
    } else if (!prev && value) {
      this.items.push(new Pair(key));
    }
  }
  toJSON(_, ctx) {
    return super.toJSON(_, ctx, Set);
  }
  toString(ctx, onComment, onChompKeep) {
    if (!ctx) return JSON.stringify(this);
    if (this.hasAllNullValues()) return super.toString(ctx, onComment, onChompKeep);
    else throw new Error('Set items must all have null values');
  }
}
_defineProperty(YAMLSet, "tag", 'tag:yaml.org,2002:set');
function parseSet(doc, cst) {
  const map = resolveMap(doc, cst);
  if (!map.hasAllNullValues()) throw new YAMLSemanticError(cst, 'Set items must all have null values');
  return Object.assign(new YAMLSet(), map);
}
function createSet(schema, iterable, ctx) {
  const set = new YAMLSet();
  for (const value of iterable) set.items.push(schema.createPair(value, null, ctx));
  return set;
}
const set = {
  identify: value => value instanceof Set,
  nodeClass: YAMLSet,
  default: false,
  tag: 'tag:yaml.org,2002:set',
  resolve: parseSet,
  createNode: createSet
};

const parseSexagesimal = (sign, parts) => {
  const n = parts.split(':').reduce((n, p) => n * 60 + Number(p), 0);
  return sign === '-' ? -n : n;
};

// hhhh:mm:ss.sss
const stringifySexagesimal = ({ value }) => {
  if (isNaN(value) || !isFinite(value)) return stringifyNumber(value);
  let sign = '';
  if (value < 0) {
    sign = '-';
    value = Math.abs(value);
  }
  const parts = [value % 60]; // seconds, including ms
  if (value < 60) {
    parts.unshift(0); // at least one : is required
  } else {
    value = Math.round((value - parts[0]) / 60);
    parts.unshift(value % 60); // minutes
    if (value >= 60) {
      value = Math.round((value - parts[0]) / 60);
      parts.unshift(value); // hours
    }
  }
  // the trailing replace() cleans up rounding error that % 60 may introduce
  return sign + parts.map(n => n < 10 ? '0' + String(n) : String(n)).join(':').replace(/000000\d*$/, '');
};
const intTime = {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'TIME',
  test: /^([-+]?)([0-9][0-9_]*(?::[0-5]?[0-9])+)$/,
  resolve: (str, sign, parts) => parseSexagesimal(sign, parts.replace(/_/g, '')),
  stringify: stringifySexagesimal
};
const floatTime = {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  format: 'TIME',
  test: /^([-+]?)([0-9][0-9_]*(?::[0-5]?[0-9])+\.[0-9_]*)$/,
  resolve: (str, sign, parts) => parseSexagesimal(sign, parts.replace(/_/g, '')),
  stringify: stringifySexagesimal
};
const timestamp = {
  identify: value => value instanceof Date,
  default: true,
  tag: 'tag:yaml.org,2002:timestamp',
  // If the time zone is omitted, the timestamp is assumed to be specified in UTC. The time part
  // may be omitted altogether, resulting in a date format. In such a case, the time part is
  // assumed to be 00:00:00Z (start of day, UTC).
  test: RegExp('^(?:' +
    '([0-9]{4})-([0-9]{1,2})-([0-9]{1,2})' + // YYYY-Mm-Dd
    '(?:(?:t|T|[ \\t]+)' + // t | T | whitespace
    '([0-9]{1,2}):([0-9]{1,2}):([0-9]{1,2}(\\.[0-9]+)?)' + // Hh:Mm:Ss(.ss)?
    '(?:[ \\t]*(Z|[-+][012]?[0-9](?::[0-9]{2})?))?' + // Z | +5 | -03:30
    ')?' + ')$'),
  resolve: (str, year, month, day, hour, minute, second, millisec, tz) => {
    if (millisec) millisec = (millisec + '00').substr(1, 3);
    let date = Date.UTC(year, month - 1, day, hour || 0, minute || 0, second || 0, millisec || 0);
    if (tz && tz !== 'Z') {
      let d = parseSexagesimal(tz[0], tz.slice(1));
      if (Math.abs(d) < 30) d *= 60;
      date -= 60000 * d;
    }
    return new Date(date);
  },
  stringify: ({ value }) => value.toISOString().replace(/((T00:00)?:00)?\.000Z$/, '')
};

/* global console, process, YAML_SILENCE_DEPRECATION_WARNINGS, YAML_SILENCE_WARNINGS */

function shouldWarn(deprecation) {
  const env = typeof process !== 'undefined' && process.env || {};
  if (deprecation) {
    if (typeof YAML_SILENCE_DEPRECATION_WARNINGS !== 'undefined') return !YAML_SILENCE_DEPRECATION_WARNINGS;
    return !env.YAML_SILENCE_DEPRECATION_WARNINGS;
  }
  if (typeof YAML_SILENCE_WARNINGS !== 'undefined') return !YAML_SILENCE_WARNINGS;
  return !env.YAML_SILENCE_WARNINGS;
}
function warn(warning, type) {
  if (shouldWarn(false)) {
    const emit = typeof process !== 'undefined' && process.emitWarning;
    // This will throw in Jest if `warning` is an Error instance due to
    // https://github.com/facebook/jest/issues/2549
    if (emit) emit(warning, type);
    else {
      // eslint-disable-next-line no-console
      console.warn(type ? `${type}: ${warning}` : warning);
    }
  }
}
function warnFileDeprecation(filename) {
  if (shouldWarn(true)) {
    const path = filename.replace(/.*yaml[/\\]/i, '').replace(/\.js$/, '').replace(/\\/g, '/');
    warn(`The endpoint 'yaml/${path}' will be removed in a future release.`, 'DeprecationWarning');
  }
}
const warned = {};
function warnOptionDeprecation(name, alternative) {
  if (!warned[name] && shouldWarn(true)) {
    warned[name] = true;
    let msg = `The option '${name}' will be removed in a future release`;
    msg += alternative ? `, use '${alternative}' instead.` : '.';
    warn(msg, 'DeprecationWarning');
  }
}

export { warnOptionDeprecation as a, binary as b, warnFileDeprecation as c, floatTime as f, intTime as i, omap as o, pairs as p, set as s, timestamp as t, warn as w };
1
node_modules/cosmiconfig/node_modules/yaml/browser/index.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('./dist').YAML

2
node_modules/cosmiconfig/node_modules/yaml/browser/map.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').YAMLMap
require('./dist/legacy-exports').warnFileDeprecation(__filename)

2
node_modules/cosmiconfig/node_modules/yaml/browser/pair.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').Pair
require('./dist/legacy-exports').warnFileDeprecation(__filename)

1
node_modules/cosmiconfig/node_modules/yaml/browser/parse-cst.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('./dist/parse-cst').parse

2
node_modules/cosmiconfig/node_modules/yaml/browser/scalar.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').Scalar
require('./dist/legacy-exports').warnFileDeprecation(__filename)

9
node_modules/cosmiconfig/node_modules/yaml/browser/schema.js
generated
vendored
Normal file
@@ -0,0 +1,9 @@
const types = require('./dist/types')
const util = require('./dist/util')

module.exports = types.Schema
module.exports.nullOptions = types.nullOptions
module.exports.strOptions = types.strOptions
module.exports.stringify = util.stringifyString

require('./dist/legacy-exports').warnFileDeprecation(__filename)

2
node_modules/cosmiconfig/node_modules/yaml/browser/seq.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').YAMLSeq
require('./dist/legacy-exports').warnFileDeprecation(__filename)

1
node_modules/cosmiconfig/node_modules/yaml/browser/types.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('./dist/types')

8
node_modules/cosmiconfig/node_modules/yaml/browser/types/binary.js
generated
vendored
Normal file
@@ -0,0 +1,8 @@
'use strict'
Object.defineProperty(exports, '__esModule', { value: true })

const legacy = require('../dist/legacy-exports')
exports.binary = legacy.binary
exports.default = [exports.binary]

legacy.warnFileDeprecation(__filename)

3
node_modules/cosmiconfig/node_modules/yaml/browser/types/omap.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
const legacy = require('../dist/legacy-exports')
module.exports = legacy.omap
legacy.warnFileDeprecation(__filename)

3
node_modules/cosmiconfig/node_modules/yaml/browser/types/pairs.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
const legacy = require('../dist/legacy-exports')
module.exports = legacy.pairs
legacy.warnFileDeprecation(__filename)

3
node_modules/cosmiconfig/node_modules/yaml/browser/types/set.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
const legacy = require('../dist/legacy-exports')
module.exports = legacy.set
legacy.warnFileDeprecation(__filename)

10
node_modules/cosmiconfig/node_modules/yaml/browser/types/timestamp.js
generated
vendored
Normal file
@@ -0,0 +1,10 @@
'use strict'
Object.defineProperty(exports, '__esModule', { value: true })

const legacy = require('../dist/legacy-exports')
exports.default = [legacy.intTime, legacy.floatTime, legacy.timestamp]
exports.floatTime = legacy.floatTime
exports.intTime = legacy.intTime
exports.timestamp = legacy.timestamp

legacy.warnFileDeprecation(__filename)

1
node_modules/cosmiconfig/node_modules/yaml/browser/util.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('./dist/util')

637
node_modules/cosmiconfig/node_modules/yaml/dist/Document-a8d0fbf9.js
generated
vendored
Normal file
@@ -0,0 +1,637 @@
'use strict';

var PlainValue = require('./PlainValue-516d5bc2.js');
var resolveSeq = require('./resolveSeq-95613e94.js');
var Schema = require('./Schema-bcc6c2d7.js');

const defaultOptions = {
  anchorPrefix: 'a',
  customTags: null,
  indent: 2,
  indentSeq: true,
  keepCstNodes: false,
  keepNodeTypes: true,
  keepBlobsInJSON: true,
  mapAsMap: false,
  maxAliasCount: 100,
  prettyErrors: false, // TODO Set true in v2
  simpleKeys: false,
  version: '1.2'
};
const scalarOptions = {
  get binary() {
    return resolveSeq.binaryOptions;
  },
  set binary(opt) {
    Object.assign(resolveSeq.binaryOptions, opt);
  },
  get bool() {
    return resolveSeq.boolOptions;
  },
  set bool(opt) {
    Object.assign(resolveSeq.boolOptions, opt);
  },
  get int() {
    return resolveSeq.intOptions;
  },
  set int(opt) {
    Object.assign(resolveSeq.intOptions, opt);
  },
  get null() {
    return resolveSeq.nullOptions;
  },
  set null(opt) {
    Object.assign(resolveSeq.nullOptions, opt);
  },
  get str() {
    return resolveSeq.strOptions;
  },
  set str(opt) {
    Object.assign(resolveSeq.strOptions, opt);
  }
};
const documentOptions = {
  '1.0': {
    schema: 'yaml-1.1',
    merge: true,
    tagPrefixes: [
      { handle: '!', prefix: PlainValue.defaultTagPrefix },
      { handle: '!!', prefix: 'tag:private.yaml.org,2002:' }
    ]
  },
  '1.1': {
    schema: 'yaml-1.1',
    merge: true,
    tagPrefixes: [
      { handle: '!', prefix: '!' },
      { handle: '!!', prefix: PlainValue.defaultTagPrefix }
    ]
  },
  '1.2': {
    schema: 'core',
    merge: false,
    tagPrefixes: [
      { handle: '!', prefix: '!' },
      { handle: '!!', prefix: PlainValue.defaultTagPrefix }
    ]
  }
};

function stringifyTag(doc, tag) {
  if ((doc.version || doc.options.version) === '1.0') {
    const priv = tag.match(/^tag:private\.yaml\.org,2002:([^:/]+)$/);
    if (priv) return '!' + priv[1];
    const vocab = tag.match(/^tag:([a-zA-Z0-9-]+)\.yaml\.org,2002:(.*)/);
    return vocab ? `!${vocab[1]}/${vocab[2]}` : `!${tag.replace(/^tag:/, '')}`;
  }
  let p = doc.tagPrefixes.find(p => tag.indexOf(p.prefix) === 0);
  if (!p) {
    const dtp = doc.getDefaults().tagPrefixes;
    p = dtp && dtp.find(p => tag.indexOf(p.prefix) === 0);
  }
  if (!p) return tag[0] === '!' ? tag : `!<${tag}>`;
  const suffix = tag.substr(p.prefix.length).replace(/[!,[\]{}]/g, ch => ({
    '!': '%21',
    ',': '%2C',
    '[': '%5B',
    ']': '%5D',
    '{': '%7B',
    '}': '%7D'
  })[ch]);
  return p.handle + suffix;
}

function getTagObject(tags, item) {
  if (item instanceof resolveSeq.Alias) return resolveSeq.Alias;
  if (item.tag) {
    const match = tags.filter(t => t.tag === item.tag);
    if (match.length > 0) return match.find(t => t.format === item.format) || match[0];
  }
  let tagObj, obj;
  if (item instanceof resolveSeq.Scalar) {
    obj = item.value;
    // TODO: deprecate/remove class check
    const match = tags.filter(t => t.identify && t.identify(obj) || t.class && obj instanceof t.class);
    tagObj = match.find(t => t.format === item.format) || match.find(t => !t.format);
  } else {
    obj = item;
    tagObj = tags.find(t => t.nodeClass && obj instanceof t.nodeClass);
  }
  if (!tagObj) {
    const name = obj && obj.constructor ? obj.constructor.name : typeof obj;
    throw new Error(`Tag not resolved for ${name} value`);
  }
  return tagObj;
}

// needs to be called before value stringifier to allow for circular anchor refs
function stringifyProps(node, tagObj, { anchors, doc }) {
  const props = [];
  const anchor = doc.anchors.getName(node);
  if (anchor) {
    anchors[anchor] = node;
    props.push(`&${anchor}`);
  }
  if (node.tag) {
    props.push(stringifyTag(doc, node.tag));
  } else if (!tagObj.default) {
    props.push(stringifyTag(doc, tagObj.tag));
  }
  return props.join(' ');
}
function stringify(item, ctx, onComment, onChompKeep) {
  const { anchors, schema } = ctx.doc;
  let tagObj;
  if (!(item instanceof resolveSeq.Node)) {
    const createCtx = {
      aliasNodes: [],
      onTagObj: o => tagObj = o,
      prevObjects: new Map()
    };
    item = schema.createNode(item, true, null, createCtx);
    for (const alias of createCtx.aliasNodes) {
      alias.source = alias.source.node;
      let name = anchors.getName(alias.source);
      if (!name) {
        name = anchors.newName();
        anchors.map[name] = alias.source;
      }
    }
  }
  if (item instanceof resolveSeq.Pair) return item.toString(ctx, onComment, onChompKeep);
  if (!tagObj) tagObj = getTagObject(schema.tags, item);
  const props = stringifyProps(item, tagObj, ctx);
  if (props.length > 0) ctx.indentAtStart = (ctx.indentAtStart || 0) + props.length + 1;
  const str = typeof tagObj.stringify === 'function'
    ? tagObj.stringify(item, ctx, onComment, onChompKeep)
    : item instanceof resolveSeq.Scalar
      ? resolveSeq.stringifyString(item, ctx, onComment, onChompKeep)
      : item.toString(ctx, onComment, onChompKeep);
  if (!props) return str;
  return item instanceof resolveSeq.Scalar || str[0] === '{' || str[0] === '['
    ? `${props} ${str}`
    : `${props}\n${ctx.indent}${str}`;
}

class Anchors {
  static validAnchorNode(node) {
    return node instanceof resolveSeq.Scalar || node instanceof resolveSeq.YAMLSeq || node instanceof resolveSeq.YAMLMap;
  }
  constructor(prefix) {
    PlainValue._defineProperty(this, "map", Object.create(null));
    this.prefix = prefix;
  }
  createAlias(node, name) {
    this.setAnchor(node, name);
    return new resolveSeq.Alias(node);
  }
  createMergePair(...sources) {
    const merge = new resolveSeq.Merge();
    merge.value.items = sources.map(s => {
      if (s instanceof resolveSeq.Alias) {
        if (s.source instanceof resolveSeq.YAMLMap) return s;
      } else if (s instanceof resolveSeq.YAMLMap) {
        return this.createAlias(s);
      }
      throw new Error('Merge sources must be Map nodes or their Aliases');
    });
    return merge;
  }
  getName(node) {
    const { map } = this;
    return Object.keys(map).find(a => map[a] === node);
  }
  getNames() {
    return Object.keys(this.map);
  }
  getNode(name) {
    return this.map[name];
  }
  newName(prefix) {
    if (!prefix) prefix = this.prefix;
    const names = Object.keys(this.map);
    for (let i = 1; true; ++i) {
      const name = `${prefix}${i}`;
      if (!names.includes(name)) return name;
    }
  }

  // During parsing, map & aliases contain CST nodes
  resolveNodes() {
    const { map, _cstAliases } = this;
    Object.keys(map).forEach(a => {
      map[a] = map[a].resolved;
    });
    _cstAliases.forEach(a => {
      a.source = a.source.resolved;
    });
    delete this._cstAliases;
  }
  setAnchor(node, name) {
    if (node != null && !Anchors.validAnchorNode(node)) {
      throw new Error('Anchors may only be set for Scalar, Seq and Map nodes');
    }
    if (name && /[\x00-\x19\s,[\]{}]/.test(name)) {
      throw new Error('Anchor names must not contain whitespace or control characters');
    }
    const { map } = this;
    const prev = node && Object.keys(map).find(a => map[a] === node);
    if (prev) {
      if (!name) {
        return prev;
      } else if (prev !== name) {
        delete map[prev];
        map[name] = node;
      }
    } else {
      if (!name) {
        if (!node) return null;
        name = this.newName();
      }
      map[name] = node;
    }
    return name;
  }
}

const visit = (node, tags) => {
  if (node && typeof node === 'object') {
    const { tag } = node;
    if (node instanceof resolveSeq.Collection) {
      if (tag) tags[tag] = true;
      node.items.forEach(n => visit(n, tags));
    } else if (node instanceof resolveSeq.Pair) {
      visit(node.key, tags);
      visit(node.value, tags);
    } else if (node instanceof resolveSeq.Scalar) {
      if (tag) tags[tag] = true;
    }
  }
  return tags;
};
const listTagNames = node => Object.keys(visit(node, {}));

function parseContents(doc, contents) {
  const comments = { before: [], after: [] };
  let body = undefined;
  let spaceBefore = false;
  for (const node of contents) {
    if (node.valueRange) {
      if (body !== undefined) {
        const msg = 'Document contains trailing content not separated by a ... or --- line';
        doc.errors.push(new PlainValue.YAMLSyntaxError(node, msg));
        break;
      }
      const res = resolveSeq.resolveNode(doc, node);
      if (spaceBefore) {
        res.spaceBefore = true;
        spaceBefore = false;
      }
      body = res;
    } else if (node.comment !== null) {
      const cc = body === undefined ? comments.before : comments.after;
      cc.push(node.comment);
    } else if (node.type === PlainValue.Type.BLANK_LINE) {
      spaceBefore = true;
      if (body === undefined && comments.before.length > 0 && !doc.commentBefore) {
        // space-separated comments at start are parsed as document comments
        doc.commentBefore = comments.before.join('\n');
        comments.before = [];
      }
    }
  }
  doc.contents = body || null;
  if (!body) {
    doc.comment = comments.before.concat(comments.after).join('\n') || null;
  } else {
    const cb = comments.before.join('\n');
    if (cb) {
      const cbNode = body instanceof resolveSeq.Collection && body.items[0] ? body.items[0] : body;
      cbNode.commentBefore = cbNode.commentBefore ? `${cb}\n${cbNode.commentBefore}` : cb;
    }
    doc.comment = comments.after.join('\n') || null;
  }
}

function resolveTagDirective({ tagPrefixes }, directive) {
  const [handle, prefix] = directive.parameters;
  if (!handle || !prefix) {
    const msg = 'Insufficient parameters given for %TAG directive';
    throw new PlainValue.YAMLSemanticError(directive, msg);
  }
  if (tagPrefixes.some(p => p.handle === handle)) {
    const msg = 'The %TAG directive must only be given at most once per handle in the same document.';
    throw new PlainValue.YAMLSemanticError(directive, msg);
  }
  return { handle, prefix };
}
function resolveYamlDirective(doc, directive) {
  let [version] = directive.parameters;
  if (directive.name === 'YAML:1.0') version = '1.0';
  if (!version) {
    const msg = 'Insufficient parameters given for %YAML directive';
    throw new PlainValue.YAMLSemanticError(directive, msg);
  }
  if (!documentOptions[version]) {
    const v0 = doc.version || doc.options.version;
    const msg = `Document will be parsed as YAML ${v0} rather than YAML ${version}`;
    doc.warnings.push(new PlainValue.YAMLWarning(directive, msg));
  }
  return version;
}
function parseDirectives(doc, directives, prevDoc) {
  const directiveComments = [];
  let hasDirectives = false;
  for (const directive of directives) {
    const { comment, name } = directive;
    switch (name) {
      case 'TAG':
        try {
          doc.tagPrefixes.push(resolveTagDirective(doc, directive));
        } catch (error) {
          doc.errors.push(error);
        }
        hasDirectives = true;
        break;
      case 'YAML':
      case 'YAML:1.0':
        if (doc.version) {
          const msg = 'The %YAML directive must only be given at most once per document.';
          doc.errors.push(new PlainValue.YAMLSemanticError(directive, msg));
        }
        try {
          doc.version = resolveYamlDirective(doc, directive);
        } catch (error) {
          doc.errors.push(error);
        }
        hasDirectives = true;
        break;
      default:
        if (name) {
          const msg = `YAML only supports %TAG and %YAML directives, and not %${name}`;
          doc.warnings.push(new PlainValue.YAMLWarning(directive, msg));
        }
    }
    if (comment) directiveComments.push(comment);
  }
  if (prevDoc && !hasDirectives && '1.1' === (doc.version || prevDoc.version || doc.options.version)) {
    const copyTagPrefix = ({ handle, prefix }) => ({ handle, prefix });
    doc.tagPrefixes = prevDoc.tagPrefixes.map(copyTagPrefix);
    doc.version = prevDoc.version;
  }
  doc.commentBefore = directiveComments.join('\n') || null;
}

function assertCollection(contents) {
  if (contents instanceof resolveSeq.Collection) return true;
  throw new Error('Expected a YAML collection as document contents');
}
class Document {
  constructor(options) {
    this.anchors = new Anchors(options.anchorPrefix);
    this.commentBefore = null;
    this.comment = null;
    this.contents = null;
    this.directivesEndMarker = null;
    this.errors = [];
    this.options = options;
    this.schema = null;
    this.tagPrefixes = [];
    this.version = null;
    this.warnings = [];
  }
  add(value) {
    assertCollection(this.contents);
    return this.contents.add(value);
  }
  addIn(path, value) {
    assertCollection(this.contents);
    this.contents.addIn(path, value);
  }
  delete(key) {
    assertCollection(this.contents);
    return this.contents.delete(key);
  }
  deleteIn(path) {
    if (resolveSeq.isEmptyPath(path)) {
      if (this.contents == null) return false;
      this.contents = null;
      return true;
    }
    assertCollection(this.contents);
    return this.contents.deleteIn(path);
  }
  getDefaults() {
    return Document.defaults[this.version] || Document.defaults[this.options.version] || {};
  }
  get(key, keepScalar) {
    return this.contents instanceof resolveSeq.Collection ? this.contents.get(key, keepScalar) : undefined;
  }
  getIn(path, keepScalar) {
    if (resolveSeq.isEmptyPath(path)) return !keepScalar && this.contents instanceof resolveSeq.Scalar ? this.contents.value : this.contents;
    return this.contents instanceof resolveSeq.Collection ? this.contents.getIn(path, keepScalar) : undefined;
  }
  has(key) {
    return this.contents instanceof resolveSeq.Collection ? this.contents.has(key) : false;
  }
  hasIn(path) {
    if (resolveSeq.isEmptyPath(path)) return this.contents !== undefined;
    return this.contents instanceof resolveSeq.Collection ? this.contents.hasIn(path) : false;
  }
  set(key, value) {
    assertCollection(this.contents);
    this.contents.set(key, value);
  }
  setIn(path, value) {
    if (resolveSeq.isEmptyPath(path)) this.contents = value;
    else {
      assertCollection(this.contents);
      this.contents.setIn(path, value);
    }
  }
  setSchema(id, customTags) {
    if (!id && !customTags && this.schema) return;
    if (typeof id === 'number') id = id.toFixed(1);
    if (id === '1.0' || id === '1.1' || id === '1.2') {
      if (this.version) this.version = id;
      else this.options.version = id;
      delete this.options.schema;
    } else if (id && typeof id === 'string') {
      this.options.schema = id;
    }
    if (Array.isArray(customTags)) this.options.customTags = customTags;
    const opt = Object.assign({}, this.getDefaults(), this.options);
    this.schema = new Schema.Schema(opt);
  }
  parse(node, prevDoc) {
    if (this.options.keepCstNodes) this.cstNode = node;
    if (this.options.keepNodeTypes) this.type = 'DOCUMENT';
    const { directives = [], contents = [], directivesEndMarker, error, valueRange } = node;
    if (error) {
      if (!error.source) error.source = this;
      this.errors.push(error);
    }
    parseDirectives(this, directives, prevDoc);
    if (directivesEndMarker) this.directivesEndMarker = true;
    this.range = valueRange ? [valueRange.start, valueRange.end] : null;
    this.setSchema();
    this.anchors._cstAliases = [];
    parseContents(this, contents);
    this.anchors.resolveNodes();
    if (this.options.prettyErrors) {
      for (const error of this.errors) if (error instanceof PlainValue.YAMLError) error.makePretty();
      for (const warn of this.warnings) if (warn instanceof PlainValue.YAMLError) warn.makePretty();
    }
    return this;
  }
  listNonDefaultTags() {
    return listTagNames(this.contents).filter(t => t.indexOf(Schema.Schema.defaultPrefix) !== 0);
  }
  setTagPrefix(handle, prefix) {
    if (handle[0] !== '!' || handle[handle.length - 1] !== '!') throw new Error('Handle must start and end with !');
    if (prefix) {
      const prev = this.tagPrefixes.find(p => p.handle === handle);
      if (prev) prev.prefix = prefix;
      else this.tagPrefixes.push({ handle, prefix });
    } else {
      this.tagPrefixes = this.tagPrefixes.filter(p => p.handle !== handle);
    }
  }
  toJSON(arg, onAnchor) {
    const {
      keepBlobsInJSON,
|
||||
mapAsMap,
|
||||
maxAliasCount
|
||||
} = this.options;
|
||||
const keep = keepBlobsInJSON && (typeof arg !== 'string' || !(this.contents instanceof resolveSeq.Scalar));
|
||||
const ctx = {
|
||||
doc: this,
|
||||
indentStep: ' ',
|
||||
keep,
|
||||
mapAsMap: keep && !!mapAsMap,
|
||||
maxAliasCount,
|
||||
stringify // Requiring directly in Pair would create circular dependencies
|
||||
};
|
||||
const anchorNames = Object.keys(this.anchors.map);
|
||||
if (anchorNames.length > 0) ctx.anchors = new Map(anchorNames.map(name => [this.anchors.map[name], {
|
||||
alias: [],
|
||||
aliasCount: 0,
|
||||
count: 1
|
||||
}]));
|
||||
const res = resolveSeq.toJSON(this.contents, arg, ctx);
|
||||
if (typeof onAnchor === 'function' && ctx.anchors) for (const {
|
||||
count,
|
||||
res
|
||||
} of ctx.anchors.values()) onAnchor(res, count);
|
||||
return res;
|
||||
}
|
||||
toString() {
|
||||
if (this.errors.length > 0) throw new Error('Document with errors cannot be stringified');
|
||||
const indentSize = this.options.indent;
|
||||
if (!Number.isInteger(indentSize) || indentSize <= 0) {
|
||||
const s = JSON.stringify(indentSize);
|
||||
throw new Error(`"indent" option must be a positive integer, not ${s}`);
|
||||
}
|
||||
this.setSchema();
|
||||
const lines = [];
|
||||
let hasDirectives = false;
|
||||
if (this.version) {
|
||||
let vd = '%YAML 1.2';
|
||||
if (this.schema.name === 'yaml-1.1') {
|
||||
if (this.version === '1.0') vd = '%YAML:1.0';else if (this.version === '1.1') vd = '%YAML 1.1';
|
||||
}
|
||||
lines.push(vd);
|
||||
hasDirectives = true;
|
||||
}
|
||||
const tagNames = this.listNonDefaultTags();
|
||||
this.tagPrefixes.forEach(({
|
||||
handle,
|
||||
prefix
|
||||
}) => {
|
||||
if (tagNames.some(t => t.indexOf(prefix) === 0)) {
|
||||
lines.push(`%TAG ${handle} ${prefix}`);
|
||||
hasDirectives = true;
|
||||
}
|
||||
});
|
||||
if (hasDirectives || this.directivesEndMarker) lines.push('---');
|
||||
if (this.commentBefore) {
|
||||
if (hasDirectives || !this.directivesEndMarker) lines.unshift('');
|
||||
lines.unshift(this.commentBefore.replace(/^/gm, '#'));
|
||||
}
|
||||
const ctx = {
|
||||
anchors: Object.create(null),
|
||||
doc: this,
|
||||
indent: '',
|
||||
indentStep: ' '.repeat(indentSize),
|
||||
stringify // Requiring directly in nodes would create circular dependencies
|
||||
};
|
||||
let chompKeep = false;
|
||||
let contentComment = null;
|
||||
if (this.contents) {
|
||||
if (this.contents instanceof resolveSeq.Node) {
|
||||
if (this.contents.spaceBefore && (hasDirectives || this.directivesEndMarker)) lines.push('');
|
||||
if (this.contents.commentBefore) lines.push(this.contents.commentBefore.replace(/^/gm, '#'));
|
||||
// top-level block scalars need to be indented if followed by a comment
|
||||
ctx.forceBlockIndent = !!this.comment;
|
||||
contentComment = this.contents.comment;
|
||||
}
|
||||
const onChompKeep = contentComment ? null : () => chompKeep = true;
|
||||
const body = stringify(this.contents, ctx, () => contentComment = null, onChompKeep);
|
||||
lines.push(resolveSeq.addComment(body, '', contentComment));
|
||||
} else if (this.contents !== undefined) {
|
||||
lines.push(stringify(this.contents, ctx));
|
||||
}
|
||||
if (this.comment) {
|
||||
if ((!chompKeep || contentComment) && lines[lines.length - 1] !== '') lines.push('');
|
||||
lines.push(this.comment.replace(/^/gm, '#'));
|
||||
}
|
||||
return lines.join('\n') + '\n';
|
||||
}
|
||||
}
|
||||
PlainValue._defineProperty(Document, "defaults", documentOptions);
|
||||
|
||||
exports.Document = Document;
|
||||
exports.defaultOptions = defaultOptions;
|
||||
exports.scalarOptions = scalarOptions;
|
||||
765
node_modules/cosmiconfig/node_modules/yaml/dist/PlainValue-516d5bc2.js
generated
vendored
Normal file
@@ -0,0 +1,765 @@
'use strict';

const Char = {
  ANCHOR: '&',
  COMMENT: '#',
  TAG: '!',
  DIRECTIVES_END: '-',
  DOCUMENT_END: '.'
};
const Type = {
  ALIAS: 'ALIAS',
  BLANK_LINE: 'BLANK_LINE',
  BLOCK_FOLDED: 'BLOCK_FOLDED',
  BLOCK_LITERAL: 'BLOCK_LITERAL',
  COMMENT: 'COMMENT',
  DIRECTIVE: 'DIRECTIVE',
  DOCUMENT: 'DOCUMENT',
  FLOW_MAP: 'FLOW_MAP',
  FLOW_SEQ: 'FLOW_SEQ',
  MAP: 'MAP',
  MAP_KEY: 'MAP_KEY',
  MAP_VALUE: 'MAP_VALUE',
  PLAIN: 'PLAIN',
  QUOTE_DOUBLE: 'QUOTE_DOUBLE',
  QUOTE_SINGLE: 'QUOTE_SINGLE',
  SEQ: 'SEQ',
  SEQ_ITEM: 'SEQ_ITEM'
};
const defaultTagPrefix = 'tag:yaml.org,2002:';
const defaultTags = {
  MAP: 'tag:yaml.org,2002:map',
  SEQ: 'tag:yaml.org,2002:seq',
  STR: 'tag:yaml.org,2002:str'
};

function findLineStarts(src) {
  const ls = [0];
  let offset = src.indexOf('\n');

  while (offset !== -1) {
    offset += 1;
    ls.push(offset);
    offset = src.indexOf('\n', offset);
  }

  return ls;
}

function getSrcInfo(cst) {
  let lineStarts, src;

  if (typeof cst === 'string') {
    lineStarts = findLineStarts(cst);
    src = cst;
  } else {
    if (Array.isArray(cst)) cst = cst[0];

    if (cst && cst.context) {
      if (!cst.lineStarts) cst.lineStarts = findLineStarts(cst.context.src);
      lineStarts = cst.lineStarts;
      src = cst.context.src;
    }
  }

  return { lineStarts, src };
}

/**
 * @typedef {Object} LinePos - One-indexed position in the source
 * @property {number} line
 * @property {number} col
 */

/**
 * Determine the line/col position matching a character offset.
 *
 * Accepts a source string or a CST document as the second parameter. With
 * the latter, starting indices for lines are cached in the document as
 * `lineStarts: number[]`.
 *
 * Returns a one-indexed `{ line, col }` location if found, or
 * `undefined` otherwise.
 *
 * @param {number} offset
 * @param {string|Document|Document[]} cst
 * @returns {?LinePos}
 */
function getLinePos(offset, cst) {
  if (typeof offset !== 'number' || offset < 0) return null;
  const { lineStarts, src } = getSrcInfo(cst);
  if (!lineStarts || !src || offset > src.length) return null;

  for (let i = 0; i < lineStarts.length; ++i) {
    const start = lineStarts[i];

    if (offset < start) {
      return { line: i, col: offset - lineStarts[i - 1] + 1 };
    }

    if (offset === start) return { line: i + 1, col: 1 };
  }

  const line = lineStarts.length;
  return { line, col: offset - lineStarts[line - 1] + 1 };
}

/**
 * Get a specified line from the source.
 *
 * Accepts a source string or a CST document as the second parameter. With
 * the latter, starting indices for lines are cached in the document as
 * `lineStarts: number[]`.
 *
 * Returns the line as a string if found, or `null` otherwise.
 *
 * @param {number} line One-indexed line number
 * @param {string|Document|Document[]} cst
 * @returns {?string}
 */
function getLine(line, cst) {
  const { lineStarts, src } = getSrcInfo(cst);
  if (!lineStarts || !(line >= 1) || line > lineStarts.length) return null;
  const start = lineStarts[line - 1];
  let end = lineStarts[line]; // undefined for last line; that's ok for slice()

  while (end && end > start && src[end - 1] === '\n') --end;

  return src.slice(start, end);
}

/**
 * Pretty-print the starting line from the source indicated by the range `pos`
 *
 * Trims output to `maxWidth` chars while keeping the starting column visible,
 * using `…` at either end to indicate dropped characters.
 *
 * Returns a two-line string (or `null`) with `\n` as separator; the second line
 * will hold appropriately indented `^` marks indicating the column range.
 *
 * @param {Object} pos
 * @param {LinePos} pos.start
 * @param {LinePos} [pos.end]
 * @param {string|Document|Document[]} cst
 * @param {number} [maxWidth=80]
 * @returns {?string}
 */
function getPrettyContext({ start, end }, cst, maxWidth = 80) {
  let src = getLine(start.line, cst);
  if (!src) return null;
  let { col } = start;

  if (src.length > maxWidth) {
    if (col <= maxWidth - 10) {
      src = src.substr(0, maxWidth - 1) + '…';
    } else {
      const halfWidth = Math.round(maxWidth / 2);
      if (src.length > col + halfWidth) src = src.substr(0, col + halfWidth - 1) + '…';
      col -= src.length - maxWidth;
      src = '…' + src.substr(1 - maxWidth);
    }
  }

  let errLen = 1;
  let errEnd = '';

  if (end) {
    if (end.line === start.line && col + (end.col - start.col) <= maxWidth + 1) {
      errLen = end.col - start.col;
    } else {
      errLen = Math.min(src.length + 1, maxWidth) - col;
      errEnd = '…';
    }
  }

  const offset = col > 1 ? ' '.repeat(col - 1) : '';
  const err = '^'.repeat(errLen);
  return `${src}\n${offset}${err}${errEnd}`;
}

class Range {
  static copy(orig) {
    return new Range(orig.start, orig.end);
  }

  constructor(start, end) {
    this.start = start;
    this.end = end || start;
  }

  isEmpty() {
    return typeof this.start !== 'number' || !this.end || this.end <= this.start;
  }

  /**
   * Set `origStart` and `origEnd` to point to the original source range for
   * this node, which may differ due to dropped CR characters.
   *
   * @param {number[]} cr - Positions of dropped CR characters
   * @param {number} offset - Starting index of `cr` from the last call
   * @returns {number} - The next offset, matching the one found for `origStart`
   */
  setOrigRange(cr, offset) {
    const { start, end } = this;

    if (cr.length === 0 || end <= cr[0]) {
      this.origStart = start;
      this.origEnd = end;
      return offset;
    }

    let i = offset;

    while (i < cr.length) {
      if (cr[i] > start) break;
      else ++i;
    }

    this.origStart = start + i;
    const nextOffset = i;

    while (i < cr.length) {
      // if end was at \n, it should now be at \r
      if (cr[i] >= end) break;
      else ++i;
    }

    this.origEnd = end + i;
    return nextOffset;
  }
}

/** Root class of all nodes */
class Node {
  static addStringTerminator(src, offset, str) {
    if (str[str.length - 1] === '\n') return str;
    const next = Node.endOfWhiteSpace(src, offset);
    return next >= src.length || src[next] === '\n' ? str + '\n' : str;
  }

  // ^(---|...)
  static atDocumentBoundary(src, offset, sep) {
    const ch0 = src[offset];
    if (!ch0) return true;
    const prev = src[offset - 1];
    if (prev && prev !== '\n') return false;

    if (sep) {
      if (ch0 !== sep) return false;
    } else {
      if (ch0 !== Char.DIRECTIVES_END && ch0 !== Char.DOCUMENT_END) return false;
    }

    const ch1 = src[offset + 1];
    const ch2 = src[offset + 2];
    if (ch1 !== ch0 || ch2 !== ch0) return false;
    const ch3 = src[offset + 3];
    return !ch3 || ch3 === '\n' || ch3 === '\t' || ch3 === ' ';
  }

  static endOfIdentifier(src, offset) {
    let ch = src[offset];
    const isVerbatim = ch === '<';
    const notOk = isVerbatim ? ['\n', '\t', ' ', '>'] : ['\n', '\t', ' ', '[', ']', '{', '}', ','];

    while (ch && notOk.indexOf(ch) === -1) ch = src[offset += 1];

    if (isVerbatim && ch === '>') offset += 1;
    return offset;
  }

  static endOfIndent(src, offset) {
    let ch = src[offset];

    while (ch === ' ') ch = src[offset += 1];

    return offset;
  }

  static endOfLine(src, offset) {
    let ch = src[offset];

    while (ch && ch !== '\n') ch = src[offset += 1];

    return offset;
  }

  static endOfWhiteSpace(src, offset) {
    let ch = src[offset];

    while (ch === '\t' || ch === ' ') ch = src[offset += 1];

    return offset;
  }

  static startOfLine(src, offset) {
    let ch = src[offset - 1];
    if (ch === '\n') return offset;

    while (ch && ch !== '\n') ch = src[offset -= 1];

    return offset + 1;
  }

  /**
   * End of indentation, or null if the line's indent level is not more
   * than `indent`
   *
   * @param {string} src
   * @param {number} indent
   * @param {number} lineStart
   * @returns {?number}
   */
  static endOfBlockIndent(src, indent, lineStart) {
    const inEnd = Node.endOfIndent(src, lineStart);

    if (inEnd > lineStart + indent) {
      return inEnd;
    } else {
      const wsEnd = Node.endOfWhiteSpace(src, inEnd);
      const ch = src[wsEnd];
      if (!ch || ch === '\n') return wsEnd;
    }

    return null;
  }

  static atBlank(src, offset, endAsBlank) {
    const ch = src[offset];
    return ch === '\n' || ch === '\t' || ch === ' ' || endAsBlank && !ch;
  }

  static nextNodeIsIndented(ch, indentDiff, indicatorAsIndent) {
    if (!ch || indentDiff < 0) return false;
    if (indentDiff > 0) return true;
    return indicatorAsIndent && ch === '-';
  }

  // should be at line or string end, or at next non-whitespace char
  static normalizeOffset(src, offset) {
    const ch = src[offset];
    return !ch ? offset : ch !== '\n' && src[offset - 1] === '\n' ? offset - 1 : Node.endOfWhiteSpace(src, offset);
  }

  // fold single newline into space, multiple newlines to N - 1 newlines
  // presumes src[offset] === '\n'
  static foldNewline(src, offset, indent) {
    let inCount = 0;
    let error = false;
    let fold = '';
    let ch = src[offset + 1];

    while (ch === ' ' || ch === '\t' || ch === '\n') {
      switch (ch) {
        case '\n':
          inCount = 0;
          offset += 1;
          fold += '\n';
          break;

        case '\t':
          if (inCount <= indent) error = true;
          offset = Node.endOfWhiteSpace(src, offset + 2) - 1;
          break;

        case ' ':
          inCount += 1;
          offset += 1;
          break;
      }

      ch = src[offset + 1];
    }

    if (!fold) fold = ' ';
    if (ch && inCount <= indent) error = true;
    return { fold, offset, error };
  }

  constructor(type, props, context) {
    Object.defineProperty(this, 'context', {
      value: context || null,
      writable: true
    });
    this.error = null;
    this.range = null;
    this.valueRange = null;
    this.props = props || [];
    this.type = type;
    this.value = null;
  }

  getPropValue(idx, key, skipKey) {
    if (!this.context) return null;
    const { src } = this.context;
    const prop = this.props[idx];
    return prop && src[prop.start] === key ? src.slice(prop.start + (skipKey ? 1 : 0), prop.end) : null;
  }

  get anchor() {
    for (let i = 0; i < this.props.length; ++i) {
      const anchor = this.getPropValue(i, Char.ANCHOR, true);
      if (anchor != null) return anchor;
    }

    return null;
  }

  get comment() {
    const comments = [];

    for (let i = 0; i < this.props.length; ++i) {
      const comment = this.getPropValue(i, Char.COMMENT, true);
      if (comment != null) comments.push(comment);
    }

    return comments.length > 0 ? comments.join('\n') : null;
  }

  commentHasRequiredWhitespace(start) {
    const { src } = this.context;
    if (this.header && start === this.header.end) return false;
    if (!this.valueRange) return false;
    const { end } = this.valueRange;
    return start !== end || Node.atBlank(src, end - 1);
  }

  get hasComment() {
    if (this.context) {
      const { src } = this.context;

      for (let i = 0; i < this.props.length; ++i) {
        if (src[this.props[i].start] === Char.COMMENT) return true;
      }
    }

    return false;
  }

  get hasProps() {
    if (this.context) {
      const { src } = this.context;

      for (let i = 0; i < this.props.length; ++i) {
        if (src[this.props[i].start] !== Char.COMMENT) return true;
      }
    }

    return false;
  }

  get includesTrailingLines() {
    return false;
  }

  get jsonLike() {
    const jsonLikeTypes = [Type.FLOW_MAP, Type.FLOW_SEQ, Type.QUOTE_DOUBLE, Type.QUOTE_SINGLE];
    return jsonLikeTypes.indexOf(this.type) !== -1;
  }

  get rangeAsLinePos() {
    if (!this.range || !this.context) return undefined;
    const start = getLinePos(this.range.start, this.context.root);
    if (!start) return undefined;
    const end = getLinePos(this.range.end, this.context.root);
    return { start, end };
  }

  get rawValue() {
    if (!this.valueRange || !this.context) return null;
    const { start, end } = this.valueRange;
    return this.context.src.slice(start, end);
  }

  get tag() {
    for (let i = 0; i < this.props.length; ++i) {
      const tag = this.getPropValue(i, Char.TAG, false);

      if (tag != null) {
        if (tag[1] === '<') {
          return { verbatim: tag.slice(2, -1) };
        } else {
          // eslint-disable-next-line no-unused-vars
          const [_, handle, suffix] = tag.match(/^(.*!)([^!]*)$/);
          return { handle, suffix };
        }
      }
    }

    return null;
  }

  get valueRangeContainsNewline() {
    if (!this.valueRange || !this.context) return false;
    const { start, end } = this.valueRange;
    const { src } = this.context;

    for (let i = start; i < end; ++i) {
      if (src[i] === '\n') return true;
    }

    return false;
  }

  parseComment(start) {
    const { src } = this.context;

    if (src[start] === Char.COMMENT) {
      const end = Node.endOfLine(src, start + 1);
      const commentRange = new Range(start, end);
      this.props.push(commentRange);
      return end;
    }

    return start;
  }

  /**
   * Populates the `origStart` and `origEnd` values of all ranges for this
   * node. Extended by child classes to handle descendant nodes.
   *
   * @param {number[]} cr - Positions of dropped CR characters
   * @param {number} offset - Starting index of `cr` from the last call
   * @returns {number} - The next offset, matching the one found for `origStart`
   */
  setOrigRanges(cr, offset) {
    if (this.range) offset = this.range.setOrigRange(cr, offset);
    if (this.valueRange) this.valueRange.setOrigRange(cr, offset);
    this.props.forEach(prop => prop.setOrigRange(cr, offset));
    return offset;
  }

  toString() {
    const { context: { src }, range, value } = this;
    if (value != null) return value;
    const str = src.slice(range.start, range.end);
    return Node.addStringTerminator(src, range.end, str);
  }
}

class YAMLError extends Error {
  constructor(name, source, message) {
    if (!message || !(source instanceof Node)) throw new Error(`Invalid arguments for new ${name}`);
    super();
    this.name = name;
    this.message = message;
    this.source = source;
  }

  makePretty() {
    if (!this.source) return;
    this.nodeType = this.source.type;
    const cst = this.source.context && this.source.context.root;

    if (typeof this.offset === 'number') {
      this.range = new Range(this.offset, this.offset + 1);
      const start = cst && getLinePos(this.offset, cst);

      if (start) {
        const end = { line: start.line, col: start.col + 1 };
        this.linePos = { start, end };
      }

      delete this.offset;
    } else {
      this.range = this.source.range;
      this.linePos = this.source.rangeAsLinePos;
    }

    if (this.linePos) {
      const { line, col } = this.linePos.start;
      this.message += ` at line ${line}, column ${col}`;
      const ctx = cst && getPrettyContext(this.linePos, cst);
      if (ctx) this.message += `:\n\n${ctx}\n`;
    }

    delete this.source;
  }
}

class YAMLReferenceError extends YAMLError {
  constructor(source, message) {
    super('YAMLReferenceError', source, message);
  }
}

class YAMLSemanticError extends YAMLError {
  constructor(source, message) {
    super('YAMLSemanticError', source, message);
  }
}

class YAMLSyntaxError extends YAMLError {
  constructor(source, message) {
    super('YAMLSyntaxError', source, message);
  }
}

class YAMLWarning extends YAMLError {
  constructor(source, message) {
    super('YAMLWarning', source, message);
  }
}

function _defineProperty(e, r, t) {
  return (r = _toPropertyKey(r)) in e ? Object.defineProperty(e, r, {
    value: t,
    enumerable: !0,
    configurable: !0,
    writable: !0
  }) : e[r] = t, e;
}

function _toPrimitive(t, r) {
  if ("object" != typeof t || !t) return t;
  var e = t[Symbol.toPrimitive];

  if (void 0 !== e) {
    var i = e.call(t, r || "default");
    if ("object" != typeof i) return i;
    throw new TypeError("@@toPrimitive must return a primitive value.");
  }

  return ("string" === r ? String : Number)(t);
}

function _toPropertyKey(t) {
  var i = _toPrimitive(t, "string");
  return "symbol" == typeof i ? i : i + "";
}

class PlainValue extends Node {
  static endOfLine(src, start, inFlow) {
    let ch = src[start];
    let offset = start;

    while (ch && ch !== '\n') {
      if (inFlow && (ch === '[' || ch === ']' || ch === '{' || ch === '}' || ch === ',')) break;
      const next = src[offset + 1];
      if (ch === ':' && (!next || next === '\n' || next === '\t' || next === ' ' || inFlow && next === ',')) break;
      if ((ch === ' ' || ch === '\t') && next === '#') break;
      offset += 1;
      ch = next;
    }

    return offset;
  }

  get strValue() {
    if (!this.valueRange || !this.context) return null;
    let { start, end } = this.valueRange;
    const { src } = this.context;
    let ch = src[end - 1];

    while (start < end && (ch === '\n' || ch === '\t' || ch === ' ')) ch = src[--end - 1];

    let str = '';

    for (let i = start; i < end; ++i) {
      const ch = src[i];

      if (ch === '\n') {
        const { fold, offset } = Node.foldNewline(src, i, -1);
        str += fold;
        i = offset;
      } else if (ch === ' ' || ch === '\t') {
        // trim trailing whitespace
        const wsStart = i;
        let next = src[i + 1];

        while (i < end && (next === ' ' || next === '\t')) {
          i += 1;
          next = src[i + 1];
        }

        if (next !== '\n') str += i > wsStart ? src.slice(wsStart, i + 1) : ch;
      } else {
        str += ch;
      }
    }

    const ch0 = src[start];

    switch (ch0) {
      case '\t': {
        const msg = 'Plain value cannot start with a tab character';
        const errors = [new YAMLSemanticError(this, msg)];
        return { errors, str };
      }

      case '@':
      case '`': {
        const msg = `Plain value cannot start with reserved character ${ch0}`;
        const errors = [new YAMLSemanticError(this, msg)];
        return { errors, str };
      }

      default:
        return str;
    }
  }

  parseBlockValue(start) {
    const { indent, inFlow, src } = this.context;
    let offset = start;
    let valueEnd = start;

    for (let ch = src[offset]; ch === '\n'; ch = src[offset]) {
      if (Node.atDocumentBoundary(src, offset + 1)) break;
      const end = Node.endOfBlockIndent(src, indent, offset + 1);
      if (end === null || src[end] === '#') break;

      if (src[end] === '\n') {
        offset = end;
      } else {
        valueEnd = PlainValue.endOfLine(src, end, inFlow);
        offset = valueEnd;
      }
    }

    if (this.valueRange.isEmpty()) this.valueRange.start = start;
    this.valueRange.end = valueEnd;
    return valueEnd;
  }

  /**
   * Parses a plain value from the source
   *
   * Accepted forms are:
   * ```
   * #comment
   *
   * first line
   *
   * first line #comment
   *
   * first line
   * block
   * lines
   *
   * #comment
   * block
   * lines
   * ```
   * where block lines are empty or have an indent level greater than `indent`.
   *
   * @param {ParseContext} context
   * @param {number} start - Index of first character
   * @returns {number} - Index of the character after this scalar, may be `\n`
   */
  parse(context, start) {
    this.context = context;
    const { inFlow, src } = context;
    let offset = start;
    const ch = src[offset];

    if (ch && ch !== '#' && ch !== '\n') {
      offset = PlainValue.endOfLine(src, start, inFlow);
    }

    this.valueRange = new Range(start, offset);
    offset = Node.endOfWhiteSpace(src, offset);
    offset = this.parseComment(offset);

    if (!this.hasComment || this.valueRange.isEmpty()) {
      offset = this.parseBlockValue(offset);
    }

    return offset;
  }
}

exports.Char = Char;
exports.Node = Node;
exports.PlainValue = PlainValue;
exports.Range = Range;
exports.Type = Type;
exports.YAMLError = YAMLError;
exports.YAMLReferenceError = YAMLReferenceError;
exports.YAMLSemanticError = YAMLSemanticError;
exports.YAMLSyntaxError = YAMLSyntaxError;
exports.YAMLWarning = YAMLWarning;
exports._defineProperty = _defineProperty;
exports.defaultTagPrefix = defaultTagPrefix;
exports.defaultTags = defaultTags;
469
node_modules/cosmiconfig/node_modules/yaml/dist/Schema-bcc6c2d7.js
generated
vendored
Normal file
@@ -0,0 +1,469 @@
'use strict';

var PlainValue = require('./PlainValue-516d5bc2.js');
var resolveSeq = require('./resolveSeq-95613e94.js');
var warnings = require('./warnings-793925ce.js');

function createMap(schema, obj, ctx) {
  const map = new resolveSeq.YAMLMap(schema);

  if (obj instanceof Map) {
    for (const [key, value] of obj) map.items.push(schema.createPair(key, value, ctx));
  } else if (obj && typeof obj === 'object') {
    for (const key of Object.keys(obj)) map.items.push(schema.createPair(key, obj[key], ctx));
  }

  if (typeof schema.sortMapEntries === 'function') {
    map.items.sort(schema.sortMapEntries);
  }

  return map;
}

const map = {
  createNode: createMap,
  default: true,
  nodeClass: resolveSeq.YAMLMap,
  tag: 'tag:yaml.org,2002:map',
  resolve: resolveSeq.resolveMap
};

function createSeq(schema, obj, ctx) {
  const seq = new resolveSeq.YAMLSeq(schema);

  if (obj && obj[Symbol.iterator]) {
    for (const it of obj) {
      const v = schema.createNode(it, ctx.wrapScalars, null, ctx);
      seq.items.push(v);
    }
  }

  return seq;
}

const seq = {
  createNode: createSeq,
  default: true,
  nodeClass: resolveSeq.YAMLSeq,
  tag: 'tag:yaml.org,2002:seq',
  resolve: resolveSeq.resolveSeq
};

const string = {
  identify: value => typeof value === 'string',
  default: true,
  tag: 'tag:yaml.org,2002:str',
  resolve: resolveSeq.resolveString,

  stringify(item, ctx, onComment, onChompKeep) {
    ctx = Object.assign({ actualString: true }, ctx);
    return resolveSeq.stringifyString(item, ctx, onComment, onChompKeep);
  },

  options: resolveSeq.strOptions
};

const failsafe = [map, seq, string];

/* global BigInt */
const intIdentify$2 = value => typeof value === 'bigint' || Number.isInteger(value);

const intResolve$1 = (src, part, radix) => resolveSeq.intOptions.asBigInt ? BigInt(src) : parseInt(part, radix);

function intStringify$1(node, radix, prefix) {
  const { value } = node;
  if (intIdentify$2(value) && value >= 0) return prefix + value.toString(radix);
  return resolveSeq.stringifyNumber(node);
}

const nullObj = {
  identify: value => value == null,
  createNode: (schema, value, ctx) => ctx.wrapScalars ? new resolveSeq.Scalar(null) : null,
  default: true,
  tag: 'tag:yaml.org,2002:null',
  test: /^(?:~|[Nn]ull|NULL)?$/,
  resolve: () => null,
  options: resolveSeq.nullOptions,
  stringify: () => resolveSeq.nullOptions.nullStr
};
const boolObj = {
  identify: value => typeof value === 'boolean',
  default: true,
  tag: 'tag:yaml.org,2002:bool',
  test: /^(?:[Tt]rue|TRUE|[Ff]alse|FALSE)$/,
  resolve: str => str[0] === 't' || str[0] === 'T',
  options: resolveSeq.boolOptions,
  stringify: ({ value }) => value ? resolveSeq.boolOptions.trueStr : resolveSeq.boolOptions.falseStr
};
const octObj = {
  identify: value => intIdentify$2(value) && value >= 0,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'OCT',
  test: /^0o([0-7]+)$/,
  resolve: (str, oct) => intResolve$1(str, oct, 8),
  options: resolveSeq.intOptions,
  stringify: node => intStringify$1(node, 8, '0o')
};
const intObj = {
  identify: intIdentify$2,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  test: /^[-+]?[0-9]+$/,
  resolve: str => intResolve$1(str, str, 10),
  options: resolveSeq.intOptions,
  stringify: resolveSeq.stringifyNumber
};
const hexObj = {
  identify: value => intIdentify$2(value) && value >= 0,
  default: true,
  tag: 'tag:yaml.org,2002:int',
  format: 'HEX',
  test: /^0x([0-9a-fA-F]+)$/,
  resolve: (str, hex) => intResolve$1(str, hex, 16),
  options: resolveSeq.intOptions,
  stringify: node => intStringify$1(node, 16, '0x')
};
const nanObj = {
  identify: value => typeof value === 'number',
  default: true,
  tag: 'tag:yaml.org,2002:float',
  test: /^(?:[-+]?\.inf|(\.nan))$/i,
  resolve: (str, nan) => nan ? NaN : str[0] === '-' ? Number.NEGATIVE_INFINITY : Number.POSITIVE_INFINITY,
  stringify: resolveSeq.stringifyNumber
||||
};
|
||||
const expObj = {
|
||||
identify: value => typeof value === 'number',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:float',
|
||||
format: 'EXP',
|
||||
test: /^[-+]?(?:\.[0-9]+|[0-9]+(?:\.[0-9]*)?)[eE][-+]?[0-9]+$/,
|
||||
resolve: str => parseFloat(str),
|
||||
stringify: ({
|
||||
value
|
||||
}) => Number(value).toExponential()
|
||||
};
|
||||
const floatObj = {
|
||||
identify: value => typeof value === 'number',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:float',
|
||||
test: /^[-+]?(?:\.([0-9]+)|[0-9]+\.([0-9]*))$/,
|
||||
resolve(str, frac1, frac2) {
|
||||
const frac = frac1 || frac2;
|
||||
const node = new resolveSeq.Scalar(parseFloat(str));
|
||||
if (frac && frac[frac.length - 1] === '0') node.minFractionDigits = frac.length;
|
||||
return node;
|
||||
},
|
||||
stringify: resolveSeq.stringifyNumber
|
||||
};
|
||||
const core = failsafe.concat([nullObj, boolObj, octObj, intObj, hexObj, nanObj, expObj, floatObj]);
|
||||
|
||||
/* global BigInt */
|
||||
const intIdentify$1 = value => typeof value === 'bigint' || Number.isInteger(value);
|
||||
const stringifyJSON = ({
|
||||
value
|
||||
}) => JSON.stringify(value);
|
||||
const json = [map, seq, {
|
||||
identify: value => typeof value === 'string',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:str',
|
||||
resolve: resolveSeq.resolveString,
|
||||
stringify: stringifyJSON
|
||||
}, {
|
||||
identify: value => value == null,
|
||||
createNode: (schema, value, ctx) => ctx.wrapScalars ? new resolveSeq.Scalar(null) : null,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:null',
|
||||
test: /^null$/,
|
||||
resolve: () => null,
|
||||
stringify: stringifyJSON
|
||||
}, {
|
||||
identify: value => typeof value === 'boolean',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:bool',
|
||||
test: /^true|false$/,
|
||||
resolve: str => str === 'true',
|
||||
stringify: stringifyJSON
|
||||
}, {
|
||||
identify: intIdentify$1,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:int',
|
||||
test: /^-?(?:0|[1-9][0-9]*)$/,
|
||||
resolve: str => resolveSeq.intOptions.asBigInt ? BigInt(str) : parseInt(str, 10),
|
||||
stringify: ({
|
||||
value
|
||||
}) => intIdentify$1(value) ? value.toString() : JSON.stringify(value)
|
||||
}, {
|
||||
identify: value => typeof value === 'number',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:float',
|
||||
test: /^-?(?:0|[1-9][0-9]*)(?:\.[0-9]*)?(?:[eE][-+]?[0-9]+)?$/,
|
||||
resolve: str => parseFloat(str),
|
||||
stringify: stringifyJSON
|
||||
}];
|
||||
json.scalarFallback = str => {
|
||||
throw new SyntaxError(`Unresolved plain scalar ${JSON.stringify(str)}`);
|
||||
};
|
||||
|
||||
/* global BigInt */
|
||||
const boolStringify = ({
|
||||
value
|
||||
}) => value ? resolveSeq.boolOptions.trueStr : resolveSeq.boolOptions.falseStr;
|
||||
const intIdentify = value => typeof value === 'bigint' || Number.isInteger(value);
|
||||
function intResolve(sign, src, radix) {
|
||||
let str = src.replace(/_/g, '');
|
||||
if (resolveSeq.intOptions.asBigInt) {
|
||||
switch (radix) {
|
||||
case 2:
|
||||
str = `0b${str}`;
|
||||
break;
|
||||
case 8:
|
||||
str = `0o${str}`;
|
||||
break;
|
||||
case 16:
|
||||
str = `0x${str}`;
|
||||
break;
|
||||
}
|
||||
const n = BigInt(str);
|
||||
return sign === '-' ? BigInt(-1) * n : n;
|
||||
}
|
||||
const n = parseInt(str, radix);
|
||||
return sign === '-' ? -1 * n : n;
|
||||
}
|
||||
function intStringify(node, radix, prefix) {
|
||||
const {
|
||||
value
|
||||
} = node;
|
||||
if (intIdentify(value)) {
|
||||
const str = value.toString(radix);
|
||||
return value < 0 ? '-' + prefix + str.substr(1) : prefix + str;
|
||||
}
|
||||
return resolveSeq.stringifyNumber(node);
|
||||
}
|
||||
const yaml11 = failsafe.concat([{
|
||||
identify: value => value == null,
|
||||
createNode: (schema, value, ctx) => ctx.wrapScalars ? new resolveSeq.Scalar(null) : null,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:null',
|
||||
test: /^(?:~|[Nn]ull|NULL)?$/,
|
||||
resolve: () => null,
|
||||
options: resolveSeq.nullOptions,
|
||||
stringify: () => resolveSeq.nullOptions.nullStr
|
||||
}, {
|
||||
identify: value => typeof value === 'boolean',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:bool',
|
||||
test: /^(?:Y|y|[Yy]es|YES|[Tt]rue|TRUE|[Oo]n|ON)$/,
|
||||
resolve: () => true,
|
||||
options: resolveSeq.boolOptions,
|
||||
stringify: boolStringify
|
||||
}, {
|
||||
identify: value => typeof value === 'boolean',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:bool',
|
||||
test: /^(?:N|n|[Nn]o|NO|[Ff]alse|FALSE|[Oo]ff|OFF)$/i,
|
||||
resolve: () => false,
|
||||
options: resolveSeq.boolOptions,
|
||||
stringify: boolStringify
|
||||
}, {
|
||||
identify: intIdentify,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:int',
|
||||
format: 'BIN',
|
||||
test: /^([-+]?)0b([0-1_]+)$/,
|
||||
resolve: (str, sign, bin) => intResolve(sign, bin, 2),
|
||||
stringify: node => intStringify(node, 2, '0b')
|
||||
}, {
|
||||
identify: intIdentify,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:int',
|
||||
format: 'OCT',
|
||||
test: /^([-+]?)0([0-7_]+)$/,
|
||||
resolve: (str, sign, oct) => intResolve(sign, oct, 8),
|
||||
stringify: node => intStringify(node, 8, '0')
|
||||
}, {
|
||||
identify: intIdentify,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:int',
|
||||
test: /^([-+]?)([0-9][0-9_]*)$/,
|
||||
resolve: (str, sign, abs) => intResolve(sign, abs, 10),
|
||||
stringify: resolveSeq.stringifyNumber
|
||||
}, {
|
||||
identify: intIdentify,
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:int',
|
||||
format: 'HEX',
|
||||
test: /^([-+]?)0x([0-9a-fA-F_]+)$/,
|
||||
resolve: (str, sign, hex) => intResolve(sign, hex, 16),
|
||||
stringify: node => intStringify(node, 16, '0x')
|
||||
}, {
|
||||
identify: value => typeof value === 'number',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:float',
|
||||
test: /^(?:[-+]?\.inf|(\.nan))$/i,
|
||||
resolve: (str, nan) => nan ? NaN : str[0] === '-' ? Number.NEGATIVE_INFINITY : Number.POSITIVE_INFINITY,
|
||||
stringify: resolveSeq.stringifyNumber
|
||||
}, {
|
||||
identify: value => typeof value === 'number',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:float',
|
||||
format: 'EXP',
|
||||
test: /^[-+]?([0-9][0-9_]*)?(\.[0-9_]*)?[eE][-+]?[0-9]+$/,
|
||||
resolve: str => parseFloat(str.replace(/_/g, '')),
|
||||
stringify: ({
|
||||
value
|
||||
}) => Number(value).toExponential()
|
||||
}, {
|
||||
identify: value => typeof value === 'number',
|
||||
default: true,
|
||||
tag: 'tag:yaml.org,2002:float',
|
||||
test: /^[-+]?(?:[0-9][0-9_]*)?\.([0-9_]*)$/,
|
||||
resolve(str, frac) {
|
||||
const node = new resolveSeq.Scalar(parseFloat(str.replace(/_/g, '')));
|
||||
if (frac) {
|
||||
const f = frac.replace(/_/g, '');
|
||||
if (f[f.length - 1] === '0') node.minFractionDigits = f.length;
|
||||
}
|
||||
return node;
|
||||
},
|
||||
stringify: resolveSeq.stringifyNumber
|
||||
}], warnings.binary, warnings.omap, warnings.pairs, warnings.set, warnings.intTime, warnings.floatTime, warnings.timestamp);
|
||||
|
||||
const schemas = {
|
||||
core,
|
||||
failsafe,
|
||||
json,
|
||||
yaml11
|
||||
};
|
||||
const tags = {
|
||||
binary: warnings.binary,
|
||||
bool: boolObj,
|
||||
float: floatObj,
|
||||
floatExp: expObj,
|
||||
floatNaN: nanObj,
|
||||
floatTime: warnings.floatTime,
|
||||
int: intObj,
|
||||
intHex: hexObj,
|
||||
intOct: octObj,
|
||||
intTime: warnings.intTime,
|
||||
map,
|
||||
null: nullObj,
|
||||
omap: warnings.omap,
|
||||
pairs: warnings.pairs,
|
||||
seq,
|
||||
set: warnings.set,
|
||||
timestamp: warnings.timestamp
|
||||
};
|
||||
|
||||
function findTagObject(value, tagName, tags) {
|
||||
if (tagName) {
|
||||
const match = tags.filter(t => t.tag === tagName);
|
||||
const tagObj = match.find(t => !t.format) || match[0];
|
||||
if (!tagObj) throw new Error(`Tag ${tagName} not found`);
|
||||
return tagObj;
|
||||
}
|
||||
|
||||
// TODO: deprecate/remove class check
|
||||
return tags.find(t => (t.identify && t.identify(value) || t.class && value instanceof t.class) && !t.format);
|
||||
}
|
||||
function createNode(value, tagName, ctx) {
|
||||
if (value instanceof resolveSeq.Node) return value;
|
||||
const {
|
||||
defaultPrefix,
|
||||
onTagObj,
|
||||
prevObjects,
|
||||
schema,
|
||||
wrapScalars
|
||||
} = ctx;
|
||||
if (tagName && tagName.startsWith('!!')) tagName = defaultPrefix + tagName.slice(2);
|
||||
let tagObj = findTagObject(value, tagName, schema.tags);
|
||||
if (!tagObj) {
|
||||
if (typeof value.toJSON === 'function') value = value.toJSON();
|
||||
if (!value || typeof value !== 'object') return wrapScalars ? new resolveSeq.Scalar(value) : value;
|
||||
tagObj = value instanceof Map ? map : value[Symbol.iterator] ? seq : map;
|
||||
}
|
||||
if (onTagObj) {
|
||||
onTagObj(tagObj);
|
||||
delete ctx.onTagObj;
|
||||
}
|
||||
|
||||
// Detect duplicate references to the same object & use Alias nodes for all
|
||||
// after first. The `obj` wrapper allows for circular references to resolve.
|
||||
const obj = {
|
||||
value: undefined,
|
||||
node: undefined
|
||||
};
|
||||
if (value && typeof value === 'object' && prevObjects) {
|
||||
const prev = prevObjects.get(value);
|
||||
if (prev) {
|
||||
const alias = new resolveSeq.Alias(prev); // leaves source dirty; must be cleaned by caller
|
||||
ctx.aliasNodes.push(alias); // defined along with prevObjects
|
||||
return alias;
|
||||
}
|
||||
obj.value = value;
|
||||
prevObjects.set(value, obj);
|
||||
}
|
||||
obj.node = tagObj.createNode ? tagObj.createNode(ctx.schema, value, ctx) : wrapScalars ? new resolveSeq.Scalar(value) : value;
|
||||
if (tagName && obj.node instanceof resolveSeq.Node) obj.node.tag = tagName;
|
||||
return obj.node;
|
||||
}
|
||||
|
||||
function getSchemaTags(schemas, knownTags, customTags, schemaId) {
|
||||
let tags = schemas[schemaId.replace(/\W/g, '')]; // 'yaml-1.1' -> 'yaml11'
|
||||
if (!tags) {
|
||||
const keys = Object.keys(schemas).map(key => JSON.stringify(key)).join(', ');
|
||||
throw new Error(`Unknown schema "${schemaId}"; use one of ${keys}`);
|
||||
}
|
||||
if (Array.isArray(customTags)) {
|
||||
for (const tag of customTags) tags = tags.concat(tag);
|
||||
} else if (typeof customTags === 'function') {
|
||||
tags = customTags(tags.slice());
|
||||
}
|
||||
for (let i = 0; i < tags.length; ++i) {
|
||||
const tag = tags[i];
|
||||
if (typeof tag === 'string') {
|
||||
const tagObj = knownTags[tag];
|
||||
if (!tagObj) {
|
||||
const keys = Object.keys(knownTags).map(key => JSON.stringify(key)).join(', ');
|
||||
throw new Error(`Unknown custom tag "${tag}"; use one of ${keys}`);
|
||||
}
|
||||
tags[i] = tagObj;
|
||||
}
|
||||
}
|
||||
return tags;
|
||||
}
|
||||
|
||||
const sortMapEntriesByKey = (a, b) => a.key < b.key ? -1 : a.key > b.key ? 1 : 0;
|
||||
class Schema {
|
||||
// TODO: remove in v2
|
||||
|
||||
constructor({
|
||||
customTags,
|
||||
merge,
|
||||
schema,
|
||||
sortMapEntries,
|
||||
tags: deprecatedCustomTags
|
||||
}) {
|
||||
this.merge = !!merge;
|
||||
this.name = schema;
|
||||
this.sortMapEntries = sortMapEntries === true ? sortMapEntriesByKey : sortMapEntries || null;
|
||||
if (!customTags && deprecatedCustomTags) warnings.warnOptionDeprecation('tags', 'customTags');
|
||||
this.tags = getSchemaTags(schemas, tags, customTags || deprecatedCustomTags, schema);
|
||||
}
|
||||
createNode(value, wrapScalars, tagName, ctx) {
|
||||
const baseCtx = {
|
||||
defaultPrefix: Schema.defaultPrefix,
|
||||
schema: this,
|
||||
wrapScalars
|
||||
};
|
||||
const createCtx = ctx ? Object.assign(ctx, baseCtx) : baseCtx;
|
||||
return createNode(value, tagName, createCtx);
|
||||
}
|
||||
createPair(key, value, ctx) {
|
||||
if (!ctx) ctx = {
|
||||
wrapScalars: true
|
||||
};
|
||||
const k = this.createNode(key, ctx.wrapScalars, null, ctx);
|
||||
const v = this.createNode(value, ctx.wrapScalars, null, ctx);
|
||||
return new resolveSeq.Pair(k, v);
|
||||
}
|
||||
}
|
||||
PlainValue._defineProperty(Schema, "defaultPrefix", PlainValue.defaultTagPrefix);
|
||||
// TODO: remove in v2
|
||||
PlainValue._defineProperty(Schema, "defaultTags", PlainValue.defaultTags);
|
||||
|
||||
exports.Schema = Schema;
|
||||
67
node_modules/cosmiconfig/node_modules/yaml/dist/index.js
generated
vendored
Normal file
@@ -0,0 +1,67 @@
'use strict';

var parseCst = require('./parse-cst.js');
var Document$1 = require('./Document-a8d0fbf9.js');
var Schema = require('./Schema-bcc6c2d7.js');
var PlainValue = require('./PlainValue-516d5bc2.js');
var warnings = require('./warnings-793925ce.js');
require('./resolveSeq-95613e94.js');

function createNode(value, wrapScalars = true, tag) {
  if (tag === undefined && typeof wrapScalars === 'string') {
    tag = wrapScalars;
    wrapScalars = true;
  }
  const options = Object.assign({}, Document$1.Document.defaults[Document$1.defaultOptions.version], Document$1.defaultOptions);
  const schema = new Schema.Schema(options);
  return schema.createNode(value, wrapScalars, tag);
}

class Document extends Document$1.Document {
  constructor(options) {
    super(Object.assign({}, Document$1.defaultOptions, options));
  }
}

function parseAllDocuments(src, options) {
  const stream = [];
  let prev;
  for (const cstDoc of parseCst.parse(src)) {
    const doc = new Document(options);
    doc.parse(cstDoc, prev);
    stream.push(doc);
    prev = doc;
  }
  return stream;
}

function parseDocument(src, options) {
  const cst = parseCst.parse(src);
  const doc = new Document(options).parse(cst[0]);
  if (cst.length > 1) {
    const errMsg = 'Source contains multiple documents; please use YAML.parseAllDocuments()';
    doc.errors.unshift(new PlainValue.YAMLSemanticError(cst[1], errMsg));
  }
  return doc;
}

function parse(src, options) {
  const doc = parseDocument(src, options);
  doc.warnings.forEach(warning => warnings.warn(warning));
  if (doc.errors.length > 0) throw doc.errors[0];
  return doc.toJSON();
}

function stringify(value, options) {
  const doc = new Document(options);
  doc.contents = value;
  return String(doc);
}

const YAML = {
  createNode,
  defaultOptions: Document$1.defaultOptions,
  Document,
  parse,
  parseAllDocuments,
  parseCST: parseCst.parse,
  parseDocument,
  scalarOptions: Document$1.scalarOptions,
  stringify
};

exports.YAML = YAML;
16
node_modules/cosmiconfig/node_modules/yaml/dist/legacy-exports.js
generated
vendored
Normal file
@@ -0,0 +1,16 @@
'use strict';

var warnings = require('./warnings-793925ce.js');
require('./PlainValue-516d5bc2.js');
require('./resolveSeq-95613e94.js');

exports.binary = warnings.binary;
exports.floatTime = warnings.floatTime;
exports.intTime = warnings.intTime;
exports.omap = warnings.omap;
exports.pairs = warnings.pairs;
exports.set = warnings.set;
exports.timestamp = warnings.timestamp;
exports.warnFileDeprecation = warnings.warnFileDeprecation;
1507
node_modules/cosmiconfig/node_modules/yaml/dist/parse-cst.js
generated
vendored
Normal file
File diff suppressed because it is too large
1859
node_modules/cosmiconfig/node_modules/yaml/dist/resolveSeq-95613e94.js
generated
vendored
Normal file
File diff suppressed because it is too large
146
node_modules/cosmiconfig/node_modules/yaml/dist/test-events.js
generated
vendored
Normal file
@@ -0,0 +1,146 @@
|
||||
'use strict';
|
||||
|
||||
var parseCst = require('./parse-cst.js');
|
||||
var Document = require('./Document-a8d0fbf9.js');
|
||||
require('./PlainValue-516d5bc2.js');
|
||||
require('./resolveSeq-95613e94.js');
|
||||
require('./Schema-bcc6c2d7.js');
|
||||
require('./warnings-793925ce.js');
|
||||
|
||||
// test harness for yaml-test-suite event tests
|
||||
function testEvents(src, options) {
|
||||
const opt = Object.assign({
|
||||
keepCstNodes: true,
|
||||
keepNodeTypes: true,
|
||||
version: '1.2'
|
||||
}, options);
|
||||
const docs = parseCst.parse(src).map(cstDoc => new Document.Document(opt).parse(cstDoc));
|
||||
const errDoc = docs.find(doc => doc.errors.length > 0);
|
||||
const error = errDoc ? errDoc.errors[0].message : null;
|
||||
const events = ['+STR'];
|
||||
try {
|
||||
for (let i = 0; i < docs.length; ++i) {
|
||||
const doc = docs[i];
|
||||
let root = doc.contents;
|
||||
if (Array.isArray(root)) root = root[0];
|
||||
const [rootStart, rootEnd] = doc.range || [0, 0];
|
||||
let e = doc.errors[0] && doc.errors[0].source;
|
||||
if (e && e.type === 'SEQ_ITEM') e = e.node;
|
||||
if (e && (e.type === 'DOCUMENT' || e.range.start < rootStart)) throw new Error();
|
||||
let docStart = '+DOC';
|
||||
const pre = src.slice(0, rootStart);
|
||||
const explicitDoc = /---\s*$/.test(pre);
|
||||
if (explicitDoc) docStart += ' ---';else if (!doc.contents) continue;
|
||||
events.push(docStart);
|
||||
addEvents(events, doc, e, root);
|
||||
if (doc.contents && doc.contents.length > 1) throw new Error();
|
||||
let docEnd = '-DOC';
|
||||
if (rootEnd) {
|
||||
const post = src.slice(rootEnd);
|
||||
if (/^\.\.\./.test(post)) docEnd += ' ...';
|
||||
}
|
||||
events.push(docEnd);
|
||||
}
|
||||
} catch (e) {
|
||||
return {
|
||||
events,
|
||||
error: error || e
|
||||
};
|
||||
}
|
||||
events.push('-STR');
|
||||
return {
|
||||
events,
|
||||
error
|
||||
};
|
||||
}
|
||||
function addEvents(events, doc, e, node) {
|
||||
if (!node) {
|
||||
events.push('=VAL :');
|
||||
return;
|
||||
}
|
||||
if (e && node.cstNode === e) throw new Error();
|
||||
let props = '';
|
||||
let anchor = doc.anchors.getName(node);
|
||||
if (anchor) {
|
||||
if (/\d$/.test(anchor)) {
|
||||
const alt = anchor.replace(/\d$/, '');
|
||||
if (doc.anchors.getNode(alt)) anchor = alt;
|
||||
}
|
||||
props = ` &${anchor}`;
|
||||
}
|
||||
if (node.cstNode && node.cstNode.tag) {
|
||||
const {
|
||||
handle,
|
||||
suffix
|
||||
} = node.cstNode.tag;
|
||||
props += handle === '!' && !suffix ? ' <!>' : ` <${node.tag}>`;
|
||||
}
|
||||
let scalar = null;
|
||||
switch (node.type) {
|
||||
case 'ALIAS':
|
||||
{
|
||||
let alias = doc.anchors.getName(node.source);
|
||||
if (/\d$/.test(alias)) {
|
||||
const alt = alias.replace(/\d$/, '');
|
||||
if (doc.anchors.getNode(alt)) alias = alt;
|
||||
}
|
||||
events.push(`=ALI${props} *${alias}`);
|
||||
}
|
||||
break;
|
||||
case 'BLOCK_FOLDED':
|
||||
scalar = '>';
|
||||
break;
|
||||
case 'BLOCK_LITERAL':
|
||||
scalar = '|';
|
||||
break;
|
||||
case 'PLAIN':
|
||||
scalar = ':';
|
||||
break;
|
||||
case 'QUOTE_DOUBLE':
|
||||
scalar = '"';
|
||||
break;
|
||||
case 'QUOTE_SINGLE':
|
||||
scalar = "'";
|
||||
break;
|
||||
case 'PAIR':
|
||||
events.push(`+MAP${props}`);
|
||||
addEvents(events, doc, e, node.key);
|
||||
addEvents(events, doc, e, node.value);
|
||||
events.push('-MAP');
|
||||
break;
|
||||
case 'FLOW_SEQ':
|
||||
case 'SEQ':
|
||||
{
|
||||
const ev = node.type === 'FLOW_SEQ' ? '+SEQ []' : '+SEQ';
|
||||
events.push(`${ev}${props}`);
|
||||
node.items.forEach(item => {
|
||||
addEvents(events, doc, e, item);
|
||||
});
|
||||
events.push('-SEQ');
|
||||
break;
|
||||
}
|
||||
case 'FLOW_MAP':
|
||||
case 'MAP':
|
||||
{
|
||||
const ev = node.type === 'FLOW_MAP' ? '+MAP {}' : '+MAP';
|
||||
events.push(`${ev}${props}`);
|
||||
node.items.forEach(({
|
||||
key,
|
||||
value
|
||||
}) => {
|
||||
addEvents(events, doc, e, key);
|
||||
addEvents(events, doc, e, value);
|
||||
});
|
||||
events.push('-MAP');
|
||||
break;
|
||||
}
|
||||
default:
|
||||
throw new Error(`Unexpected node type ${node.type}`);
|
||||
}
|
||||
if (scalar) {
|
||||
const value = node.cstNode.strValue.replace(/\\/g, '\\\\').replace(/\0/g, '\\0').replace(/\x07/g, '\\a').replace(/\x08/g, '\\b').replace(/\t/g, '\\t').replace(/\n/g, '\\n').replace(/\v/g, '\\v').replace(/\f/g, '\\f').replace(/\r/g, '\\r').replace(/\x1b/g, '\\e');
|
||||
events.push(`=VAL${props} ${scalar}${value}`);
|
||||
}
|
||||
}
|
||||
|
||||
exports.testEvents = testEvents;
|
||||
23
node_modules/cosmiconfig/node_modules/yaml/dist/types.js
generated
vendored
Normal file
@@ -0,0 +1,23 @@
'use strict';

var resolveSeq = require('./resolveSeq-95613e94.js');
var Schema = require('./Schema-bcc6c2d7.js');
require('./PlainValue-516d5bc2.js');
require('./warnings-793925ce.js');

exports.Alias = resolveSeq.Alias;
exports.Collection = resolveSeq.Collection;
exports.Merge = resolveSeq.Merge;
exports.Node = resolveSeq.Node;
exports.Pair = resolveSeq.Pair;
exports.Scalar = resolveSeq.Scalar;
exports.YAMLMap = resolveSeq.YAMLMap;
exports.YAMLSeq = resolveSeq.YAMLSeq;
exports.binaryOptions = resolveSeq.binaryOptions;
exports.boolOptions = resolveSeq.boolOptions;
exports.intOptions = resolveSeq.intOptions;
exports.nullOptions = resolveSeq.nullOptions;
exports.strOptions = resolveSeq.strOptions;
exports.Schema = Schema.Schema;
19
node_modules/cosmiconfig/node_modules/yaml/dist/util.js
generated
vendored
Normal file
@@ -0,0 +1,19 @@
'use strict';

var resolveSeq = require('./resolveSeq-95613e94.js');
var PlainValue = require('./PlainValue-516d5bc2.js');

exports.findPair = resolveSeq.findPair;
exports.parseMap = resolveSeq.resolveMap;
exports.parseSeq = resolveSeq.resolveSeq;
exports.stringifyNumber = resolveSeq.stringifyNumber;
exports.stringifyString = resolveSeq.stringifyString;
exports.toJSON = resolveSeq.toJSON;
exports.Type = PlainValue.Type;
exports.YAMLError = PlainValue.YAMLError;
exports.YAMLReferenceError = PlainValue.YAMLReferenceError;
exports.YAMLSemanticError = PlainValue.YAMLSemanticError;
exports.YAMLSyntaxError = PlainValue.YAMLSyntaxError;
exports.YAMLWarning = PlainValue.YAMLWarning;
359
node_modules/cosmiconfig/node_modules/yaml/dist/warnings-793925ce.js
generated
vendored
Normal file
@@ -0,0 +1,359 @@
|
||||
'use strict';
|
||||
|
||||
var PlainValue = require('./PlainValue-516d5bc2.js');
|
||||
var resolveSeq = require('./resolveSeq-95613e94.js');
|
||||
|
||||
/* global atob, btoa, Buffer */
|
||||
const binary = {
|
||||
identify: value => value instanceof Uint8Array,
|
||||
// Buffer inherits from Uint8Array
|
||||
default: false,
|
||||
tag: 'tag:yaml.org,2002:binary',
|
||||
/**
|
||||
* Returns a Buffer in node and an Uint8Array in browsers
|
||||
*
|
||||
* To use the resulting buffer as an image, you'll want to do something like:
|
||||
*
|
||||
* const blob = new Blob([buffer], { type: 'image/jpeg' })
|
||||
* document.querySelector('#photo').src = URL.createObjectURL(blob)
|
||||
*/
|
||||
resolve: (doc, node) => {
|
||||
const src = resolveSeq.resolveString(doc, node);
|
||||
if (typeof Buffer === 'function') {
|
||||
return Buffer.from(src, 'base64');
|
||||
} else if (typeof atob === 'function') {
|
||||
// On IE 11, atob() can't handle newlines
|
||||
const str = atob(src.replace(/[\n\r]/g, ''));
|
||||
const buffer = new Uint8Array(str.length);
|
||||
for (let i = 0; i < str.length; ++i) buffer[i] = str.charCodeAt(i);
|
||||
return buffer;
|
||||
} else {
|
||||
const msg = 'This environment does not support reading binary tags; either Buffer or atob is required';
|
||||
doc.errors.push(new PlainValue.YAMLReferenceError(node, msg));
|
||||
return null;
|
||||
}
|
||||
},
|
||||
options: resolveSeq.binaryOptions,
|
||||
stringify: ({
|
||||
comment,
|
||||
type,
|
||||
value
|
||||
}, ctx, onComment, onChompKeep) => {
|
||||
let src;
|
||||
if (typeof Buffer === 'function') {
|
||||
src = value instanceof Buffer ? value.toString('base64') : Buffer.from(value.buffer).toString('base64');
|
||||
} else if (typeof btoa === 'function') {
|
||||
let s = '';
|
||||
for (let i = 0; i < value.length; ++i) s += String.fromCharCode(value[i]);
|
||||
src = btoa(s);
|
||||
} else {
|
||||
throw new Error('This environment does not support writing binary tags; either Buffer or btoa is required');
|
||||
}
|
||||
if (!type) type = resolveSeq.binaryOptions.defaultType;
|
||||
if (type === PlainValue.Type.QUOTE_DOUBLE) {
|
||||
value = src;
|
||||
} else {
|
||||
const {
|
||||
lineWidth
|
||||
} = resolveSeq.binaryOptions;
|
||||
const n = Math.ceil(src.length / lineWidth);
|
||||
const lines = new Array(n);
|
||||
for (let i = 0, o = 0; i < n; ++i, o += lineWidth) {
|
||||
lines[i] = src.substr(o, lineWidth);
|
||||
}
|
||||
value = lines.join(type === PlainValue.Type.BLOCK_LITERAL ? '\n' : ' ');
|
||||
}
|
||||
return resolveSeq.stringifyString({
|
||||
comment,
|
||||
type,
|
||||
value
|
||||
}, ctx, onComment, onChompKeep);
|
||||
}
|
||||
};
|
||||
|
||||
function parsePairs(doc, cst) {
|
||||
const seq = resolveSeq.resolveSeq(doc, cst);
|
||||
for (let i = 0; i < seq.items.length; ++i) {
|
||||
let item = seq.items[i];
|
||||
if (item instanceof resolveSeq.Pair) continue;else if (item instanceof resolveSeq.YAMLMap) {
|
||||
if (item.items.length > 1) {
|
||||
const msg = 'Each pair must have its own sequence indicator';
|
||||
throw new PlainValue.YAMLSemanticError(cst, msg);
|
||||
}
|
||||
const pair = item.items[0] || new resolveSeq.Pair();
|
||||
if (item.commentBefore) pair.commentBefore = pair.commentBefore ? `${item.commentBefore}\n${pair.commentBefore}` : item.commentBefore;
|
||||
if (item.comment) pair.comment = pair.comment ? `${item.comment}\n${pair.comment}` : item.comment;
|
||||
item = pair;
|
||||
}
|
||||
seq.items[i] = item instanceof resolveSeq.Pair ? item : new resolveSeq.Pair(item);
|
||||
}
|
||||
return seq;
|
||||
}
|
||||
function createPairs(schema, iterable, ctx) {
|
||||
const pairs = new resolveSeq.YAMLSeq(schema);
|
||||
pairs.tag = 'tag:yaml.org,2002:pairs';
|
||||
for (const it of iterable) {
|
||||
let key, value;
|
||||
if (Array.isArray(it)) {
|
||||
if (it.length === 2) {
|
||||
key = it[0];
|
||||
value = it[1];
|
||||
} else throw new TypeError(`Expected [key, value] tuple: ${it}`);
|
||||
} else if (it && it instanceof Object) {
|
||||
const keys = Object.keys(it);
|
||||
if (keys.length === 1) {
|
||||
key = keys[0];
|
||||
value = it[key];
|
||||
} else throw new TypeError(`Expected { key: value } tuple: ${it}`);
|
||||
} else {
|
||||
key = it;
|
||||
}
|
||||
const pair = schema.createPair(key, value, ctx);
|
||||
pairs.items.push(pair);
|
||||
}
|
||||
return pairs;
|
||||
}
|
||||
const pairs = {
|
||||
default: false,
|
||||
tag: 'tag:yaml.org,2002:pairs',
|
||||
resolve: parsePairs,
|
||||
createNode: createPairs
|
||||
};
|
||||
|
||||
class YAMLOMap extends resolveSeq.YAMLSeq {
|
||||
constructor() {
|
||||
super();
|
||||
PlainValue._defineProperty(this, "add", resolveSeq.YAMLMap.prototype.add.bind(this));
PlainValue._defineProperty(this, "delete", resolveSeq.YAMLMap.prototype.delete.bind(this));
PlainValue._defineProperty(this, "get", resolveSeq.YAMLMap.prototype.get.bind(this));
PlainValue._defineProperty(this, "has", resolveSeq.YAMLMap.prototype.has.bind(this));
PlainValue._defineProperty(this, "set", resolveSeq.YAMLMap.prototype.set.bind(this));
this.tag = YAMLOMap.tag;
}
toJSON(_, ctx) {
const map = new Map();
if (ctx && ctx.onCreate) ctx.onCreate(map);
for (const pair of this.items) {
let key, value;
if (pair instanceof resolveSeq.Pair) {
key = resolveSeq.toJSON(pair.key, '', ctx);
value = resolveSeq.toJSON(pair.value, key, ctx);
} else {
key = resolveSeq.toJSON(pair, '', ctx);
}
if (map.has(key)) throw new Error('Ordered maps must not include duplicate keys');
map.set(key, value);
}
return map;
}
}
PlainValue._defineProperty(YAMLOMap, "tag", 'tag:yaml.org,2002:omap');
function parseOMap(doc, cst) {
const pairs = parsePairs(doc, cst);
const seenKeys = [];
for (const {
key
} of pairs.items) {
if (key instanceof resolveSeq.Scalar) {
if (seenKeys.includes(key.value)) {
const msg = 'Ordered maps must not include duplicate keys';
throw new PlainValue.YAMLSemanticError(cst, msg);
} else {
seenKeys.push(key.value);
}
}
}
return Object.assign(new YAMLOMap(), pairs);
}
function createOMap(schema, iterable, ctx) {
const pairs = createPairs(schema, iterable, ctx);
const omap = new YAMLOMap();
omap.items = pairs.items;
return omap;
}
const omap = {
identify: value => value instanceof Map,
nodeClass: YAMLOMap,
default: false,
tag: 'tag:yaml.org,2002:omap',
resolve: parseOMap,
createNode: createOMap
};

class YAMLSet extends resolveSeq.YAMLMap {
constructor() {
super();
this.tag = YAMLSet.tag;
}
add(key) {
const pair = key instanceof resolveSeq.Pair ? key : new resolveSeq.Pair(key);
const prev = resolveSeq.findPair(this.items, pair.key);
if (!prev) this.items.push(pair);
}
get(key, keepPair) {
const pair = resolveSeq.findPair(this.items, key);
return !keepPair && pair instanceof resolveSeq.Pair ? pair.key instanceof resolveSeq.Scalar ? pair.key.value : pair.key : pair;
}
set(key, value) {
if (typeof value !== 'boolean') throw new Error(`Expected boolean value for set(key, value) in a YAML set, not ${typeof value}`);
const prev = resolveSeq.findPair(this.items, key);
if (prev && !value) {
this.items.splice(this.items.indexOf(prev), 1);
} else if (!prev && value) {
this.items.push(new resolveSeq.Pair(key));
}
}
toJSON(_, ctx) {
return super.toJSON(_, ctx, Set);
}
toString(ctx, onComment, onChompKeep) {
if (!ctx) return JSON.stringify(this);
if (this.hasAllNullValues()) return super.toString(ctx, onComment, onChompKeep);else throw new Error('Set items must all have null values');
}
}
PlainValue._defineProperty(YAMLSet, "tag", 'tag:yaml.org,2002:set');
function parseSet(doc, cst) {
const map = resolveSeq.resolveMap(doc, cst);
if (!map.hasAllNullValues()) throw new PlainValue.YAMLSemanticError(cst, 'Set items must all have null values');
return Object.assign(new YAMLSet(), map);
}
function createSet(schema, iterable, ctx) {
const set = new YAMLSet();
for (const value of iterable) set.items.push(schema.createPair(value, null, ctx));
return set;
}
const set = {
identify: value => value instanceof Set,
nodeClass: YAMLSet,
default: false,
tag: 'tag:yaml.org,2002:set',
resolve: parseSet,
createNode: createSet
};

const parseSexagesimal = (sign, parts) => {
const n = parts.split(':').reduce((n, p) => n * 60 + Number(p), 0);
return sign === '-' ? -n : n;
};

// hhhh:mm:ss.sss
const stringifySexagesimal = ({
value
}) => {
if (isNaN(value) || !isFinite(value)) return resolveSeq.stringifyNumber(value);
let sign = '';
if (value < 0) {
sign = '-';
value = Math.abs(value);
}
const parts = [value % 60]; // seconds, including ms
if (value < 60) {
parts.unshift(0); // at least one : is required
} else {
value = Math.round((value - parts[0]) / 60);
parts.unshift(value % 60); // minutes
if (value >= 60) {
value = Math.round((value - parts[0]) / 60);
parts.unshift(value); // hours
}
}
return sign + parts.map(n => n < 10 ? '0' + String(n) : String(n)).join(':').replace(/000000\d*$/, '') // % 60 may introduce error
;
};
const intTime = {
identify: value => typeof value === 'number',
default: true,
tag: 'tag:yaml.org,2002:int',
format: 'TIME',
test: /^([-+]?)([0-9][0-9_]*(?::[0-5]?[0-9])+)$/,
resolve: (str, sign, parts) => parseSexagesimal(sign, parts.replace(/_/g, '')),
stringify: stringifySexagesimal
};
const floatTime = {
identify: value => typeof value === 'number',
default: true,
tag: 'tag:yaml.org,2002:float',
format: 'TIME',
test: /^([-+]?)([0-9][0-9_]*(?::[0-5]?[0-9])+\.[0-9_]*)$/,
resolve: (str, sign, parts) => parseSexagesimal(sign, parts.replace(/_/g, '')),
stringify: stringifySexagesimal
};
const timestamp = {
identify: value => value instanceof Date,
default: true,
tag: 'tag:yaml.org,2002:timestamp',
// If the time zone is omitted, the timestamp is assumed to be specified in UTC. The time part
// may be omitted altogether, resulting in a date format. In such a case, the time part is
// assumed to be 00:00:00Z (start of day, UTC).
test: RegExp('^(?:' + '([0-9]{4})-([0-9]{1,2})-([0-9]{1,2})' +
// YYYY-Mm-Dd
'(?:(?:t|T|[ \\t]+)' +
// t | T | whitespace
'([0-9]{1,2}):([0-9]{1,2}):([0-9]{1,2}(\\.[0-9]+)?)' +
// Hh:Mm:Ss(.ss)?
'(?:[ \\t]*(Z|[-+][012]?[0-9](?::[0-9]{2})?))?' +
// Z | +5 | -03:30
')?' + ')$'),
resolve: (str, year, month, day, hour, minute, second, millisec, tz) => {
if (millisec) millisec = (millisec + '00').substr(1, 3);
let date = Date.UTC(year, month - 1, day, hour || 0, minute || 0, second || 0, millisec || 0);
if (tz && tz !== 'Z') {
let d = parseSexagesimal(tz[0], tz.slice(1));
if (Math.abs(d) < 30) d *= 60;
date -= 60000 * d;
}
return new Date(date);
},
stringify: ({
value
}) => value.toISOString().replace(/((T00:00)?:00)?\.000Z$/, '')
};

/* global console, process, YAML_SILENCE_DEPRECATION_WARNINGS, YAML_SILENCE_WARNINGS */

function shouldWarn(deprecation) {
const env = typeof process !== 'undefined' && process.env || {};
if (deprecation) {
if (typeof YAML_SILENCE_DEPRECATION_WARNINGS !== 'undefined') return !YAML_SILENCE_DEPRECATION_WARNINGS;
return !env.YAML_SILENCE_DEPRECATION_WARNINGS;
}
if (typeof YAML_SILENCE_WARNINGS !== 'undefined') return !YAML_SILENCE_WARNINGS;
return !env.YAML_SILENCE_WARNINGS;
}
function warn(warning, type) {
if (shouldWarn(false)) {
const emit = typeof process !== 'undefined' && process.emitWarning;
// This will throw in Jest if `warning` is an Error instance due to
// https://github.com/facebook/jest/issues/2549
if (emit) emit(warning, type);else {
// eslint-disable-next-line no-console
console.warn(type ? `${type}: ${warning}` : warning);
}
}
}
function warnFileDeprecation(filename) {
if (shouldWarn(true)) {
const path = filename.replace(/.*yaml[/\\]/i, '').replace(/\.js$/, '').replace(/\\/g, '/');
warn(`The endpoint 'yaml/${path}' will be removed in a future release.`, 'DeprecationWarning');
}
}
const warned = {};
function warnOptionDeprecation(name, alternative) {
if (!warned[name] && shouldWarn(true)) {
warned[name] = true;
let msg = `The option '${name}' will be removed in a future release`;
msg += alternative ? `, use '${alternative}' instead.` : '.';
warn(msg, 'DeprecationWarning');
}
}

exports.binary = binary;
exports.floatTime = floatTime;
exports.intTime = intTime;
exports.omap = omap;
exports.pairs = pairs;
exports.set = set;
exports.timestamp = timestamp;
exports.warn = warn;
exports.warnFileDeprecation = warnFileDeprecation;
exports.warnOptionDeprecation = warnOptionDeprecation;
372
node_modules/cosmiconfig/node_modules/yaml/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,372 @@
import { CST } from './parse-cst'
import {
  AST,
  Alias,
  Collection,
  Merge,
  Node,
  Scalar,
  Schema,
  YAMLMap,
  YAMLSeq
} from './types'
import { Type, YAMLError, YAMLWarning } from './util'

export { AST, CST }
export { default as parseCST } from './parse-cst'

/**
 * `yaml` defines document-specific options in three places: as an argument of
 * parse, create and stringify calls, in the values of `YAML.defaultOptions`,
 * and in the version-dependent `YAML.Document.defaults` object. Values set in
 * `YAML.defaultOptions` override version-dependent defaults, and argument
 * options override both.
 */
export const defaultOptions: Options

export interface Options extends Schema.Options {
  /**
   * Default prefix for anchors.
   *
   * Default: `'a'`, resulting in anchors `a1`, `a2`, etc.
   */
  anchorPrefix?: string
  /**
   * The number of spaces to use when indenting code.
   *
   * Default: `2`
   */
  indent?: number
  /**
   * Whether block sequences should be indented.
   *
   * Default: `true`
   */
  indentSeq?: boolean
  /**
   * Allow non-JSON JavaScript objects to remain in the `toJSON` output.
   * Relevant with the YAML 1.1 `!!timestamp` and `!!binary` tags as well as BigInts.
   *
   * Default: `true`
   */
  keepBlobsInJSON?: boolean
  /**
   * Include references in the AST to each node's corresponding CST node.
   *
   * Default: `false`
   */
  keepCstNodes?: boolean
  /**
   * Store the original node type when parsing documents.
   *
   * Default: `true`
   */
  keepNodeTypes?: boolean
  /**
   * When outputting JS, use Map rather than Object to represent mappings.
   *
   * Default: `false`
   */
  mapAsMap?: boolean
  /**
   * Prevent exponential entity expansion attacks by limiting data aliasing count;
   * set to `-1` to disable checks; `0` disallows all alias nodes.
   *
   * Default: `100`
   */
  maxAliasCount?: number
  /**
   * Include line position & node type directly in errors; drop their verbose source and context.
   *
   * Default: `false`
   */
  prettyErrors?: boolean
  /**
   * When stringifying, require keys to be scalars and to use implicit rather than explicit notation.
   *
   * Default: `false`
   */
  simpleKeys?: boolean
  /**
   * The YAML version used by documents without a `%YAML` directive.
   *
   * Default: `"1.2"`
   */
  version?: '1.0' | '1.1' | '1.2'
}

/**
 * Some customization options are available to control the parsing and
 * stringification of scalars. Note that these values are used by all documents.
 */
export const scalarOptions: {
  binary: scalarOptions.Binary
  bool: scalarOptions.Bool
  int: scalarOptions.Int
  null: scalarOptions.Null
  str: scalarOptions.Str
}
export namespace scalarOptions {
  interface Binary {
    /**
     * The type of string literal used to stringify `!!binary` values.
     *
     * Default: `'BLOCK_LITERAL'`
     */
    defaultType: Scalar.Type
    /**
     * Maximum line width for `!!binary`.
     *
     * Default: `76`
     */
    lineWidth: number
  }

  interface Bool {
    /**
     * String representation for `true`. With the core schema, use `'true' | 'True' | 'TRUE'`.
     *
     * Default: `'true'`
     */
    trueStr: string
    /**
     * String representation for `false`. With the core schema, use `'false' | 'False' | 'FALSE'`.
     *
     * Default: `'false'`
     */
    falseStr: string
  }

  interface Int {
    /**
     * Whether integers should be parsed into BigInt values.
     *
     * Default: `false`
     */
    asBigInt: boolean
  }

  interface Null {
    /**
     * String representation for `null`. With the core schema, use `'null' | 'Null' | 'NULL' | '~' | ''`.
     *
     * Default: `'null'`
     */
    nullStr: string
  }

  interface Str {
    /**
     * The default type of string literal used to stringify values
     *
     * Default: `'PLAIN'`
     */
    defaultType: Scalar.Type
    doubleQuoted: {
      /**
       * Whether to restrict double-quoted strings to use JSON-compatible syntax.
       *
       * Default: `false`
       */
      jsonEncoding: boolean
      /**
       * Minimum length to use multiple lines to represent the value.
       *
       * Default: `40`
       */
      minMultiLineLength: number
    }
    fold: {
      /**
       * Maximum line width (set to `0` to disable folding).
       *
       * Default: `80`
       */
      lineWidth: number
      /**
       * Minimum width for highly-indented content.
       *
       * Default: `20`
       */
      minContentWidth: number
    }
  }
}

export class Document extends Collection {
  cstNode?: CST.Document
  constructor(options?: Options)
  tag: never
  directivesEndMarker?: boolean
  type: Type.DOCUMENT
  /**
   * Anchors associated with the document's nodes;
   * also provides alias & merge node creators.
   */
  anchors: Document.Anchors
  /** The document contents. */
  contents: any
  /** Errors encountered during parsing. */
  errors: YAMLError[]
  /**
   * The schema used with the document. Use `setSchema()` to change or
   * initialise.
   */
  schema?: Schema
  /**
   * Array of prefixes; each will have a string `handle` that
   * starts and ends with `!` and a string `prefix` that the handle will be replaced by.
   */
  tagPrefixes: Document.TagPrefix[]
  /**
   * The parsed version of the source document;
   * if true-ish, stringified output will include a `%YAML` directive.
   */
  version?: string
  /** Warnings encountered during parsing. */
  warnings: YAMLWarning[]
  /**
   * List the tags used in the document that are not in the default
   * `tag:yaml.org,2002:` namespace.
   */
  listNonDefaultTags(): string[]
  /** Parse a CST into this document */
  parse(cst: CST.Document): this
  /**
   * When a document is created with `new YAML.Document()`, the schema object is
   * not set as it may be influenced by parsed directives; call this with no
   * arguments to set it manually, or with arguments to change the schema used
   * by the document.
   **/
  setSchema(
    id?: Options['version'] | Schema.Name,
    customTags?: (Schema.TagId | Schema.Tag)[]
  ): void
  /** Set `handle` as a shorthand string for the `prefix` tag namespace. */
  setTagPrefix(handle: string, prefix: string): void
  /**
   * A plain JavaScript representation of the document `contents`.
   *
   * @param arg Used by `JSON.stringify` to indicate the array index or property
   * name. If its value is a `string` and the document `contents` has a scalar
   * value, the `keepBlobsInJSON` option has no effect.
   * @param onAnchor If defined, called with the resolved `value` and reference
   * `count` for each anchor in the document.
   * */
  toJSON(arg?: string, onAnchor?: (value: any, count: number) => void): any
  /** A YAML representation of the document. */
  toString(): string
}

export namespace Document {
  interface Parsed extends Document {
    contents: Scalar | YAMLMap | YAMLSeq | null
    /** The schema used with the document. */
    schema: Schema
  }

  interface Anchors {
    /**
     * Create a new `Alias` node, adding the required anchor for `node`.
     * If `name` is empty, a new anchor name will be generated.
     */
    createAlias(node: Node, name?: string): Alias
    /**
     * Create a new `Merge` node with the given source nodes.
     * Non-`Alias` sources will be automatically wrapped.
     */
    createMergePair(...nodes: Node[]): Merge
    /** The anchor name associated with `node`, if set. */
    getName(node: Node): undefined | string
    /** List of all defined anchor names. */
    getNames(): string[]
    /** The node associated with the anchor `name`, if set. */
    getNode(name: string): undefined | Node
    /**
     * Find an available anchor name with the given `prefix` and a
     * numerical suffix.
     */
    newName(prefix: string): string
    /**
     * Associate an anchor with `node`. If `name` is empty, a new name will be generated.
     * To remove an anchor, use `setAnchor(null, name)`.
     */
    setAnchor(node: Node | null, name?: string): void | string
  }

  interface TagPrefix {
    handle: string
    prefix: string
  }
}

/**
 * Recursively turns objects into collections. Generic objects as well as `Map`
 * and its descendants become mappings, while arrays and other iterable objects
 * result in sequences.
 *
 * The primary purpose of this function is to enable attaching comments or other
 * metadata to a value, or to otherwise exert more fine-grained control over the
 * stringified output. To that end, you'll need to assign its return value to
 * the `contents` of a Document (or somewhere within said contents), as the
 * document's schema is required for YAML string output.
 *
 * @param wrapScalars If undefined or `true`, also wraps plain values in
 * `Scalar` objects; if `false` and `value` is not an object, it will be
 * returned directly.
 * @param tag Use to specify the collection type, e.g. `"!!omap"`. Note that
 * this requires the corresponding tag to be available based on the default
 * options. To use a specific document's schema, use `doc.schema.createNode`.
 */
export function createNode(
  value: any,
  wrapScalars?: true,
  tag?: string
): YAMLMap | YAMLSeq | Scalar

/**
 * YAML.createNode recursively turns objects into Map and arrays to Seq collections.
 * Its primary use is to enable attaching comments or other metadata to a value,
 * or to otherwise exert more fine-grained control over the stringified output.
 *
 * Doesn't wrap plain values in Scalar objects.
 */
export function createNode(
  value: any,
  wrapScalars: false,
  tag?: string
): YAMLMap | YAMLSeq | string | number | boolean | null

/**
 * Parse an input string into a single YAML.Document.
 */
export function parseDocument(str: string, options?: Options): Document.Parsed

/**
 * Parse the input as a stream of YAML documents.
 *
 * Documents should be separated from each other by `...` or `---` marker lines.
 */
export function parseAllDocuments(
  str: string,
  options?: Options
): Document.Parsed[]

/**
 * Parse an input string into JavaScript.
 *
 * Only supports input consisting of a single YAML document; for multi-document
 * support you should use `YAML.parseAllDocuments`. May throw on error, and may
 * log warnings using `console.warn`.
 *
 * @param str A string with YAML formatting.
 * @returns The value will match the type of the root value of the parsed YAML
 * document, so Maps become objects, Sequences arrays, and scalars result in
 * nulls, booleans, numbers and strings.
 */
export function parse(str: string, options?: Options): any

/**
 * @returns Will always include \n as the last character, as is expected of YAML documents.
 */
export function stringify(value: any, options?: Options): string
1
node_modules/cosmiconfig/node_modules/yaml/index.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('./dist').YAML
2
node_modules/cosmiconfig/node_modules/yaml/map.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').YAMLMap
require('./dist/legacy-exports').warnFileDeprecation(__filename)
105
node_modules/cosmiconfig/node_modules/yaml/package.json
generated
vendored
Normal file
@@ -0,0 +1,105 @@
{
  "name": "yaml",
  "version": "1.10.3",
  "license": "ISC",
  "author": "Eemeli Aro <eemeli@gmail.com>",
  "repository": "github:eemeli/yaml",
  "description": "JavaScript parser and stringifier for YAML",
  "keywords": [
    "YAML",
    "parser",
    "stringifier"
  ],
  "homepage": "https://eemeli.org/yaml/v1/",
  "files": [
    "browser/",
    "dist/",
    "types/",
    "*.d.ts",
    "*.js",
    "*.mjs",
    "!*config.js"
  ],
  "type": "commonjs",
  "main": "./index.js",
  "browser": {
    "./index.js": "./browser/index.js",
    "./map.js": "./browser/map.js",
    "./pair.js": "./browser/pair.js",
    "./parse-cst.js": "./browser/parse-cst.js",
    "./scalar.js": "./browser/scalar.js",
    "./schema.js": "./browser/schema.js",
    "./seq.js": "./browser/seq.js",
    "./types.js": "./browser/types.js",
    "./types.mjs": "./browser/types.js",
    "./types/binary.js": "./browser/types/binary.js",
    "./types/omap.js": "./browser/types/omap.js",
    "./types/pairs.js": "./browser/types/pairs.js",
    "./types/set.js": "./browser/types/set.js",
    "./types/timestamp.js": "./browser/types/timestamp.js",
    "./util.js": "./browser/util.js",
    "./util.mjs": "./browser/util.js"
  },
  "exports": {
    ".": "./index.js",
    "./parse-cst": "./parse-cst.js",
    "./types": [
      {
        "import": "./types.mjs"
      },
      "./types.js"
    ],
    "./util": [
      {
        "import": "./util.mjs"
      },
      "./util.js"
    ],
    "./": "./"
  },
  "scripts": {
    "build": "npm run build:node && npm run build:browser",
    "build:browser": "rollup -c rollup.browser-config.js",
    "build:node": "rollup -c rollup.node-config.js",
    "clean": "git clean -fdxe node_modules",
    "lint": "eslint src/",
    "prettier": "prettier --write .",
    "start": "cross-env TRACE_LEVEL=log npm run build:node && node -i -e 'YAML=require(\".\")'",
    "test": "jest",
    "test:browsers": "cd playground && npm test",
    "test:dist": "npm run build:node && jest",
    "test:types": "tsc --lib ES2017 --noEmit tests/typings.ts",
    "docs:install": "cd docs-slate && bundle install",
    "docs:deploy": "cd docs-slate && ./deploy.sh",
    "docs": "cd docs-slate && bundle exec middleman server",
    "preversion": "npm test && npm run build"
  },
  "browserslist": "> 0.5%, not dead",
  "prettier": {
    "arrowParens": "avoid",
    "semi": false,
    "singleQuote": true,
    "trailingComma": "none"
  },
  "devDependencies": {
    "@babel/core": "^7.12.10",
    "@babel/plugin-proposal-class-properties": "^7.12.1",
    "@babel/preset-env": "^7.12.11",
    "@rollup/plugin-babel": "^5.2.3",
    "babel-eslint": "^10.1.0",
    "babel-jest": "^26.6.3",
    "babel-plugin-trace": "^1.1.0",
    "common-tags": "^1.8.0",
    "cross-env": "^7.0.3",
    "eslint": "^7.19.0",
    "eslint-config-prettier": "^7.2.0",
    "fast-check": "^2.12.0",
    "jest": "^26.6.3",
    "prettier": "^2.2.1",
    "rollup": "^2.38.2",
    "typescript": "^4.1.3"
  },
  "engines": {
    "node": ">= 6"
  }
}
2
node_modules/cosmiconfig/node_modules/yaml/pair.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').Pair
require('./dist/legacy-exports').warnFileDeprecation(__filename)
191
node_modules/cosmiconfig/node_modules/yaml/parse-cst.d.ts
generated
vendored
Normal file
@@ -0,0 +1,191 @@
import { Type, YAMLSyntaxError } from './util'

export default function parseCST(str: string): ParsedCST

export interface ParsedCST extends Array<CST.Document> {
  setOrigRanges(): boolean
}

export namespace CST {
  interface Range {
    start: number
    end: number
    origStart?: number
    origEnd?: number
    isEmpty(): boolean
  }

  interface ParseContext {
    /** Node starts at beginning of line */
    atLineStart: boolean
    /** true if currently in a collection context */
    inCollection: boolean
    /** true if currently in a flow context */
    inFlow: boolean
    /** Current level of indentation */
    indent: number
    /** Start of the current line */
    lineStart: number
    /** The parent of the node */
    parent: Node
    /** Source of the YAML document */
    src: string
  }

  interface Node {
    context: ParseContext | null
    /** if not null, indicates a parser failure */
    error: YAMLSyntaxError | null
    /** span of context.src parsed into this node */
    range: Range | null
    valueRange: Range | null
    /** anchors, tags and comments */
    props: Range[]
    /** specific node type */
    type: Type
    /** if non-null, overrides source value */
    value: string | null

    readonly anchor: string | null
    readonly comment: string | null
    readonly hasComment: boolean
    readonly hasProps: boolean
    readonly jsonLike: boolean
    readonly rangeAsLinePos: null | {
      start: { line: number; col: number }
      end?: { line: number; col: number }
    }
    readonly rawValue: string | null
    readonly tag:
      | null
      | { verbatim: string }
      | { handle: string; suffix: string }
    readonly valueRangeContainsNewline: boolean
  }

  interface Alias extends Node {
    type: Type.ALIAS
    /** contain the anchor without the * prefix */
    readonly rawValue: string
  }

  type Scalar = BlockValue | PlainValue | QuoteValue

  interface BlockValue extends Node {
    type: Type.BLOCK_FOLDED | Type.BLOCK_LITERAL
    chomping: 'CLIP' | 'KEEP' | 'STRIP'
    blockIndent: number | null
    header: Range
    readonly strValue: string | null
  }

  interface BlockFolded extends BlockValue {
    type: Type.BLOCK_FOLDED
  }

  interface BlockLiteral extends BlockValue {
    type: Type.BLOCK_LITERAL
  }

  interface PlainValue extends Node {
    type: Type.PLAIN
    readonly strValue: string | null
  }

  interface QuoteValue extends Node {
    type: Type.QUOTE_DOUBLE | Type.QUOTE_SINGLE
    readonly strValue:
      | null
      | string
      | { str: string; errors: YAMLSyntaxError[] }
  }

  interface QuoteDouble extends QuoteValue {
    type: Type.QUOTE_DOUBLE
  }

  interface QuoteSingle extends QuoteValue {
    type: Type.QUOTE_SINGLE
  }

  interface Comment extends Node {
    type: Type.COMMENT
    readonly anchor: null
    readonly comment: string
    readonly rawValue: null
    readonly tag: null
  }

  interface BlankLine extends Node {
    type: Type.BLANK_LINE
  }

  interface MapItem extends Node {
    type: Type.MAP_KEY | Type.MAP_VALUE
    node: ContentNode | null
  }

  interface MapKey extends MapItem {
    type: Type.MAP_KEY
  }

  interface MapValue extends MapItem {
    type: Type.MAP_VALUE
  }

  interface Map extends Node {
    type: Type.MAP
    /** implicit keys are not wrapped */
    items: Array<BlankLine | Comment | Alias | Scalar | MapItem>
  }

  interface SeqItem extends Node {
    type: Type.SEQ_ITEM
    node: ContentNode | null
  }

  interface Seq extends Node {
    type: Type.SEQ
    items: Array<BlankLine | Comment | SeqItem>
  }

  interface FlowChar {
    char: '{' | '}' | '[' | ']' | ',' | '?' | ':'
    offset: number
    origOffset?: number
  }

  interface FlowCollection extends Node {
    type: Type.FLOW_MAP | Type.FLOW_SEQ
    items: Array<
      FlowChar | BlankLine | Comment | Alias | Scalar | FlowCollection
    >
  }

  interface FlowMap extends FlowCollection {
    type: Type.FLOW_MAP
  }

  interface FlowSeq extends FlowCollection {
    type: Type.FLOW_SEQ
  }

  type ContentNode = Alias | Scalar | Map | Seq | FlowCollection

  interface Directive extends Node {
    type: Type.DIRECTIVE
    name: string
    readonly anchor: null
    readonly parameters: string[]
    readonly tag: null
  }

  interface Document extends Node {
    type: Type.DOCUMENT
    directives: Array<BlankLine | Comment | Directive>
    contents: Array<BlankLine | Comment | ContentNode>
    readonly anchor: null
    readonly comment: null
    readonly tag: null
  }
}
1
node_modules/cosmiconfig/node_modules/yaml/parse-cst.js
generated
vendored
Normal file
@@ -0,0 +1 @@
module.exports = require('./dist/parse-cst').parse
2
node_modules/cosmiconfig/node_modules/yaml/scalar.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').Scalar
require('./dist/legacy-exports').warnFileDeprecation(__filename)
9
node_modules/cosmiconfig/node_modules/yaml/schema.js
generated
vendored
Normal file
@@ -0,0 +1,9 @@
const types = require('./dist/types')
const util = require('./dist/util')

module.exports = types.Schema
module.exports.nullOptions = types.nullOptions
module.exports.strOptions = types.strOptions
module.exports.stringify = util.stringifyString

require('./dist/legacy-exports').warnFileDeprecation(__filename)
2
node_modules/cosmiconfig/node_modules/yaml/seq.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
module.exports = require('./dist/types').YAMLSeq
require('./dist/legacy-exports').warnFileDeprecation(__filename)
407
node_modules/cosmiconfig/node_modules/yaml/types.d.ts
generated
vendored
Normal file
@@ -0,0 +1,407 @@
import { Document, scalarOptions } from './index'
import { CST } from './parse-cst'
import { Type } from './util'

export const binaryOptions: scalarOptions.Binary
export const boolOptions: scalarOptions.Bool
export const intOptions: scalarOptions.Int
export const nullOptions: scalarOptions.Null
export const strOptions: scalarOptions.Str

export class Schema {
  /** Default: `'tag:yaml.org,2002:'` */
  static defaultPrefix: string
  static defaultTags: {
    /** Default: `'tag:yaml.org,2002:map'` */
    MAP: string
    /** Default: `'tag:yaml.org,2002:seq'` */
    SEQ: string
    /** Default: `'tag:yaml.org,2002:str'` */
    STR: string
  }
  constructor(options: Schema.Options)
  /**
   * Convert any value into a `Node` using this schema, recursively turning
   * objects into collections.
   *
   * @param wrapScalars If `true`, also wraps plain values in `Scalar` objects;
   * if undefined or `false` and `value` is not an object, it will be returned
   * directly.
   * @param tag Use to specify the collection type, e.g. `"!!omap"`. Note that
   * this requires the corresponding tag to be available in this schema.
   */
  createNode(
    value: any,
    wrapScalars?: boolean,
    tag?: string,
    ctx?: Schema.CreateNodeContext
  ): Node
  /**
   * Convert a key and a value into a `Pair` using this schema, recursively
   * wrapping all values as `Scalar` or `Collection` nodes.
   *
   * @param ctx To not wrap scalars, use a context `{ wrapScalars: false }`
   */
  createPair(key: any, value: any, ctx?: Schema.CreateNodeContext): Pair
  merge: boolean
  name: Schema.Name
  sortMapEntries: ((a: Pair, b: Pair) => number) | null
  tags: Schema.Tag[]
}

export namespace Schema {
  type Name = 'core' | 'failsafe' | 'json' | 'yaml-1.1'

  interface Options {
    /**
     * Array of additional tags to include in the schema, or a function that may
     * modify the schema's base tag array.
     */
    customTags?: (TagId | Tag)[] | ((tags: Tag[]) => Tag[])
    /**
     * Enable support for `<<` merge keys.
     *
     * Default: `false` for YAML 1.2, `true` for earlier versions
     */
    merge?: boolean
    /**
     * The base schema to use.
     *
     * Default: `"core"` for YAML 1.2, `"yaml-1.1"` for earlier versions
     */
    schema?: Name
    /**
     * When stringifying, sort map entries. If `true`, sort by comparing key values with `<`.
     *
     * Default: `false`
     */
    sortMapEntries?: boolean | ((a: Pair, b: Pair) => number)
    /**
     * @deprecated Use `customTags` instead.
     */
    tags?: Options['customTags']
  }

  interface CreateNodeContext {
    wrapScalars?: boolean
    [key: string]: any
  }

  interface StringifyContext {
    forceBlockIndent?: boolean
    implicitKey?: boolean
    indent?: string
    indentAtStart?: number
    inFlow?: boolean
    [key: string]: any
  }

  type TagId =
    | 'binary'
    | 'bool'
    | 'float'
    | 'floatExp'
    | 'floatNaN'
    | 'floatTime'
    | 'int'
    | 'intHex'
    | 'intOct'
    | 'intTime'
    | 'null'
    | 'omap'
    | 'pairs'
    | 'set'
    | 'timestamp'

  type Tag = CustomTag | DefaultTag

  interface BaseTag {
    /**
     * An optional factory function, used e.g. by collections when wrapping JS objects as AST nodes.
     */
    createNode?: (
      schema: Schema,
      value: any,
      ctx: Schema.CreateNodeContext
    ) => YAMLMap | YAMLSeq | Scalar
    /**
     * If a tag has multiple forms that should be parsed and/or stringified differently, use `format` to identify them.
     */
    format?: string
    /**
     * Used by `YAML.createNode` to detect your data type, e.g. using `typeof` or
     * `instanceof`.
     */
    identify(value: any): boolean
    /**
     * The `Node` child class that implements this tag. Required for collections and tags that have overlapping JS representations.
     */
    nodeClass?: new () => any
    /**
     * Used by some tags to configure their stringification, where applicable.
     */
    options?: object
    /**
     * Optional function stringifying the AST node in the current context. If your
     * data includes a suitable `.toString()` method, you can probably leave this
     * undefined and use the default stringifier.
     *
     * @param item The node being stringified.
     * @param ctx Contains the stringifying context variables.
     * @param onComment Callback to signal that the stringifier includes the
     * item's comment in its output.
     * @param onChompKeep Callback to signal that the output uses a block scalar
     * type with the `+` chomping indicator.
     */
    stringify?: (
      item: Node,
      ctx: Schema.StringifyContext,
      onComment?: () => void,
      onChompKeep?: () => void
    ) => string
    /**
     * The identifier for your data type, with which its stringified form will be
     * prefixed. Should either be a !-prefixed local `!tag`, or a fully qualified
     * `tag:domain,date:foo`.
     */
    tag: string
  }

  interface CustomTag extends BaseTag {
    /**
     * A JavaScript class that should be matched to this tag, e.g. `Date` for `!!timestamp`.
     * @deprecated Use `Tag.identify` instead
     */
    class?: new () => any
    /**
     * Turns a CST node into an AST node. If returning a non-`Node` value, the
     * output will be wrapped as a `Scalar`.
     */
    resolve(doc: Document, cstNode: CST.Node): Node | any
  }

  interface DefaultTag extends BaseTag {
    /**
     * If `true`, together with `test` allows for values to be stringified without
     * an explicit tag. For most cases, it's unlikely that you'll actually want to
     * use this, even if you first think you do.
     */
    default: true
    /**
     * Alternative form used by default tags; called with `test` match results.
     */
    resolve(...match: string[]): Node | any
    /**
     * Together with `default` allows for values to be stringified without an
     * explicit tag and detected using a regular expression. For most cases, it's
     * unlikely that you'll actually want to use these, even if you first think
     * you do.
     */
    test: RegExp
  }
}

export class Node {
  /** A comment on or immediately after this */
  comment?: string | null
  /** A comment before this */
  commentBefore?: string | null
  /** Only available when `keepCstNodes` is set to `true` */
  cstNode?: CST.Node
  /**
   * The [start, end] range of characters of the source parsed
   * into this node (undefined for pairs or if not parsed)
   */
  range?: [number, number] | null
  /** A blank line before this node and its commentBefore */
  spaceBefore?: boolean
  /** A fully qualified tag, if required */
  tag?: string
  /** A plain JS representation of this node */
  toJSON(arg?: any): any
  /** The type of this node */
  type?: Type | Pair.Type
}

export class Scalar extends Node {
  constructor(value: any)
  type?: Scalar.Type
  /**
   * By default (undefined), numbers use decimal notation.
   * The YAML 1.2 core schema only supports 'HEX' and 'OCT'.
   */
  format?: 'BIN' | 'HEX' | 'OCT' | 'TIME'
  value: any
  toJSON(arg?: any, ctx?: AST.NodeToJsonContext): any
  toString(): string
}
export namespace Scalar {
  type Type =
    | Type.BLOCK_FOLDED
    | Type.BLOCK_LITERAL
    | Type.PLAIN
    | Type.QUOTE_DOUBLE
    | Type.QUOTE_SINGLE
}

export class Alias extends Node {
  type: Type.ALIAS
  source: Node
  cstNode?: CST.Alias
  toString(ctx: Schema.StringifyContext): string
}

export class Pair extends Node {
  constructor(key: any, value?: any)
  type: Pair.Type.PAIR | Pair.Type.MERGE_PAIR
  /** Always Node or null when parsed, but can be set to anything. */
  key: any
  /** Always Node or null when parsed, but can be set to anything. */
  value: any
  cstNode?: never // no corresponding cstNode
  toJSON(arg?: any, ctx?: AST.NodeToJsonContext): object | Map<any, any>
  toString(
    ctx?: Schema.StringifyContext,
    onComment?: () => void,
    onChompKeep?: () => void
  ): string
}
export namespace Pair {
  enum Type {
    PAIR = 'PAIR',
    MERGE_PAIR = 'MERGE_PAIR'
  }
}

export class Merge extends Pair {
  type: Pair.Type.MERGE_PAIR
  /** Always Scalar('<<'), defined by the type specification */
  key: AST.PlainValue
  /** Always YAMLSeq<Alias(Map)>, stringified as *A if length = 1 */
  value: YAMLSeq
  toString(ctx?: Schema.StringifyContext, onComment?: () => void): string
}

export class Collection extends Node {
  type?: Type.MAP | Type.FLOW_MAP | Type.SEQ | Type.FLOW_SEQ | Type.DOCUMENT
  items: any[]
  schema?: Schema

  /**
   * Adds a value to the collection. For `!!map` and `!!omap` the value must
   * be a Pair instance or a `{ key, value }` object, which may not have a key
   * that already exists in the map.
   */
  add(value: any): void
  addIn(path: Iterable<any>, value: any): void
  /**
   * Removes a value from the collection.
   * @returns `true` if the item was found and removed.
   */
  delete(key: any): boolean
  deleteIn(path: Iterable<any>): boolean
  /**
   * Returns item at `key`, or `undefined` if not found. By default unwraps
   * scalar values from their surrounding node; to disable set `keepScalar` to
   * `true` (collections are always returned intact).
   */
  get(key: any, keepScalar?: boolean): any
  getIn(path: Iterable<any>, keepScalar?: boolean): any
  /**
   * Checks if the collection includes a value with the key `key`.
   */
  has(key: any): boolean
  hasIn(path: Iterable<any>): boolean
  /**
   * Sets a value in this collection. For `!!set`, `value` needs to be a
   * boolean to add/remove the item from the set.
   */
  set(key: any, value: any): void
  setIn(path: Iterable<any>, value: any): void
}

export class YAMLMap extends Collection {
  type?: Type.FLOW_MAP | Type.MAP
  items: Array<Pair>
  hasAllNullValues(): boolean
  toJSON(arg?: any, ctx?: AST.NodeToJsonContext): object | Map<any, any>
  toString(
    ctx?: Schema.StringifyContext,
    onComment?: () => void,
    onChompKeep?: () => void
  ): string
}

export class YAMLSeq extends Collection {
  type?: Type.FLOW_SEQ | Type.SEQ
  delete(key: number | string | Scalar): boolean
  get(key: number | string | Scalar, keepScalar?: boolean): any
  has(key: number | string | Scalar): boolean
  set(key: number | string | Scalar, value: any): void
  hasAllNullValues(): boolean
  toJSON(arg?: any, ctx?: AST.NodeToJsonContext): any[]
  toString(
    ctx?: Schema.StringifyContext,
    onComment?: () => void,
    onChompKeep?: () => void
  ): string
}

export namespace AST {
  interface NodeToJsonContext {
    anchors?: any[]
    doc: Document
    keep?: boolean
    mapAsMap?: boolean
    maxAliasCount?: number
    onCreate?: (node: Node) => void
    [key: string]: any
  }

  interface BlockFolded extends Scalar {
    type: Type.BLOCK_FOLDED
    cstNode?: CST.BlockFolded
  }

  interface BlockLiteral extends Scalar {
    type: Type.BLOCK_LITERAL
    cstNode?: CST.BlockLiteral
  }

  interface PlainValue extends Scalar {
    type: Type.PLAIN
    cstNode?: CST.PlainValue
  }

  interface QuoteDouble extends Scalar {
    type: Type.QUOTE_DOUBLE
    cstNode?: CST.QuoteDouble
  }

  interface QuoteSingle extends Scalar {
    type: Type.QUOTE_SINGLE
    cstNode?: CST.QuoteSingle
  }

  interface FlowMap extends YAMLMap {
    type: Type.FLOW_MAP
    cstNode?: CST.FlowMap
  }

  interface BlockMap extends YAMLMap {
    type: Type.MAP
    cstNode?: CST.Map
  }

  interface FlowSeq extends YAMLSeq {
    type: Type.FLOW_SEQ
    items: Array<Node>
    cstNode?: CST.FlowSeq
  }

  interface BlockSeq extends YAMLSeq {
    type: Type.SEQ
    items: Array<Node | null>
    cstNode?: CST.Seq
  }
}
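The `sortMapEntries` option documented in the type definitions above takes a comparator over `Pair` objects, with `true` meaning "compare key values with `<`". A minimal stand-alone sketch of that default comparator in plain JavaScript (the `Pair`-like objects here are hypothetical stand-ins, not the library's classes):

```javascript
// Sketch of the comparator implied by `sortMapEntries: true`:
// order pairs by comparing their `key` values with `<`.
function defaultSortMapEntries(a, b) {
  // `a` and `b` stand in for `Pair`-like objects with a `key` property.
  if (a.key < b.key) return -1
  if (a.key > b.key) return 1
  return 0
}

const pairs = [{ key: 'b', value: 2 }, { key: 'a', value: 1 }]
pairs.sort(defaultSortMapEntries)
console.log(pairs.map(p => p.key).join(',')) // 'a,b'
```

Passing a custom function of this shape instead of `true` lets callers impose any ordering on stringified map entries.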
17
node_modules/cosmiconfig/node_modules/yaml/types.js
generated
vendored
Normal file
@@ -0,0 +1,17 @@
const types = require('./dist/types')

exports.binaryOptions = types.binaryOptions
exports.boolOptions = types.boolOptions
exports.intOptions = types.intOptions
exports.nullOptions = types.nullOptions
exports.strOptions = types.strOptions

exports.Schema = types.Schema
exports.Alias = types.Alias
exports.Collection = types.Collection
exports.Merge = types.Merge
exports.Node = types.Node
exports.Pair = types.Pair
exports.Scalar = types.Scalar
exports.YAMLMap = types.YAMLMap
exports.YAMLSeq = types.YAMLSeq
17
node_modules/cosmiconfig/node_modules/yaml/types.mjs
generated
vendored
Normal file
@@ -0,0 +1,17 @@
import types from './dist/types.js'

export const binaryOptions = types.binaryOptions
export const boolOptions = types.boolOptions
export const intOptions = types.intOptions
export const nullOptions = types.nullOptions
export const strOptions = types.strOptions

export const Schema = types.Schema
export const Alias = types.Alias
export const Collection = types.Collection
export const Merge = types.Merge
export const Node = types.Node
export const Pair = types.Pair
export const Scalar = types.Scalar
export const YAMLMap = types.YAMLMap
export const YAMLSeq = types.YAMLSeq
8
node_modules/cosmiconfig/node_modules/yaml/types/binary.js
generated
vendored
Normal file
@@ -0,0 +1,8 @@
'use strict'
Object.defineProperty(exports, '__esModule', { value: true })

const legacy = require('../dist/legacy-exports')
exports.binary = legacy.binary
exports.default = [exports.binary]

legacy.warnFileDeprecation(__filename)
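These shim files all follow the same pattern: re-export from the compiled `dist` bundle, then warn that the legacy entry point is deprecated. A sketch of how such a one-time-per-file warning can be implemented in plain Node.js (assumed names and behavior, not the package's actual implementation):

```javascript
// Deprecation shim pattern: remember which legacy entry files have already
// warned, and emit a structured Node.js warning only on the first require.
const warned = new Set()

function warnFileDeprecation(filename) {
  if (warned.has(filename)) return // warn at most once per entry file
  warned.add(filename)
  // process.emitWarning is Node's API for structured runtime warnings
  process.emitWarning(
    `The endpoint ${filename} is deprecated; import from the package root instead.`,
    'DeprecationWarning'
  )
}

warnFileDeprecation('/pkg/types/binary.js')
warnFileDeprecation('/pkg/types/binary.js') // second call is a no-op
console.log(warned.size) // 1
```

Keeping the guard set at module scope means the warning fires once per process, however many times the legacy file is required.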
3
node_modules/cosmiconfig/node_modules/yaml/types/omap.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
const legacy = require('../dist/legacy-exports')
module.exports = legacy.omap
legacy.warnFileDeprecation(__filename)
3
node_modules/cosmiconfig/node_modules/yaml/types/pairs.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
const legacy = require('../dist/legacy-exports')
module.exports = legacy.pairs
legacy.warnFileDeprecation(__filename)
3
node_modules/cosmiconfig/node_modules/yaml/types/set.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
const legacy = require('../dist/legacy-exports')
module.exports = legacy.set
legacy.warnFileDeprecation(__filename)
10
node_modules/cosmiconfig/node_modules/yaml/types/timestamp.js
generated
vendored
Normal file
@@ -0,0 +1,10 @@
'use strict'
Object.defineProperty(exports, '__esModule', { value: true })

const legacy = require('../dist/legacy-exports')
exports.default = [legacy.intTime, legacy.floatTime, legacy.timestamp]
exports.floatTime = legacy.floatTime
exports.intTime = legacy.intTime
exports.timestamp = legacy.timestamp

legacy.warnFileDeprecation(__filename)
86
node_modules/cosmiconfig/node_modules/yaml/util.d.ts
generated
vendored
Normal file
@@ -0,0 +1,86 @@
import { Document } from './index'
import { CST } from './parse-cst'
import { AST, Pair, Scalar, Schema } from './types'

export function findPair(items: any[], key: Scalar | any): Pair | undefined

export function parseMap(doc: Document, cst: CST.Map): AST.BlockMap
export function parseMap(doc: Document, cst: CST.FlowMap): AST.FlowMap
export function parseSeq(doc: Document, cst: CST.Seq): AST.BlockSeq
export function parseSeq(doc: Document, cst: CST.FlowSeq): AST.FlowSeq

export function stringifyNumber(item: Scalar): string
export function stringifyString(
  item: Scalar,
  ctx: Schema.StringifyContext,
  onComment?: () => void,
  onChompKeep?: () => void
): string

export function toJSON(
  value: any,
  arg?: any,
  ctx?: Schema.CreateNodeContext
): any

export enum Type {
  ALIAS = 'ALIAS',
  BLANK_LINE = 'BLANK_LINE',
  BLOCK_FOLDED = 'BLOCK_FOLDED',
  BLOCK_LITERAL = 'BLOCK_LITERAL',
  COMMENT = 'COMMENT',
  DIRECTIVE = 'DIRECTIVE',
  DOCUMENT = 'DOCUMENT',
  FLOW_MAP = 'FLOW_MAP',
  FLOW_SEQ = 'FLOW_SEQ',
  MAP = 'MAP',
  MAP_KEY = 'MAP_KEY',
  MAP_VALUE = 'MAP_VALUE',
  PLAIN = 'PLAIN',
  QUOTE_DOUBLE = 'QUOTE_DOUBLE',
  QUOTE_SINGLE = 'QUOTE_SINGLE',
  SEQ = 'SEQ',
  SEQ_ITEM = 'SEQ_ITEM'
}

interface LinePos {
  line: number
  col: number
}

export class YAMLError extends Error {
  name:
    | 'YAMLReferenceError'
    | 'YAMLSemanticError'
    | 'YAMLSyntaxError'
    | 'YAMLWarning'
  message: string
  source?: CST.Node

  nodeType?: Type
  range?: CST.Range
  linePos?: { start: LinePos; end: LinePos }

  /**
   * Drops `source` and adds `nodeType`, `range` and `linePos`, as well as
   * adding details to `message`. Run automatically for document errors if
   * the `prettyErrors` option is set.
   */
  makePretty(): void
}

export class YAMLReferenceError extends YAMLError {
  name: 'YAMLReferenceError'
}

export class YAMLSemanticError extends YAMLError {
  name: 'YAMLSemanticError'
}

export class YAMLSyntaxError extends YAMLError {
  name: 'YAMLSyntaxError'
}

export class YAMLWarning extends YAMLError {
  name: 'YAMLWarning'
}
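The `YAMLError` hierarchy declared in `util.d.ts` above is a conventional subclass-per-error-name design. A stand-alone plain-JavaScript sketch of that shape (hypothetical constructor signature; the real classes live in the package's `dist` bundle):

```javascript
// Base error carrying the CST node it refers to; subclasses fix `name`.
class YAMLError extends Error {
  constructor(name, source, message) {
    super(message)
    this.name = name      // e.g. 'YAMLSyntaxError'
    this.source = source  // the CST node the error points at, if any
  }
}

class YAMLSyntaxError extends YAMLError {
  constructor(source, message) {
    super('YAMLSyntaxError', source, message)
  }
}

const err = new YAMLSyntaxError(null, 'unexpected end of flow map')
console.log(err.name)                  // 'YAMLSyntaxError'
console.log(err instanceof YAMLError)  // true
```

Because each subclass pins `name`, callers can branch on `err.name` or use `instanceof` against the shared base, matching the union type in the declaration above.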
16
node_modules/cosmiconfig/node_modules/yaml/util.js
generated
vendored
Normal file
@@ -0,0 +1,16 @@
const util = require('./dist/util')

exports.findPair = util.findPair
exports.toJSON = util.toJSON
exports.parseMap = util.parseMap
exports.parseSeq = util.parseSeq

exports.stringifyNumber = util.stringifyNumber
exports.stringifyString = util.stringifyString
exports.Type = util.Type

exports.YAMLError = util.YAMLError
exports.YAMLReferenceError = util.YAMLReferenceError
exports.YAMLSemanticError = util.YAMLSemanticError
exports.YAMLSyntaxError = util.YAMLSyntaxError
exports.YAMLWarning = util.YAMLWarning
Some files were not shown because too many files have changed in this diff.