First commit

README.md (new file, 207 lines)
# LDAP to OAuth2 Bridge

A lightweight LDAP server that authenticates users against an OAuth2/OIDC provider (such as Keycloak). This bridge allows legacy applications that only support LDAP authentication to work with modern OAuth2 identity providers.

## Overview

This bridge presents an LDAP interface to applications while performing OAuth2 Resource Owner Password Credentials (ROPC) flow authentication against your identity provider in the background.

**Use Case**: Integrate applications like TheLounge IRC client, which only support LDAP authentication, with Keycloak or other OAuth2/OIDC providers.

## Features

- Lightweight Node.js implementation
- Simple LDAP bind operation support
- OAuth2 Resource Owner Password Credentials flow
- Configurable base DN
- Easy integration with Keycloak and other OIDC providers
- FreeBSD service support (includes rc.d script)

## Requirements

- Node.js 14+ and npm
- An OAuth2/OIDC provider (Keycloak, Auth0, etc.)
- A client configured with Direct Access Grants enabled

## Installation

### 1. Clone the repository

```bash
git clone https://your-gitea-instance.com/ldap-to-oauth2.git
cd ldap-to-oauth2
```

### 2. Install dependencies

```bash
npm install
```

### 3. Configure OAuth2 Provider

#### For Keycloak:

1. Create a new client in your realm
2. Set **Access Type** to `confidential`
3. Enable **Direct Access Grants** (the "Direct Access Grants Enabled" toggle)
4. Save and retrieve the client secret from the Credentials tab

#### For other providers:

Ensure your OAuth2 provider supports the Resource Owner Password Credentials grant type.

### 4. Configure the bridge

Edit `server.js` and update the configuration section:

```javascript
const LDAP_PORT = 3893;
const KEYCLOAK_URL = 'https://your-keycloak-domain.com';
const KEYCLOAK_REALM = 'your-realm';
const KEYCLOAK_CLIENT_ID = 'your-client-id';
const KEYCLOAK_CLIENT_SECRET = 'your-client-secret';
const BASE_DN = 'dc=example,dc=com';
```
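Hard-coding secrets in `server.js` works for a first test, but reading them from the environment avoids committing credentials. A minimal sketch of that pattern, assuming the same settings as above (the function and variable names here are illustrative, not part of the project):

```javascript
// Read configuration from environment variables, falling back to the
// defaults shown above. Illustrative only; the real server.js may differ.
function loadConfig(env = process.env) {
  return {
    ldapPort: Number(env.LDAP_PORT ?? 3893),
    keycloakUrl: env.KEYCLOAK_URL ?? 'https://your-keycloak-domain.com',
    realm: env.KEYCLOAK_REALM ?? 'your-realm',
    clientId: env.KEYCLOAK_CLIENT_ID ?? 'your-client-id',
    clientSecret: env.KEYCLOAK_CLIENT_SECRET ?? 'your-client-secret',
    baseDn: env.BASE_DN ?? 'dc=example,dc=com',
  };
}

// Values present in the environment win; everything else keeps its default.
console.log(loadConfig({ LDAP_PORT: '10389', BASE_DN: 'dc=corp,dc=net' }));
```

The `??` operator requires Node.js 14+, which matches the requirement stated above.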

### 5. Run the server

```bash
node server.js
```

## Testing

Test authentication using `ldapsearch`:

```bash
ldapsearch -H ldap://localhost:3893 \
  -D "cn=username,dc=example,dc=com" \
  -W \
  -b "dc=example,dc=com"
```

Replace `username` with a valid username from your OAuth2 provider.

## FreeBSD Service Installation

### 1. Copy files to system directories

```bash
# Copy application files
cp -r ldap-to-oauth2 /usr/local/ldap-oauth-bridge

# Copy RC script
cp ldap_oauth_bridge.rc /usr/local/etc/rc.d/ldap_oauth_bridge
chmod +x /usr/local/etc/rc.d/ldap_oauth_bridge
```

### 2. Enable and start service

```bash
echo 'ldap_oauth_bridge_enable="YES"' >> /etc/rc.conf
service ldap_oauth_bridge start
```

### 3. Check service status

```bash
service ldap_oauth_bridge status
sockstat -l | grep 3893
```

## Configuration Examples

### TheLounge IRC Client

In `config.js`:

```javascript
ldap: {
  enable: true,
  url: "ldap://your-server:3893",
  baseDN: "dc=example,dc=com",
  searchDN: {
    rootDN: "cn=admin,dc=example,dc=com",
    rootPassword: "dummy",
    filter: "(|(uid=%s)(cn=%s))"
  }
}
```

### Generic LDAP Application

- **LDAP Server**: `ldap://your-server:3893`
- **Base DN**: `dc=example,dc=com`
- **Bind DN Format**: `cn=USERNAME,dc=example,dc=com`

Users authenticate with their OAuth2 provider username and password.
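Given the bind DN format above, an application (or a quick test script) can build the DN from a bare username. A small illustrative helper, not part of the bridge itself, with a simplified escape for the common RFC 4514 special characters:

```javascript
// Build a bind DN for the bridge from a bare username.
// Escapes the common RFC 4514 special characters in attribute values
// (simplified; leading spaces and '#' are not handled).
function buildBindDn(username, baseDn = 'dc=example,dc=com') {
  const escaped = username.replace(/([\\,+"<>;=#])/g, '\\$1');
  return `cn=${escaped},${baseDn}`;
}

console.log(buildBindDn('alice'));    // cn=alice,dc=example,dc=com
console.log(buildBindDn('smith, j')); // comma escaped as \,
```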

## How It Works

1. Application sends LDAP BIND request with username and password
2. Bridge extracts username from the LDAP DN (e.g., `cn=alice,dc=example,dc=com` → `alice`)
3. Bridge performs OAuth2 password grant flow with the identity provider
4. If OAuth2 authentication succeeds, LDAP BIND succeeds
5. If OAuth2 authentication fails, LDAP BIND fails
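Step 2 is plain string handling. A condensed, self-contained sketch of the extraction (the real `server.js` may differ; this is illustrative only):

```javascript
// Extract the username from a bind DN such as
// "cn=alice,dc=example,dc=com" or "uid=alice,dc=example,dc=com".
// Returns null when the DN does not start with a cn= or uid= RDN.
function extractUsername(dn) {
  const match = /^\s*(?:cn|uid)=([^,]+)/i.exec(dn);
  return match ? match[1].trim() : null;
}

console.log(extractUsername('cn=alice,dc=example,dc=com'));  // alice
console.log(extractUsername('uid=bob,dc=example,dc=com'));   // bob
console.log(extractUsername('ou=people,dc=example,dc=com')); // null
```

The extracted name and the supplied password are then forwarded in the OAuth2 password grant (steps 3-5).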

## Security Considerations

- **Transport Security**: Use LDAPS (LDAP over TLS) in production, or ensure the bridge runs on a trusted network
- **Ports below 1024**: Require root privileges on Unix systems. Consider using port 3893 or higher to avoid running as root
- **Password Grant Flow**: The Resource Owner Password Credentials flow passes passwords through the bridge. Ensure secure communication channels
- **Network Isolation**: Run the bridge on the same network as your applications, or use VPN/tunneling

## Limitations

- Only supports LDAP BIND operations (authentication only)
- Does not provide a full LDAP directory (no user enumeration, group queries, etc.)
- Search operations return empty results
- Requires the OAuth2 provider to support the Resource Owner Password Credentials grant

## Troubleshooting

### Authentication fails

1. Check server logs for detailed error messages
2. Verify OAuth2 client credentials are correct
3. Ensure Direct Access Grants are enabled in your OAuth2 client
4. Test OAuth2 authentication directly using curl:

```bash
curl -X POST https://your-keycloak/realms/your-realm/protocol/openid-connect/token \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=password" \
  -d "client_id=your-client-id" \
  -d "client_secret=your-client-secret" \
  -d "username=testuser" \
  -d "password=testpass"
```
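The same token request can be prepared from Node.js. A sketch that only builds the URL and the `x-www-form-urlencoded` body with the built-in `URLSearchParams` (sending it, e.g. via axios, is left out so the snippet stays self-contained; the endpoint path matches Keycloak's, so adjust it for other providers):

```javascript
// Build the URL and form body for a Resource Owner Password Credentials
// token request against a Keycloak-style token endpoint.
function buildTokenRequest(baseUrl, realm, creds) {
  const url = `${baseUrl}/realms/${realm}/protocol/openid-connect/token`;
  const body = new URLSearchParams({
    grant_type: 'password',
    client_id: creds.clientId,
    client_secret: creds.clientSecret,
    username: creds.username,
    password: creds.password,
  });
  return { url, body: body.toString() };
}

const req = buildTokenRequest('https://your-keycloak', 'your-realm', {
  clientId: 'your-client-id',
  clientSecret: 'your-client-secret',
  username: 'testuser',
  password: 'testpass',
});
console.log(req.url);
console.log(req.body);
```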

### Connection refused

- Verify the server is running: `sockstat -l | grep 3893`
- Check firewall rules
- Ensure the application can reach the bridge server

### DN parsing errors

- Ensure username is formatted as `cn=username,dc=example,dc=com` or `uid=username,dc=example,dc=com`
- Check server logs for DN parsing details

## License

MIT License - See LICENSE file for details

## Contributing

Contributions are welcome! Please submit pull requests or open issues on the repository.

## Author

Created for use with TheLounge IRC client and Keycloak authentication.

## Acknowledgments

- Built with [ldapjs](https://github.com/ldapjs/node-ldapjs)
- OAuth2 requests powered by [axios](https://github.com/axios/axios)
516
node_modules/.package-lock.json
generated
vendored
Normal file
516
node_modules/.package-lock.json
generated
vendored
Normal file
@ -0,0 +1,516 @@
|
||||
{
|
||||
"name": "ldap-oauth-bridge",
|
||||
"version": "1.0.0",
|
||||
"lockfileVersion": 3,
|
||||
"requires": true,
|
||||
"packages": {
|
||||
"node_modules/@ldapjs/asn1": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/asn1/-/asn1-2.0.0.tgz",
|
||||
"integrity": "sha512-G9+DkEOirNgdPmD0I8nu57ygQJKOOgFEMKknEuQvIHbGLwP3ny1mY+OTUYLCbCaGJP4sox5eYgBJRuSUpnAddA==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@ldapjs/attribute": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/attribute/-/attribute-1.0.0.tgz",
|
||||
"integrity": "sha512-ptMl2d/5xJ0q+RgmnqOi3Zgwk/TMJYG7dYMC0Keko+yZU6n+oFM59MjQOUht5pxJeS4FWrImhu/LebX24vJNRQ==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "2.0.0",
|
||||
"@ldapjs/protocol": "^1.2.1",
|
||||
"process-warning": "^2.1.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@ldapjs/change": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/change/-/change-1.0.0.tgz",
|
||||
"integrity": "sha512-EOQNFH1RIku3M1s0OAJOzGfAohuFYXFY4s73wOhRm4KFGhmQQ7MChOh2YtYu9Kwgvuq1B0xKciXVzHCGkB5V+Q==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "2.0.0",
|
||||
"@ldapjs/attribute": "1.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@ldapjs/controls": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/controls/-/controls-2.1.0.tgz",
|
||||
"integrity": "sha512-2pFdD1yRC9V9hXfAWvCCO2RRWK9OdIEcJIos/9cCVP9O4k72BY1bLDQQ4KpUoJnl4y/JoD4iFgM+YWT3IfITWw==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "^1.2.0",
|
||||
"@ldapjs/protocol": "^1.2.1"
|
||||
}
|
||||
},
|
||||
"node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/asn1/-/asn1-1.2.0.tgz",
|
||||
"integrity": "sha512-KX/qQJ2xxzvO2/WOvr1UdQ+8P5dVvuOLk/C9b1bIkXxZss8BaR28njXdPgFCpj5aHaf1t8PmuVnea+N9YG9YMw==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/@ldapjs/dn": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/dn/-/dn-1.1.0.tgz",
|
||||
"integrity": "sha512-R72zH5ZeBj/Fujf/yBu78YzpJjJXG46YHFo5E4W1EqfNpo1UsVPqdLrRMXeKIsJT3x9dJVIfR6OpzgINlKpi0A==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "2.0.0",
|
||||
"process-warning": "^2.1.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@ldapjs/filter": {
|
||||
"version": "2.1.1",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/filter/-/filter-2.1.1.tgz",
|
||||
"integrity": "sha512-TwPK5eEgNdUO1ABPBUQabcZ+h9heDORE4V9WNZqCtYLKc06+6+UAJ3IAbr0L0bYTnkkWC/JEQD2F+zAFsuikNw==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "2.0.0",
|
||||
"@ldapjs/protocol": "^1.2.1",
|
||||
"process-warning": "^2.1.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@ldapjs/messages": {
|
||||
"version": "1.3.0",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/messages/-/messages-1.3.0.tgz",
|
||||
"integrity": "sha512-K7xZpXJ21bj92jS35wtRbdcNrwmxAtPwy4myeh9duy/eR3xQKvikVycbdWVzkYEAVE5Ce520VXNOwCHjomjCZw==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "^2.0.0",
|
||||
"@ldapjs/attribute": "^1.0.0",
|
||||
"@ldapjs/change": "^1.0.0",
|
||||
"@ldapjs/controls": "^2.1.0",
|
||||
"@ldapjs/dn": "^1.1.0",
|
||||
"@ldapjs/filter": "^2.1.1",
|
||||
"@ldapjs/protocol": "^1.2.1",
|
||||
"process-warning": "^2.2.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@ldapjs/protocol": {
|
||||
"version": "1.2.1",
|
||||
"resolved": "https://registry.npmjs.org/@ldapjs/protocol/-/protocol-1.2.1.tgz",
|
||||
"integrity": "sha512-O89xFDLW2gBoZWNXuXpBSM32/KealKCTb3JGtJdtUQc7RjAk8XzrRgyz02cPAwGKwKPxy0ivuC7UP9bmN87egQ==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/abstract-logging": {
|
||||
"version": "2.0.1",
|
||||
"resolved": "https://registry.npmjs.org/abstract-logging/-/abstract-logging-2.0.1.tgz",
|
||||
"integrity": "sha512-2BjRTZxTPvheOvGbBslFSYOUkr+SjPtOnrLP33f+VIWLzezQpZcqVg7ja3L4dBXmzzgwT+a029jRx5PCi3JuiA==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/assert-plus": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/assert-plus/-/assert-plus-1.0.0.tgz",
|
||||
"integrity": "sha512-NfJ4UzBCcQGLDlQq7nHxH+tv3kyZ0hHQqF5BO6J7tNJeP5do1llPr8dZ8zHonfhAu0PHAdMkSo+8o0wxg9lZWw==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=0.8"
|
||||
}
|
||||
},
|
||||
"node_modules/asynckit": {
|
||||
"version": "0.4.0",
|
||||
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
|
||||
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/axios": {
|
||||
"version": "1.12.2",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
|
||||
"integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"follow-redirects": "^1.15.6",
|
||||
"form-data": "^4.0.4",
|
||||
"proxy-from-env": "^1.1.0"
|
||||
}
|
||||
},
|
||||
"node_modules/backoff": {
|
||||
"version": "2.5.0",
|
||||
"resolved": "https://registry.npmjs.org/backoff/-/backoff-2.5.0.tgz",
|
||||
"integrity": "sha512-wC5ihrnUXmR2douXmXLCe5O3zg3GKIyvRi/hi58a/XyRxVI+3/yM0PYueQOZXPXQ9pxBislYkw+sF9b7C/RuMA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"precond": "0.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.6"
|
||||
}
|
||||
},
|
||||
"node_modules/call-bind-apply-helpers": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
|
||||
"integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"function-bind": "^1.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/combined-stream": {
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"delayed-stream": "~1.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.8"
|
||||
}
|
||||
},
|
||||
"node_modules/core-util-is": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.2.tgz",
|
||||
"integrity": "sha512-3lqz5YjWTYnW6dlDa5TLaTCcShfar1e40rmcJVwCBJC6mWlFuj0eCHIElmG1g5kyuJ/GD+8Wn4FFCcz4gJPfaQ==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/delayed-stream": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
|
||||
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=0.4.0"
|
||||
}
|
||||
},
|
||||
"node_modules/dunder-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
|
||||
"integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"gopd": "^1.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-define-property": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
|
||||
"integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-errors": {
|
||||
"version": "1.3.0",
|
||||
"resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
|
||||
"integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-object-atoms": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
|
||||
"integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-set-tostringtag": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
|
||||
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"get-intrinsic": "^1.2.6",
|
||||
"has-tostringtag": "^1.0.2",
|
||||
"hasown": "^2.0.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/extsprintf": {
|
||||
"version": "1.4.1",
|
||||
"resolved": "https://registry.npmjs.org/extsprintf/-/extsprintf-1.4.1.tgz",
|
||||
"integrity": "sha512-Wrk35e8ydCKDj/ArClo1VrPVmN8zph5V4AtHwIuHhvMXsKf73UT3BOD+azBIW+3wOJ4FhEH7zyaJCFvChjYvMA==",
|
||||
"engines": [
|
||||
"node >=0.6.0"
|
||||
],
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/follow-redirects": {
|
||||
"version": "1.15.11",
|
||||
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
|
||||
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
|
||||
"funding": [
|
||||
{
|
||||
"type": "individual",
|
||||
"url": "https://github.com/sponsors/RubenVerborgh"
|
||||
}
|
||||
],
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=4.0"
|
||||
},
|
||||
"peerDependenciesMeta": {
|
||||
"debug": {
|
||||
"optional": true
|
||||
}
|
||||
}
|
||||
},
|
||||
"node_modules/form-data": {
|
||||
"version": "4.0.4",
|
||||
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
|
||||
"integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"asynckit": "^0.4.0",
|
||||
"combined-stream": "^1.0.8",
|
||||
"es-set-tostringtag": "^2.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"mime-types": "^2.1.12"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 6"
|
||||
}
|
||||
},
|
||||
"node_modules/function-bind": {
|
||||
"version": "1.1.2",
|
||||
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
|
||||
"integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
|
||||
"license": "MIT",
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/get-intrinsic": {
|
||||
"version": "1.3.0",
|
||||
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
|
||||
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.2",
|
||||
"es-define-property": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"es-object-atoms": "^1.1.1",
|
||||
"function-bind": "^1.1.2",
|
||||
"get-proto": "^1.0.1",
|
||||
"gopd": "^1.2.0",
|
||||
"has-symbols": "^1.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"math-intrinsics": "^1.1.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/get-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
|
||||
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"dunder-proto": "^1.0.1",
|
||||
"es-object-atoms": "^1.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/gopd": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
|
||||
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/has-symbols": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
|
||||
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/has-tostringtag": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
|
||||
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"has-symbols": "^1.0.3"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/hasown": {
|
||||
"version": "2.0.2",
|
||||
"resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
|
||||
"integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"function-bind": "^1.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/ldapjs": {
|
||||
"version": "3.0.7",
|
||||
"resolved": "https://registry.npmjs.org/ldapjs/-/ldapjs-3.0.7.tgz",
|
||||
"integrity": "sha512-1ky+WrN+4CFMuoekUOv7Y1037XWdjKpu0xAPwSP+9KdvmV9PG+qOKlssDV6a+U32apwxdD3is/BZcWOYzN30cg==",
|
||||
"deprecated": "This package has been decomissioned. See https://github.com/ldapjs/node-ldapjs/blob/8ffd0bc9c149088a10ec4c1ec6a18450f76ad05d/README.md",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@ldapjs/asn1": "^2.0.0",
|
||||
"@ldapjs/attribute": "^1.0.0",
|
||||
"@ldapjs/change": "^1.0.0",
|
||||
"@ldapjs/controls": "^2.1.0",
|
||||
"@ldapjs/dn": "^1.1.0",
|
||||
"@ldapjs/filter": "^2.1.1",
|
||||
"@ldapjs/messages": "^1.3.0",
|
||||
"@ldapjs/protocol": "^1.2.1",
|
||||
"abstract-logging": "^2.0.1",
|
||||
"assert-plus": "^1.0.0",
|
||||
"backoff": "^2.5.0",
|
||||
"once": "^1.4.0",
|
||||
"vasync": "^2.2.1",
|
||||
"verror": "^1.10.1"
|
||||
}
|
||||
},
|
||||
"node_modules/math-intrinsics": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
|
||||
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/mime-db": {
|
||||
"version": "1.52.0",
|
||||
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
|
||||
"integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.6"
|
||||
}
|
||||
},
|
||||
"node_modules/mime-types": {
|
||||
"version": "2.1.35",
|
||||
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
|
||||
"integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"mime-db": "1.52.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.6"
|
||||
}
|
||||
},
|
||||
"node_modules/once": {
|
||||
"version": "1.4.0",
|
||||
"resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
|
||||
"integrity": "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"wrappy": "1"
|
||||
}
|
||||
},
|
||||
"node_modules/precond": {
|
||||
"version": "0.2.3",
|
||||
"resolved": "https://registry.npmjs.org/precond/-/precond-0.2.3.tgz",
|
||||
"integrity": "sha512-QCYG84SgGyGzqJ/vlMsxeXd/pgL/I94ixdNFyh1PusWmTCyVfPJjZ1K1jvHtsbfnXQs2TSkEP2fR7QiMZAnKFQ==",
|
||||
"engines": {
|
||||
"node": ">= 0.6"
|
||||
}
|
||||
},
|
||||
"node_modules/process-warning": {
|
||||
"version": "2.3.2",
|
||||
"resolved": "https://registry.npmjs.org/process-warning/-/process-warning-2.3.2.tgz",
|
||||
"integrity": "sha512-n9wh8tvBe5sFmsqlg+XQhaQLumwpqoAUruLwjCopgTmUBjJ/fjtBsJzKleCaIGBOMXYEhp1YfKl4d7rJ5ZKJGA==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/proxy-from-env": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
|
||||
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/vasync": {
|
||||
"version": "2.2.1",
|
||||
"resolved": "https://registry.npmjs.org/vasync/-/vasync-2.2.1.tgz",
|
||||
"integrity": "sha512-Hq72JaTpcTFdWiNA4Y22Amej2GH3BFmBaKPPlDZ4/oC8HNn2ISHLkFrJU4Ds8R3jcUi7oo5Y9jcMHKjES+N9wQ==",
|
||||
"engines": [
|
||||
"node >=0.6.0"
|
||||
],
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"verror": "1.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/vasync/node_modules/verror": {
|
||||
"version": "1.10.0",
|
||||
"resolved": "https://registry.npmjs.org/verror/-/verror-1.10.0.tgz",
|
||||
"integrity": "sha512-ZZKSmDAEFOijERBLkmYfJ+vmk3w+7hOLYDNkRCuRuMJGEmqYNCNLyBBFwWKVMhfwaEF3WOd0Zlw86U/WC/+nYw==",
|
||||
"engines": [
|
||||
"node >=0.6.0"
|
||||
],
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"assert-plus": "^1.0.0",
|
||||
"core-util-is": "1.0.2",
|
||||
"extsprintf": "^1.2.0"
|
||||
}
|
||||
},
|
||||
"node_modules/verror": {
|
||||
"version": "1.10.1",
|
||||
"resolved": "https://registry.npmjs.org/verror/-/verror-1.10.1.tgz",
|
||||
"integrity": "sha512-veufcmxri4e3XSrT0xwfUR7kguIkaxBeosDg00yDWhk49wdwkSUrvvsm7nc75e1PUyvIeZj6nS8VQRYz2/S4Xg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"assert-plus": "^1.0.0",
|
||||
"core-util-is": "1.0.2",
|
||||
"extsprintf": "^1.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.6.0"
|
||||
}
|
||||
},
|
||||
"node_modules/wrappy": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz",
|
||||
"integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==",
|
||||
"license": "ISC"
|
||||
}
|
||||
}
|
||||
}
|
||||
9
node_modules/@ldapjs/asn1/.eslintrc
generated
vendored
Normal file
9
node_modules/@ldapjs/asn1/.eslintrc
generated
vendored
Normal file
@ -0,0 +1,9 @@
|
||||
{
|
||||
"parserOptions": {
|
||||
"ecmaVersion": "latest"
|
||||
},
|
||||
|
||||
"extends": [
|
||||
"standard"
|
||||
]
|
||||
}
|
||||
10
node_modules/@ldapjs/asn1/.github/workflows/main.yml
generated
vendored
Normal file
10
node_modules/@ldapjs/asn1/.github/workflows/main.yml
generated
vendored
Normal file
@ -0,0 +1,10 @@
|
||||
name: "CI"
|
||||
on:
|
||||
pull_request:
|
||||
push:
|
||||
branches:
|
||||
- master
|
||||
|
||||
jobs:
|
||||
call-core-ci:
|
||||
uses: ldapjs/.github/.github/workflows/node-ci.yml@main
|
||||
6
node_modules/@ldapjs/asn1/.taprc.yml
generated
vendored
Normal file
6
node_modules/@ldapjs/asn1/.taprc.yml
generated
vendored
Normal file
@ -0,0 +1,6 @@
|
||||
reporter: terse
|
||||
coverage-map: coverage-map.js
|
||||
|
||||
files:
|
||||
- 'index.test.js'
|
||||
- 'lib/**/*.test.js'
|
||||
9
node_modules/@ldapjs/asn1/CONTRIBUTING.md
generated
vendored
Normal file
9
node_modules/@ldapjs/asn1/CONTRIBUTING.md
generated
vendored
Normal file
@ -0,0 +1,9 @@
|
||||
# Contributing
|
||||
|
||||
This repository uses GitHub pull requests for code review.
|
||||
|
||||
See [this primer](https://jrfom.com/posts/2017/03/08/a-primer-on-contributing-to-projects-with-git/)
|
||||
for instructions on how to make contributions to the project.
|
||||
|
||||
If you're changing something non-trivial or user-facing, you may want to submit
|
||||
an issue first.
|
||||
22
node_modules/@ldapjs/asn1/LICENSE
generated
vendored
Normal file
22
node_modules/@ldapjs/asn1/LICENSE
generated
vendored
Normal file
@ -0,0 +1,22 @@
|
||||
The MIT License (MIT)
|
||||
|
||||
Copyright (c) 2011 Mark Cavage, All rights reserved.
|
||||
Copyright (c) 2022 The LDAPJS Collaborators.
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||
THE SOFTWARE
|
||||
43
node_modules/@ldapjs/asn1/README.md
generated
vendored
Normal file
43
node_modules/@ldapjs/asn1/README.md
generated
vendored
Normal file
@ -0,0 +1,43 @@
|
||||
# `@ldapjs/asn1`

`@ldapjs/asn1` is a library for encoding and decoding ASN.1 datatypes in pure
JS. Currently BER encoding is supported.

### Decoding

The following reads an ASN.1 sequence with a boolean.

```js
const { BerReader, BerTypes } = require('@ldapjs/asn1')
const reader = new BerReader(Buffer.from([0x30, 0x03, 0x01, 0x01, 0xff]))

reader.readSequence()
console.log('Sequence len: ' + reader.length)
if (reader.peek() === BerTypes.Boolean)
  console.log(reader.readBoolean())
```

### Encoding

The following generates the same payload as above.

```js
const { BerWriter } = require('@ldapjs/asn1');
const writer = new BerWriter();

writer.startSequence();
writer.writeBoolean(true);
writer.endSequence();

console.log(writer.buffer);
```

## Installation

```sh
npm install @ldapjs/asn1
```

## Bugs

See <https://github.com/ldapjs/asn1/issues>.
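(Editorial aside, not part of the vendored README.) The examples above decode and encode a BER TLV tuple: a tag byte, a length byte, then that many value bytes. A dependency-free sketch of the same byte-level walk over the README's payload, assuming plain Node.js:

```javascript
// Minimal TLV walk over the README's payload: a sequence (0x30, length 0x03)
// wrapping a boolean (0x01, length 0x01, value 0xff => true).
const payload = Buffer.from([0x30, 0x03, 0x01, 0x01, 0xff])

function readTlv (buf, offset) {
  const tag = buf[offset]
  const length = buf[offset + 1] // short-form (single byte) length only
  const value = buf.subarray(offset + 2, offset + 2 + length)
  return { tag, length, value, next: offset + 2 + length }
}

const seq = readTlv(payload, 0) // outer sequence; its value holds the inner TLV
const bool = readTlv(seq.value, 0) // inner boolean

console.log(bool.value[0] !== 0) // -> true
```

This hand-rolled `readTlv` is illustrative only; the library's `BerReader` additionally handles multi-byte (long form) lengths and tag validation.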
3
node_modules/@ldapjs/asn1/coverage-map.js
generated
vendored
Normal file
@ -0,0 +1,3 @@
'use strict'

module.exports = testFile => testFile.replace(/\.test\.js$/, '.js')
13
node_modules/@ldapjs/asn1/index.js
generated
vendored
Normal file
@ -0,0 +1,13 @@
'use strict'

const BerReader = require('./lib/ber/reader')
const BerWriter = require('./lib/ber/writer')
const BerTypes = require('./lib/ber/types')
const bufferToHexDump = require('./lib/buffer-to-hex-dump')

module.exports = {
  BerReader,
  BerTypes,
  BerWriter,
  bufferToHexDump
}
28
node_modules/@ldapjs/asn1/index.test.js
generated
vendored
Normal file
@ -0,0 +1,28 @@
'use strict'

const tap = require('tap')
const asn1 = require('./index')

tap.test('exports BerReader', async t => {
  const { BerReader } = asn1
  t.ok(BerReader)

  const reader = new BerReader(Buffer.from([0x00]))
  t.type(reader, BerReader)
  t.equal(Object.prototype.toString.call(reader), '[object BerReader]')
})

tap.test('exports BerTypes', async t => {
  const { BerTypes } = asn1
  t.type(BerTypes, Object)
  t.equal(BerTypes.LDAPSequence, 0x30)
})

tap.test('exports BerWriter', async t => {
  const { BerWriter } = asn1
  t.ok(BerWriter)

  const writer = new BerWriter()
  t.type(writer, BerWriter)
  t.equal(Object.prototype.toString.call(writer), '[object BerWriter]')
})
24
node_modules/@ldapjs/asn1/lib/ber/index.js
generated
vendored
Normal file
@ -0,0 +1,24 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

const errors = require('./errors')
const types = require('./types')

const Reader = require('./reader')
const Writer = require('./writer')

// --- Exports

module.exports = {

  Reader,

  Writer

}

for (const t in types) {
  if (Object.prototype.hasOwnProperty.call(types, t)) { module.exports[t] = types[t] }
}
for (const e in errors) {
  if (Object.prototype.hasOwnProperty.call(errors, e)) { module.exports[e] = errors[e] }
}
502
node_modules/@ldapjs/asn1/lib/ber/reader.js
generated
vendored
Normal file
@ -0,0 +1,502 @@
'use strict'

const types = require('./types')
const bufferToHexDump = require('../buffer-to-hex-dump')

/**
 * Given a buffer of ASN.1 data encoded according to Basic Encoding Rules (BER),
 * the reader provides methods for iterating that data and decoding it into
 * regular JavaScript types.
 */
class BerReader {
  /**
   * The source buffer as it was passed in when creating the instance.
   *
   * @type {Buffer}
   */
  #buffer

  /**
   * The total bytes in the backing buffer.
   *
   * @type {number}
   */
  #size

  /**
   * An ASN.1 field consists of a tag, a length, and a value. This property
   * records the length of the current field.
   *
   * @type {number}
   */
  #currentFieldLength = 0

  /**
   * Records the offset in the buffer where the most recent {@link readSequence}
   * was invoked. This is used to facilitate slicing of whole sequences from
   * the buffer as a new {@link BerReader} instance.
   *
   * @type {number}
   */
  #currentSequenceStart = 0

  /**
   * As the BER buffer is read, this property records the current position
   * in the buffer.
   *
   * @type {number}
   */
  #offset = 0

  /**
   * @param {Buffer} buffer
   */
  constructor (buffer) {
    if (Buffer.isBuffer(buffer) === false) {
      throw TypeError('Must supply a Buffer instance to read.')
    }

    this.#buffer = buffer.subarray(0)
    this.#size = this.#buffer.length
  }

  get [Symbol.toStringTag] () { return 'BerReader' }

  /**
   * Get a buffer that represents the underlying data buffer.
   *
   * @type {Buffer}
   */
  get buffer () {
    return this.#buffer.subarray(0)
  }

  /**
   * The length of the current field being read.
   *
   * @type {number}
   */
  get length () {
    return this.#currentFieldLength
  }

  /**
   * Current read position in the underlying data buffer.
   *
   * @type {number}
   */
  get offset () {
    return this.#offset
  }

  /**
   * The number of bytes remaining in the backing buffer that have not
   * been read.
   *
   * @type {number}
   */
  get remain () {
    return this.#size - this.#offset
  }

  /**
   * Read the next byte in the buffer without advancing the offset.
   *
   * @return {number | null} The next byte or null if not enough data.
   */
  peek () {
    return this.readByte(true)
  }

  /**
   * Reads a boolean from the current offset and advances the offset.
   *
   * @param {number} [tag] The tag number that is expected to be read.
   *
   * @returns {boolean} True if the tag value represents `true`, otherwise
   * `false`.
   *
   * @throws When there is an error reading the tag.
   */
  readBoolean (tag = types.Boolean) {
    const intBuffer = this.readTag(tag)
    this.#offset += intBuffer.length
    const int = parseIntegerBuffer(intBuffer)

    return (int !== 0)
  }

  /**
   * Reads a single byte and advances offset; you can pass in `true` to make
   * this a "peek" operation (i.e. get the byte, but don't advance the offset).
   *
   * @param {boolean} [peek=false] `true` means don't move the offset.
   * @returns {number | null} The next byte, `null` if not enough data.
   */
  readByte (peek = false) {
    if (this.#size - this.#offset < 1) {
      return null
    }

    const byte = this.#buffer[this.#offset] & 0xff

    if (peek !== true) {
      this.#offset += 1
    }

    return byte
  }

  /**
   * Reads an enumeration (integer) from the current offset and advances the
   * offset.
   *
   * @returns {number} The integer represented by the next sequence of bytes
   * in the buffer from the current offset. The current offset must be at a
   * byte whose value is equal to the ASN.1 enumeration tag.
   *
   * @throws When there is an error reading the tag.
   */
  readEnumeration () {
    const intBuffer = this.readTag(types.Enumeration)
    this.#offset += intBuffer.length

    return parseIntegerBuffer(intBuffer)
  }

  /**
   * Reads an integer from the current offset and advances the offset.
   *
   * @param {number} [tag] The tag number that is expected to be read.
   *
   * @returns {number} The integer represented by the next sequence of bytes
   * in the buffer from the current offset. The current offset must be at a
   * byte whose value is equal to the ASN.1 integer tag.
   *
   * @throws When there is an error reading the tag.
   */
  readInt (tag = types.Integer) {
    const intBuffer = this.readTag(tag)
    this.#offset += intBuffer.length

    return parseIntegerBuffer(intBuffer)
  }

  /**
   * Reads a length value from the BER buffer at the given offset. This
   * method is not really meant to be called directly, as callers have to
   * manipulate the internal buffer afterwards.
   *
   * This method does not advance the reader offset.
   *
   * As a result of this method, the `.length` property can be read for the
   * current field until another method invokes `readLength`.
   *
   * Note: we only support up to 4 bytes to describe the length of a value.
   *
   * @param {number} [offset] Read a length value starting at the specified
   * position in the underlying buffer.
   *
   * @return {number | null} The position the buffer should be advanced to in
   * order for the reader to be at the start of the value for the field. See
   * {@link setOffset}. If the offset, or length, exceeds the size of the
   * underlying buffer, `null` will be returned.
   *
   * @throws When an unsupported length value is encountered.
   */
  readLength (offset) {
    if (offset === undefined) { offset = this.#offset }

    if (offset >= this.#size) { return null }

    let lengthByte = this.#buffer[offset++] & 0xff
    // TODO: we are commenting this out because it seems to be unreachable.
    // It is not clear to me how we can ever check `lenB === null` as `null`
    // is a primitive type, and seemingly cannot be represented by a byte.
    // If we find that removal of this line does not affect the larger suite
    // of ldapjs tests, we should just completely remove it from the code.
    /* if (lenB === null) { return null } */

    if ((lengthByte & 0x80) === 0x80) {
      lengthByte &= 0x7f

      // https://www.rfc-editor.org/rfc/rfc4511.html#section-5.1 prohibits
      // indefinite form (0x80).
      if (lengthByte === 0) { throw Error('Indefinite length not supported.') }

      // We only support up to 4 bytes to describe encoding length. So the only
      // valid indicators are 0x81, 0x82, 0x83, and 0x84.
      if (lengthByte > 4) { throw Error('Encoding too long.') }

      if (this.#size - offset < lengthByte) { return null }

      this.#currentFieldLength = 0
      for (let i = 0; i < lengthByte; i++) {
        this.#currentFieldLength = (this.#currentFieldLength << 8) +
          (this.#buffer[offset++] & 0xff)
      }
    } else {
      // Wasn't a variable length
      this.#currentFieldLength = lengthByte
    }

    return offset
  }

  /**
   * At the current offset, read the next tag, length, and value as an
   * object identifier (OID) and return the OID string.
   *
   * @param {number} [tag] The tag number that is expected to be read.
   *
   * @returns {string | null} Will return `null` if the buffer is an invalid
   * length. Otherwise, returns the OID as a string.
   */
  readOID (tag = types.OID) {
    // See https://web.archive.org/web/20221008202056/https://learn.microsoft.com/en-us/windows/win32/seccertenroll/about-object-identifier?redirectedfrom=MSDN
    const oidBuffer = this.readString(tag, true)
    if (oidBuffer === null) { return null }

    const values = []
    let value = 0

    for (let i = 0; i < oidBuffer.length; i++) {
      const byte = oidBuffer[i] & 0xff

      value <<= 7
      value += byte & 0x7f
      if ((byte & 0x80) === 0) {
        values.push(value)
        value = 0
      }
    }

    value = values.shift()
    values.unshift(value % 40)
    values.unshift((value / 40) >> 0)

    return values.join('.')
  }

  /**
   * Get a new {@link Buffer} instance that represents the full set of bytes
   * for a BER representation of a specified tag. For example, this is useful
   * when constructing objects from an incoming LDAP message and the object
   * constructor can read a BER representation of itself to create a new
   * instance, e.g. when reading the filter section of a "search request"
   * message.
   *
   * @param {number} tag The expected tag that starts the TLV series of bytes.
   * @param {boolean} [advanceOffset=true] Indicates if the instance's internal
   * offset should be advanced or not after reading the buffer.
   *
   * @returns {Buffer|null} If there is a problem reading the buffer, e.g.
   * the number of bytes indicated by the length do not exist in the value, then
   * `null` will be returned. Otherwise, a new {@link Buffer} of bytes that
   * represents a full TLV.
   */
  readRawBuffer (tag, advanceOffset = true) {
    if (Number.isInteger(tag) === false) {
      throw Error('must specify an integer tag')
    }

    const foundTag = this.peek()
    if (foundTag !== tag) {
      const expected = tag.toString(16).padStart(2, '0')
      const found = foundTag.toString(16).padStart(2, '0')
      throw Error(`Expected 0x${expected}: got 0x${found}`)
    }

    const currentOffset = this.#offset
    const valueOffset = this.readLength(currentOffset + 1)
    if (valueOffset === null) { return null }
    const valueBytesLength = this.length

    const numTagAndLengthBytes = valueOffset - currentOffset

    // Buffer.subarray is not inclusive. We need to account for the
    // tag and length bytes.
    const endPos = currentOffset + valueBytesLength + numTagAndLengthBytes
    if (endPos > this.buffer.byteLength) {
      return null
    }
    const buffer = this.buffer.subarray(currentOffset, endPos)
    if (advanceOffset === true) {
      this.setOffset(currentOffset + (valueBytesLength + numTagAndLengthBytes))
    }

    return buffer
  }

  /**
   * At the current buffer offset, read the next tag as a sequence tag, and
   * advance the offset to the position of the tag of the first item in the
   * sequence.
   *
   * @param {number} [tag] The tag number that is expected to be read.
   *
   * @returns {number|null} The read sequence tag value. Should match the
   * function input parameter value.
   *
   * @throws If the `tag` does not match or if there is an error reading
   * the length of the sequence.
   */
  readSequence (tag) {
    const foundTag = this.peek()
    if (tag !== undefined && tag !== foundTag) {
      const expected = tag.toString(16).padStart(2, '0')
      const found = foundTag.toString(16).padStart(2, '0')
      throw Error(`Expected 0x${expected}: got 0x${found}`)
    }

    this.#currentSequenceStart = this.#offset
    const valueOffset = this.readLength(this.#offset + 1) // stored in `length`
    if (valueOffset === null) { return null }

    this.#offset = valueOffset
    return foundTag
  }

  /**
   * At the current buffer offset, read the next value as a string and advance
   * the offset.
   *
   * @param {number} [tag] The tag number that is expected to be read. Should
   * be `ASN1.String`.
   * @param {boolean} [asBuffer=false] When true, the raw buffer will be
   * returned. Otherwise, a native string.
   *
   * @returns {string | Buffer | null} Will return `null` if the buffer is
   * malformed.
   *
   * @throws If there is a problem reading the length.
   */
  readString (tag = types.OctetString, asBuffer = false) {
    const tagByte = this.peek()

    if (tagByte !== tag) {
      const expected = tag.toString(16).padStart(2, '0')
      const found = tagByte.toString(16).padStart(2, '0')
      throw Error(`Expected 0x${expected}: got 0x${found}`)
    }

    const valueOffset = this.readLength(this.#offset + 1) // stored in `length`
    if (valueOffset === null) { return null }
    if (this.length > this.#size - valueOffset) { return null }

    this.#offset = valueOffset

    if (this.length === 0) { return asBuffer ? Buffer.alloc(0) : '' }

    const str = this.#buffer.subarray(this.#offset, this.#offset + this.length)
    this.#offset += this.length

    return asBuffer ? str : str.toString('utf8')
  }

  /**
   * At the current buffer offset, read the next set of bytes represented
   * by the given tag, and return the resulting buffer. For example, if the
   * BER represents a sequence with a string "foo", i.e.
   * `[0x30, 0x05, 0x04, 0x03, 0x66, 0x6f, 0x6f]`, and the current offset is
   * `0`, then the result of `readTag(0x30)` is the buffer
   * `[0x04, 0x03, 0x66, 0x6f, 0x6f]`.
   *
   * @param {number} tag The tag number that is expected to be read.
   *
   * @returns {Buffer | null} The buffer representing the tag value, or null if
   * the buffer is in some way malformed.
   *
   * @throws When there is an error interpreting the buffer, or the buffer
   * is not formed correctly.
   */
  readTag (tag) {
    if (tag == null) {
      throw Error('Must supply an ASN.1 tag to read.')
    }

    const byte = this.peek()
    if (byte !== tag) {
      const tagString = tag.toString(16).padStart(2, '0')
      const byteString = byte.toString(16).padStart(2, '0')
      throw Error(`Expected 0x${tagString}: got 0x${byteString}`)
    }

    const fieldOffset = this.readLength(this.#offset + 1) // stored in `length`
    if (fieldOffset === null) { return null }

    if (this.length > this.#size - fieldOffset) { return null }
    this.#offset = fieldOffset

    return this.#buffer.subarray(this.#offset, this.#offset + this.length)
  }

  /**
   * Returns the current sequence as a new {@link BerReader} instance. This
   * method relies on {@link readSequence} having been invoked first. If it has
   * not been invoked, the returned reader will represent an undefined portion
   * of the underlying buffer.
   *
   * @returns {BerReader}
   */
  sequenceToReader () {
    // Represents the number of bytes that constitute the "length" portion
    // of the TLV tuple.
    const lengthValueLength = this.#offset - this.#currentSequenceStart
    const buffer = this.#buffer.subarray(
      this.#currentSequenceStart,
      this.#currentSequenceStart + (lengthValueLength + this.#currentFieldLength)
    )
    return new BerReader(buffer)
  }

  /**
   * Set the internal offset to a given position in the underlying buffer.
   * This method is to support manual advancement of the reader.
   *
   * @param {number} position
   *
   * @throws If the given `position` is not an integer.
   */
  setOffset (position) {
    if (Number.isInteger(position) === false) {
      throw Error('Must supply an integer position.')
    }
    this.#offset = position
  }

  /**
   * @param {HexDumpParams} params The `buffer` parameter will be ignored.
   *
   * @see bufferToHexDump
   */
  toHexDump (params) {
    bufferToHexDump({
      ...params,
      buffer: this.buffer
    })
  }
}

/**
 * Given a buffer that represents an integer TLV, parse it and return it
 * as a decimal value. This accounts for signedness.
 *
 * @param {Buffer} integerBuffer
 *
 * @returns {number}
 */
function parseIntegerBuffer (integerBuffer) {
  let value = 0
  let i
  for (i = 0; i < integerBuffer.length; i++) {
    value <<= 8
    value |= (integerBuffer[i] & 0xff)
  }

  if ((integerBuffer[0] & 0x80) === 0x80 && i !== 4) { value -= (1 << (i * 8)) }

  return value >> 0
}

module.exports = BerReader
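(Editorial aside, not part of the vendored file.) `readLength` above implements BER's two length forms: short form, where a single byte up to 0x7f is the length itself, and long form, where bit 7 is set and the low bits count how many length bytes follow (capped at four by this library). A dependency-free sketch of that rule, assuming plain Node.js:

```javascript
// BER length decoding sketch: short form (<= 0x7f) is the length itself;
// long form sets bit 7 and uses the low bits as a count of length bytes.
function decodeLength (buf, offset) {
  let lengthByte = buf[offset++] & 0xff
  if ((lengthByte & 0x80) === 0) {
    return { length: lengthByte, next: offset } // short form
  }
  lengthByte &= 0x7f
  if (lengthByte === 0) throw Error('Indefinite length not supported.')
  if (lengthByte > 4) throw Error('Encoding too long.') // library's 4-byte cap
  let length = 0
  for (let i = 0; i < lengthByte; i++) {
    length = (length << 8) + (buf[offset++] & 0xff)
  }
  return { length, next: offset }
}

console.log(decodeLength(Buffer.from([0x05]), 0).length) // -> 5
console.log(decodeLength(Buffer.from([0x82, 0x01, 0xb3]), 0).length) // -> 435
```

The `0x82, 0x01, 0xb3` case mirrors the 435-byte string exercised by the test file that follows.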
671
node_modules/@ldapjs/asn1/lib/ber/reader.test.js
generated
vendored
Normal file
@ -0,0 +1,671 @@
'use strict'

const tap = require('tap')
const { Writable } = require('stream')
const BerReader = require('./reader')

// A sequence (0x30), 5 bytes (0x05) long, which consists of
// a string (0x04), 3 bytes (0x03) long, representing "foo".
const fooSequence = [0x30, 0x05, 0x04, 0x03, 0x66, 0x6f, 0x6f]

// ClientID certificate request example from
// https://web.archive.org/web/20221008202056/https://learn.microsoft.com/en-us/windows/win32/seccertenroll/about-object-identifier?redirectedfrom=MSDN
const microsoftOID = [
  0x06, 0x09, // OID; 9 bytes
  0x2b, 0x06, 0x01, 0x04, 0x01, 0x82, 0x37, 0x15, 0x14, // 1.3.6.1.4.1.311.21.20
  0x31, 0x4a, // Set; 4a bytes
  0x30, 0x48, // Sequence; 48 bytes
  0x02, 0x01, 0x09, // Integer; 1 bytes; 9
  0x0c, 0x23, // UTF8 String; 23 bytes
  0x76, 0x69, 0x63, 0x68, 0x33, 0x64, 0x2e, 0x6a, // vich3d.j
  0x64, 0x6f, 0x6d, 0x63, 0x73, 0x63, 0x2e, 0x6e, // domcsc.n
  0x74, 0x74, 0x65, 0x73, 0x74, 0x2e, 0x6d, 0x69, // ttest.mi
  0x63, 0x72, 0x6f, 0x73, 0x6f, 0x66, 0x74, 0x2e, // crosoft.
  0x63, 0x6f, 0x6d, // com
  0x0c, 0x15, // UTF8 String; 15 bytes
  0x4a, 0x44, 0x4f, 0x4d, 0x43, 0x53, 0x43, 0x5c, // JDOMCSC\
  0x61, 0x64, 0x6d, 0x69, 0x6e, 0x69, 0x73, 0x74, // administ
  0x72, 0x61, 0x74, 0x6f, 0x72, // rator
  0x0c, 0x07, // UTF8 String; 7 bytes
  0x63, 0x65, 0x72, 0x74, 0x72, 0x65, 0x71 // certreq
]

tap.test('must supply a buffer', async t => {
  const expected = TypeError('Must supply a Buffer instance to read.')
  t.throws(
    () => new BerReader(),
    expected
  )
  t.throws(
    () => new BerReader(''),
    expected
  )
})

tap.test('has toStringTag', async t => {
  const reader = new BerReader(Buffer.from('foo'))
  t.equal(Object.prototype.toString.call(reader), '[object BerReader]')
})

tap.test('buffer property returns buffer', async t => {
  const fooBuffer = Buffer.from(fooSequence)
  const reader = new BerReader(fooBuffer)

  t.equal(
    fooBuffer.compare(reader.buffer),
    0
  )
})

tap.test('peek reads but does not advance', async t => {
  const reader = new BerReader(Buffer.from([0xde]))
  const byte = reader.peek()
  t.equal(byte, 0xde)
  t.equal(reader.offset, 0)
})

tap.test('readBoolean', t => {
  t.test('read boolean true', async t => {
    const reader = new BerReader(Buffer.from([0x01, 0x01, 0xff]))
    t.equal(reader.readBoolean(), true, 'wrong value')
    t.equal(reader.length, 0x01, 'wrong length')
  })

  t.test('read boolean false', async t => {
    const reader = new BerReader(Buffer.from([0x01, 0x01, 0x00]))
    t.equal(reader.readBoolean(), false, 'wrong value')
    t.equal(reader.length, 0x01, 'wrong length')
  })

  t.end()
})

tap.test('readByte', t => {
  t.test('reads a byte and advances offset', async t => {
    const reader = new BerReader(Buffer.from([0xde]))
    t.equal(reader.offset, 0)
    t.equal(reader.readByte(), 0xde)
    t.equal(reader.offset, 1)
  })

  t.test('returns null if buffer exceeded', async t => {
    const reader = new BerReader(Buffer.from([0xde]))
    reader.readByte()
    t.equal(reader.readByte(), null)
  })

  t.test('peek does not advance offset', async t => {
    const reader = new BerReader(Buffer.from([0xde]))
    const byte = reader.readByte(true)
    t.equal(byte, 0xde)
    t.equal(reader.offset, 0)
  })

  t.end()
})

tap.test('readEnumeration', t => {
  t.test('read enumeration', async t => {
    const reader = new BerReader(Buffer.from([0x0a, 0x01, 0x20]))
    t.equal(reader.readEnumeration(), 0x20, 'wrong value')
    t.equal(reader.length, 0x01, 'wrong length')
  })

  t.end()
})

tap.test('readInt', t => {
  t.test('read 1 byte int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x01, 0x03]))
    t.equal(reader.readInt(), 0x03, 'wrong value')
    t.equal(reader.length, 0x01, 'wrong length')
  })

  t.test('read 2 byte int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x02, 0x7e, 0xde]))
    t.equal(reader.readInt(), 0x7ede, 'wrong value')
    t.equal(reader.length, 0x02, 'wrong length')
  })

  t.test('read 3 byte int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x03, 0x7e, 0xde, 0x03]))
    t.equal(reader.readInt(), 0x7ede03, 'wrong value')
    t.equal(reader.length, 0x03, 'wrong length')
  })

  t.test('read 4 byte int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x04, 0x7e, 0xde, 0x03, 0x01]))
    t.equal(reader.readInt(), 0x7ede0301, 'wrong value')
    t.equal(reader.length, 0x04, 'wrong length')
  })

  t.test('read 1 byte negative int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x01, 0xdc]))
    t.equal(reader.readInt(), -36, 'wrong value')
    t.equal(reader.length, 0x01, 'wrong length')
  })

  t.test('read 2 byte negative int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x02, 0xc0, 0x4e]))
    t.equal(reader.readInt(), -16306, 'wrong value')
    t.equal(reader.length, 0x02, 'wrong length')
  })

  t.test('read 3 byte negative int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x03, 0xff, 0x00, 0x19]))
    t.equal(reader.readInt(), -65511, 'wrong value')
    t.equal(reader.length, 0x03, 'wrong length')
  })

  t.test('read 4 byte negative int', async t => {
    const reader = new BerReader(Buffer.from([0x02, 0x04, 0x91, 0x7c, 0x22, 0x1f]))
    t.equal(reader.readInt(), -1854135777, 'wrong value')
    t.equal(reader.length, 0x04, 'wrong length')
  })

  t.test('read 4 byte negative int (abandon request tag)', async t => {
    // Technically, an abandon request shouldn't ever have a negative
    // number, but this lets us test the feature completely.
    const reader = new BerReader(Buffer.from([0x80, 0x04, 0x91, 0x7c, 0x22, 0x1f]))
    t.equal(reader.readInt(0x80), -1854135777, 'wrong value')
    t.equal(reader.length, 0x04, 'wrong length')
  })

  t.test('correctly advances offset', async t => {
    const reader = new BerReader(Buffer.from([
      0x30, 0x06, // sequence; 6 bytes
      0x02, 0x04, 0x91, 0x7c, 0x22, 0x1f // integer; 4 bytes
    ]))
    const seqBuffer = reader.readTag(0x30)
    t.equal(
      Buffer.compare(
        seqBuffer,
        Buffer.from([0x02, 0x04, 0x91, 0x7c, 0x22, 0x1f])
      ),
      0
    )

    t.equal(reader.readInt(), -1854135777, 'wrong value')
    t.equal(reader.length, 0x04, 'wrong length')
    t.equal(reader.offset, 8)
  })

  t.end()
})

tap.test('readLength', t => {
  t.test('reads from specified offset', async t => {
    const reader = new BerReader(Buffer.from(fooSequence))
    const offset = reader.readLength(1)
    t.equal(offset, 2)
    t.equal(reader.length, 5)
  })

  t.test('returns null if offset exceeds buffer', async t => {
    const reader = new BerReader(Buffer.from(fooSequence))
    const offset = reader.readLength(10)
    t.equal(offset, null)
    t.equal(reader.offset, 0)
  })

  t.test('reads from current offset', async t => {
    const reader = new BerReader(Buffer.from(fooSequence))
    const byte = reader.readByte()
    t.equal(byte, 0x30)

    const offset = reader.readLength()
    t.equal(offset, 2)
    t.equal(reader.length, 5)
  })

  t.test('throws for indefinite length', async t => {
    // Buffer would indicate a string of indefinite length.
    const reader = new BerReader(Buffer.from([0x04, 0x80]))
    t.throws(
      () => reader.readLength(1),
      Error('Indefinite length not supported.')
    )
  })

  t.test('throws if length too long', async t => {
    // Buffer would indicate a string whose length should be indicated
    // by the next 5 bytes (omitted).
    const reader = new BerReader(Buffer.from([0x04, 0x85]))
    t.throws(
      () => reader.readLength(1),
      Error('Encoding too long.')
    )
  })

  t.test('reads a long (integer) from length', async t => {
    const reader = new BerReader(Buffer.from([0x81, 0x94]))
    const offset = reader.readLength()
    t.equal(offset, 2)
    t.equal(reader.length, 148)
  })

  t.test(
    'returns null if long (integer) from length exceeds buffer',
    async t => {
      const reader = new BerReader(Buffer.from([0x82, 0x03]))
      const offset = reader.readLength(0)
      t.equal(offset, null)
      t.equal(reader.length, 0)
    })

  t.end()
})

tap.test('readOID', t => {
  t.test('returns null for bad buffer', async t => {
    const reader = new BerReader(Buffer.from([0x06, 0x03, 0x0a]))
    t.equal(reader.readOID(), null)
  })

  t.test('reads an OID', async t => {
    const input = Buffer.from(microsoftOID.slice(0, 11))
    const reader = new BerReader(input)
    t.equal(reader.readOID(), '1.3.6.1.4.1.311.21.20')
  })

  t.end()
})

tap.test('readRawBuffer', t => {
  t.test('requires number tag', async t => {
    const reader = new BerReader(Buffer.from([]))
    t.throws(
      () => reader.readRawBuffer(),
      Error('must specify an integer tag')
    )
  })

  t.test('throws if tag does not match', async t => {
    const reader = new BerReader(Buffer.from([0x04, 0x00]))
    t.throws(
      () => reader.readRawBuffer(0x05),
      Error('Expected 0x05: got 0x04')
    )
  })

  t.test('reads empty string buffer', async t => {
    const buffer = Buffer.from([0x04, 0x00])
    const reader = new BerReader(buffer)
    const readBuffer = reader.readRawBuffer(0x04)
    t.equal(buffer.compare(readBuffer), 0)
    t.equal(reader.offset, 2)
  })

  t.test('returns null for no value byte', async t => {
    const reader = new BerReader(Buffer.from([0x04]))
    const buffer = reader.readRawBuffer(0x04)
    t.equal(buffer, null)
    t.equal(reader.offset, 0)
  })

  t.test('returns null if value length exceeds buffer length', async t => {
    const reader = new BerReader(Buffer.from([0x04, 0x01]))
    const buffer = reader.readRawBuffer(0x04)
    t.equal(buffer, null)
    t.equal(reader.offset, 0)
  })

  t.test('return only next buffer', async t => {
    const buffer = Buffer.from([
      0x04, 0x03, 0x66, 0x6f, 0x6f,
      0x04, 0x03, 0x62, 0x61, 0x72,
      0x04, 0x03, 0x62, 0x61, 0x7a
    ])
    const reader = new BerReader(buffer)
    reader.readString()

    const readBuffer = reader.readRawBuffer(0x04)
    t.equal(reader.offset, 10)
    t.equal(readBuffer.compare(buffer.subarray(5, 10)), 0)
  })

  t.test('does not advance offset', async t => {
    const buffer = Buffer.from([
      0x04, 0x03, 0x66, 0x6f, 0x6f,
      0x04, 0x03, 0x62, 0x61, 0x72,
      0x04, 0x03, 0x62, 0x61, 0x7a
    ])
    const reader = new BerReader(buffer)
    reader.readString()

    const readBuffer = reader.readRawBuffer(0x04, false)
    t.equal(reader.offset, 5)
    t.equal(readBuffer.compare(buffer.subarray(5, 10)), 0)
  })

  t.test('reads buffer with multi-byte length', async t => {
    // 0x01b3 => 110110011 => 00000001 + 10110011 => 0x01 + 0xb3 => 435 bytes
    const bytes = [
      0x02, 0x01, 0x00, // simple integer
      0x04, 0x82, 0x01, 0xb3 // begin string sequence
    ]
    for (let i = 1; i <= 435; i += 1) {
      // Create a long string of `~` characters
      bytes.push(0x7e)
    }
    // Add a null sequence terminator
    Array.prototype.push.apply(bytes, [0x30, 0x00])

    const buffer = Buffer.from(bytes)
    const reader = new BerReader(buffer)
    t.equal(reader.readInt(), 0)
    t.equal(reader.readString(), '~'.repeat(435))
    t.equal(reader.readSequence(0x30), 0x30)
    reader.setOffset(0)

    // Emulate what we would do to read the filter value from an LDAP
    // search request that has a very large filter:
    reader.readInt()
    const tag = reader.peek()
    t.equal(tag, 0x04)
    const rawBuffer = reader.readRawBuffer(tag)
    t.equal(rawBuffer.compare(buffer.subarray(3, bytes.length - 2)), 0)
  })

  t.end()
})

tap.test('readSequence', t => {
|
||||
t.test('throws for tag mismatch', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x04, 0x00]))
|
||||
t.throws(
|
||||
() => reader.readSequence(0x30),
|
||||
Error('Expected 0x30: got 0x04')
|
||||
)
|
||||
})
|
||||
|
||||
t.test('returns null when read length is null', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x30, 0x84, 0x04, 0x03]))
|
||||
t.equal(reader.readSequence(), null)
|
||||
})
|
||||
|
||||
t.test('return read sequence and advances offset', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
const result = reader.readSequence()
|
||||
t.equal(result, 0x30)
|
||||
t.equal(reader.offset, 2)
|
||||
})
|
||||
|
||||
// Original test
|
||||
t.test('read sequence', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x30, 0x03, 0x01, 0x01, 0xff]))
|
||||
t.ok(reader)
|
||||
t.equal(reader.readSequence(), 0x30, 'wrong value')
|
||||
t.equal(reader.length, 0x03, 'wrong length')
|
||||
t.equal(reader.readBoolean(), true, 'wrong value')
|
||||
t.equal(reader.length, 0x01, 'wrong length')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('readString', t => {
|
||||
t.test('throws for tag mismatch', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x30, 0x00]))
|
||||
t.throws(
|
||||
() => reader.readString(),
|
||||
Error('Expected 0x04: got 0x30')
|
||||
)
|
||||
})
|
||||
|
||||
t.test('returns null when read length is null', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x04, 0x84, 0x03, 0x0a]))
|
||||
t.equal(reader.readString(), null)
|
||||
})
|
||||
|
||||
t.test('returns null when value bytes too short', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x04, 0x03, 0x0a]))
|
||||
t.equal(reader.readString(), null)
|
||||
})
|
||||
|
||||
t.test('returns empty buffer for zero length string', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x04, 0x00]))
|
||||
const result = reader.readString(0x04, true)
|
||||
t.type(result, Buffer)
|
||||
t.equal(Buffer.compare(result, Buffer.alloc(0)), 0)
|
||||
})
|
||||
|
||||
t.test('returns empty string for zero length string', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x04, 0x00]))
|
||||
const result = reader.readString()
|
||||
t.type(result, 'string')
|
||||
t.equal(result, '')
|
||||
})
|
||||
|
||||
t.test('returns string as buffer', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence.slice(2)))
|
||||
const result = reader.readString(0x04, true)
|
||||
t.type(result, Buffer)
|
||||
|
||||
const expected = Buffer.from(fooSequence.slice(4))
|
||||
t.equal(Buffer.compare(result, expected), 0)
|
||||
})
|
||||
|
||||
t.test('returns string as string', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence.slice(2)))
|
||||
const result = reader.readString()
|
||||
t.type(result, 'string')
|
||||
t.equal(result, 'foo')
|
||||
})
|
||||
|
||||
// Original test
|
||||
t.test('read string', async t => {
|
||||
const dn = 'cn=foo,ou=unit,o=test'
|
||||
const buf = Buffer.alloc(dn.length + 2)
|
||||
buf[0] = 0x04
|
||||
buf[1] = Buffer.byteLength(dn)
|
||||
buf.write(dn, 2)
|
||||
const reader = new BerReader(buf)
|
||||
t.ok(reader)
|
||||
t.equal(reader.readString(), dn, 'wrong value')
|
||||
t.equal(reader.length, dn.length, 'wrong length')
|
||||
})
|
||||
|
||||
// Orignal test
|
||||
t.test('long string', async t => {
|
||||
const buf = Buffer.alloc(256)
|
||||
const s =
|
||||
'2;649;CN=Red Hat CS 71GA Demo,O=Red Hat CS 71GA Demo,C=US;' +
|
||||
'CN=RHCS Agent - admin01,UID=admin01,O=redhat,C=US [1] This is ' +
|
||||
'Teena Vradmin\'s description.'
|
||||
buf[0] = 0x04
|
||||
buf[1] = 0x81
|
||||
buf[2] = 0x94
|
||||
buf.write(s, 3)
|
||||
const ber = new BerReader(buf.subarray(0, 3 + s.length))
|
||||
t.equal(ber.readString(), s)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('readTag', t => {
|
||||
t.test('throws error for null tag', async t => {
|
||||
const expected = Error('Must supply an ASN.1 tag to read.')
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
|
||||
t.throws(
|
||||
() => reader.readTag(),
|
||||
expected
|
||||
)
|
||||
})
|
||||
|
||||
t.test('returns null for null byte tag', { skip: true })
|
||||
|
||||
t.test('throws error for tag mismatch', async t => {
|
||||
const expected = Error('Expected 0x40: got 0x30')
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
|
||||
t.throws(
|
||||
() => reader.readTag(0x40),
|
||||
expected
|
||||
)
|
||||
})
|
||||
|
||||
t.test('returns null if field length is null', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x05]))
|
||||
t.equal(reader.readTag(0x05), null)
|
||||
})
|
||||
|
||||
t.test('returns null if field length greater than available bytes', async t => {
|
||||
const reader = new BerReader(Buffer.from([0x30, 0x03, 0x04, 0xa0]))
|
||||
t.equal(reader.readTag(0x30), null)
|
||||
})
|
||||
|
||||
t.test('returns null if field length greater than available bytes', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
const expected = Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f])
|
||||
const result = reader.readTag(0x30)
|
||||
t.equal(Buffer.compare(result, expected), 0)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('remain', t => {
|
||||
t.test('returns the size of the buffer if nothing read', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
t.equal(7, reader.remain)
|
||||
})
|
||||
|
||||
t.test('returns accurate remaining bytes', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
t.equal(0x30, reader.readSequence())
|
||||
t.equal(5, reader.remain)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('setOffset', t => {
|
||||
t.test('throws if not an integer', async t => {
|
||||
const expected = Error('Must supply an integer position.')
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
|
||||
t.throws(
|
||||
() => reader.setOffset(1.2),
|
||||
expected
|
||||
)
|
||||
t.throws(
|
||||
() => reader.setOffset('2'),
|
||||
expected
|
||||
)
|
||||
})
|
||||
|
||||
t.test('sets offset', async t => {
|
||||
const reader = new BerReader(Buffer.from(fooSequence))
|
||||
t.equal(reader.offset, 0)
|
||||
|
||||
reader.setOffset(2)
|
||||
t.equal(reader.offset, 2)
|
||||
t.equal(reader.peek(), 0x04)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('sequenceToReader', t => {
|
||||
t.test('returns new reader with full sequence', async t => {
|
||||
const multiSequence = [
|
||||
0x30, 14,
|
||||
...fooSequence,
|
||||
...fooSequence
|
||||
]
|
||||
const reader = new BerReader(Buffer.from(multiSequence))
|
||||
|
||||
// Read the intial sequence and verify current position.
|
||||
t.equal(0x30, reader.readSequence())
|
||||
t.equal(2, reader.offset)
|
||||
|
||||
// Advance the buffer to the start of the first sub-sequence value.
|
||||
t.equal(0x30, reader.readSequence())
|
||||
t.equal(4, reader.offset)
|
||||
t.equal(12, reader.remain)
|
||||
|
||||
// Get a new reader the consists of the first sub-sequence and verify
|
||||
// that the original reader's position has not changed.
|
||||
const fooReader = reader.sequenceToReader()
|
||||
t.equal(fooReader.remain, 7)
|
||||
t.equal(fooReader.offset, 0)
|
||||
t.equal(reader.offset, 4)
|
||||
t.equal(0x30, fooReader.readSequence())
|
||||
t.equal('foo', fooReader.readString())
|
||||
|
||||
// The original reader should advance like normal.
|
||||
t.equal('foo', reader.readString())
|
||||
t.equal(0x30, reader.readSequence())
|
||||
t.equal('foo', reader.readString())
|
||||
t.equal(0, reader.remain)
|
||||
t.equal(16, reader.offset)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('toHexDump', t => {
|
||||
t.test('dumps buffer', t => {
|
||||
const reader = new BerReader(
|
||||
Buffer.from([0x00, 0x01, 0x02, 0x03])
|
||||
)
|
||||
const expected = '00010203'
|
||||
|
||||
let found = ''
|
||||
const destination = new Writable({
|
||||
write (chunk, encoding, callback) {
|
||||
found += chunk.toString()
|
||||
callback()
|
||||
}
|
||||
})
|
||||
|
||||
destination.on('finish', () => {
|
||||
t.equal(found, expected)
|
||||
t.end()
|
||||
})
|
||||
|
||||
reader.toHexDump({
|
||||
destination,
|
||||
closeDestination: true
|
||||
})
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
// Original test
|
||||
tap.test('anonymous LDAPv3 bind', async t => {
|
||||
const BIND = Buffer.alloc(14)
|
||||
BIND[0] = 0x30 // Sequence
|
||||
BIND[1] = 12 // len
|
||||
BIND[2] = 0x02 // ASN.1 Integer
|
||||
BIND[3] = 1 // len
|
||||
BIND[4] = 0x04 // msgid (make up 4)
|
||||
BIND[5] = 0x60 // Bind Request
|
||||
BIND[6] = 7 // len
|
||||
BIND[7] = 0x02 // ASN.1 Integer
|
||||
BIND[8] = 1 // len
|
||||
BIND[9] = 0x03 // v3
|
||||
BIND[10] = 0x04 // String (bind dn)
|
||||
BIND[11] = 0 // len
|
||||
BIND[12] = 0x80 // ContextSpecific (choice)
|
||||
BIND[13] = 0 // simple bind
|
||||
|
||||
// Start testing ^^
|
||||
const ber = new BerReader(BIND)
|
||||
t.equal(ber.readSequence(), 48, 'Not an ASN.1 Sequence')
|
||||
t.equal(ber.length, 12, 'Message length should be 12')
|
||||
t.equal(ber.readInt(), 4, 'Message id should have been 4')
|
||||
t.equal(ber.readSequence(), 96, 'Bind Request should have been 96')
|
||||
t.equal(ber.length, 7, 'Bind length should have been 7')
|
||||
t.equal(ber.readInt(), 3, 'LDAP version should have been 3')
|
||||
t.equal(ber.readString(), '', 'Bind DN should have been empty')
|
||||
t.equal(ber.length, 0, 'string length should have been 0')
|
||||
t.equal(ber.readByte(), 0x80, 'Should have been ContextSpecific (choice)')
|
||||
t.equal(ber.readByte(), 0, 'Should have been simple bind')
|
||||
t.equal(null, ber.readByte(), 'Should be out of data')
|
||||
})
|
||||
36
node_modules/@ldapjs/asn1/lib/ber/types.js
generated
vendored
Normal file
@ -0,0 +1,36 @@
'use strict'

module.exports = {
  EOC: 0x0,
  Boolean: 0x01,
  Integer: 0x02,
  BitString: 0x03,
  OctetString: 0x04,
  Null: 0x05,
  OID: 0x06,
  ObjectDescriptor: 0x07,
  External: 0x08,
  Real: 0x09, // float
  Enumeration: 0x0a,
  PDV: 0x0b,
  Utf8String: 0x0c,
  RelativeOID: 0x0d,
  Sequence: 0x10,
  Set: 0x11,
  NumericString: 0x12,
  PrintableString: 0x13,
  T61String: 0x14,
  VideotexString: 0x15,
  IA5String: 0x16,
  UTCTime: 0x17,
  GeneralizedTime: 0x18,
  GraphicString: 0x19,
  VisibleString: 0x1a,
  GeneralString: 0x1c,
  UniversalString: 0x1d,
  CharacterString: 0x1e,
  BMPString: 0x1f,
  Constructor: 0x20,
  LDAPSequence: 0x30,
  Context: 0x80
}
466
node_modules/@ldapjs/asn1/lib/ber/writer.js
generated
vendored
Normal file
@ -0,0 +1,466 @@
'use strict'

const types = require('./types')
const bufferToHexDump = require('../buffer-to-hex-dump')

class BerWriter {
  /**
   * The source buffer as it was passed in when creating the instance.
   *
   * @type {Buffer}
   */
  #buffer

  /**
   * The total bytes in the backing buffer.
   *
   * @type {number}
   */
  #size

  /**
   * As the BER buffer is written, this property records the current position
   * in the buffer.
   *
   * @type {number}
   */
  #offset = 0

  /**
   * A list of offsets in the buffer where we need to insert sequence tag and
   * length pairs.
   */
  #sequenceOffsets = []

  /**
   * Coefficient used when increasing the buffer to accommodate writes that
   * exceed the available space left in the buffer.
   *
   * @type {number}
   */
  #growthFactor

  constructor ({ size = 1024, growthFactor = 8 } = {}) {
    this.#buffer = Buffer.alloc(size)
    this.#size = this.#buffer.length
    this.#offset = 0
    this.#growthFactor = growthFactor
  }

  get [Symbol.toStringTag] () { return 'BerWriter' }

  get buffer () {
    // TODO: handle sequence check

    return this.#buffer.subarray(0, this.#offset)
  }

  /**
   * The size of the backing buffer.
   *
   * @return {number}
   */
  get size () {
    return this.#size
  }

  /**
   * Append a raw buffer to the current writer instance. No validation to
   * determine if the buffer represents a valid BER encoding is performed.
   *
   * @param {Buffer} buffer The buffer to append. If this is not a valid BER
   * sequence of data, it will invalidate the BER represented by the `BerWriter`.
   *
   * @throws If the input is not an instance of Buffer.
   */
  appendBuffer (buffer) {
    if (Buffer.isBuffer(buffer) === false) {
      throw Error('buffer must be an instance of Buffer')
    }
    this.#ensureBufferCapacity(buffer.length)
    buffer.copy(this.#buffer, this.#offset, 0, buffer.length)
    this.#offset += buffer.length
  }

  /**
   * Complete a sequence started with {@link startSequence}.
   *
   * @throws When the sequence is too long and would exceed the 4 byte
   * length descriptor limitation.
   */
  endSequence () {
    const sequenceStartOffset = this.#sequenceOffsets.pop()
    const start = sequenceStartOffset + 3
    const length = this.#offset - start

    if (length <= 0x7f) {
      this.#shift(start, length, -2)
      this.#buffer[sequenceStartOffset] = length
    } else if (length <= 0xff) {
      this.#shift(start, length, -1)
      this.#buffer[sequenceStartOffset] = 0x81
      this.#buffer[sequenceStartOffset + 1] = length
    } else if (length <= 0xffff) {
      this.#buffer[sequenceStartOffset] = 0x82
      this.#buffer[sequenceStartOffset + 1] = length >> 8
      this.#buffer[sequenceStartOffset + 2] = length
    } else if (length <= 0xffffff) {
      this.#shift(start, length, 1)
      this.#buffer[sequenceStartOffset] = 0x83
      this.#buffer[sequenceStartOffset + 1] = length >> 16
      this.#buffer[sequenceStartOffset + 2] = length >> 8
      this.#buffer[sequenceStartOffset + 3] = length
    } else {
      throw Error('sequence too long')
    }
  }

  /**
   * Write a sequence tag to the buffer and advance the offset to the starting
   * position of the value. Sequences must be completed with a subsequent
   * invocation of {@link endSequence}.
   *
   * @param {number} [tag=0x30] The tag to use for the sequence.
   *
   * @throws When the tag is not a number.
   */
  startSequence (tag = (types.Sequence | types.Constructor)) {
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a Number')
    }

    this.writeByte(tag)
    this.#sequenceOffsets.push(this.#offset)
    this.#ensureBufferCapacity(3)
    this.#offset += 3
  }

  /**
   * @param {HexDumpParams} params The `buffer` parameter will be ignored.
   *
   * @see bufferToHexDump
   */
  toHexDump (params) {
    bufferToHexDump({
      ...params,
      buffer: this.buffer
    })
  }

  /**
   * Write a boolean TLV to the buffer.
   *
   * @param {boolean} boolValue
   * @param {number} [tag=0x01] A custom tag for the boolean.
   *
   * @throws When a parameter is of the wrong type.
   */
  writeBoolean (boolValue, tag = types.Boolean) {
    if (typeof boolValue !== 'boolean') {
      throw TypeError('boolValue must be a Boolean')
    }
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a Number')
    }

    this.#ensureBufferCapacity(3)
    this.#buffer[this.#offset++] = tag
    this.#buffer[this.#offset++] = 0x01
    this.#buffer[this.#offset++] = boolValue === true ? 0xff : 0x00
  }

  /**
   * Write an arbitrary buffer of data to the backing buffer using the given
   * tag.
   *
   * @param {Buffer} buffer
   * @param {number} tag The tag to use for the ASN.1 TLV sequence.
   *
   * @throws When either input parameter is of the wrong type.
   */
  writeBuffer (buffer, tag) {
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a Number')
    }
    if (Buffer.isBuffer(buffer) === false) {
      throw TypeError('buffer must be an instance of Buffer')
    }

    this.writeByte(tag)
    this.writeLength(buffer.length)
    this.#ensureBufferCapacity(buffer.length)
    buffer.copy(this.#buffer, this.#offset, 0, buffer.length)
    this.#offset += buffer.length
  }

  /**
   * Write a single byte to the backing buffer and advance the offset. The
   * backing buffer will be automatically expanded to accommodate the new byte
   * if no room in the buffer remains.
   *
   * @param {number} byte The byte to be written.
   *
   * @throws When the passed in parameter is not a `Number` (aka a byte).
   */
  writeByte (byte) {
    if (typeof byte !== 'number') {
      throw TypeError('argument must be a Number')
    }

    this.#ensureBufferCapacity(1)
    this.#buffer[this.#offset++] = byte
  }

  /**
   * Write an enumeration TLV to the buffer.
   *
   * @param {number} value
   * @param {number} [tag=0x0a] A custom tag for the enumeration.
   *
   * @throws When a passed in parameter is not of the correct type, or the
   * value requires too many bytes (must be <= 4).
   */
  writeEnumeration (value, tag = types.Enumeration) {
    if (typeof value !== 'number') {
      throw TypeError('value must be a Number')
    }
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a Number')
    }
    this.writeInt(value, tag)
  }

  /**
   * Write an, up to 4 byte, integer TLV to the buffer.
   *
   * @param {number} intToWrite
   * @param {number} [tag=0x02]
   *
   * @throws When either parameter is not of the right type, or if the
   * integer consists of too many bytes.
   */
  writeInt (intToWrite, tag = types.Integer) {
    if (typeof intToWrite !== 'number') {
      throw TypeError('intToWrite must be a Number')
    }
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a Number')
    }

    let intSize = 4
    while (
      (
        ((intToWrite & 0xff800000) === 0) ||
        ((intToWrite & 0xff800000) === (0xff800000 >> 0))
      ) && (intSize > 1)
    ) {
      intSize--
      intToWrite <<= 8
    }

    // TODO: figure out how to cover this in a test.
    /* istanbul ignore if: needs test */
    if (intSize > 4) {
      throw Error('BER ints cannot be > 0xffffffff')
    }

    this.#ensureBufferCapacity(2 + intSize)
    this.#buffer[this.#offset++] = tag
    this.#buffer[this.#offset++] = intSize

    while (intSize-- > 0) {
      this.#buffer[this.#offset++] = ((intToWrite & 0xff000000) >>> 24)
      intToWrite <<= 8
    }
  }

  /**
   * Write a set of length bytes to the backing buffer. Per
   * https://www.rfc-editor.org/rfc/rfc4511.html#section-5.1, LDAP message
   * BERs prohibit greater than 4 byte lengths. Given we are supporting
   * the `ldapjs` module, we limit ourselves to 4 byte lengths.
   *
   * @param {number} len The length value to write to the buffer.
   *
   * @throws When the length is not a number or requires too many bytes.
   */
  writeLength (len) {
    if (typeof len !== 'number') {
      throw TypeError('argument must be a Number')
    }

    this.#ensureBufferCapacity(4)

    if (len <= 0x7f) {
      this.#buffer[this.#offset++] = len
    } else if (len <= 0xff) {
      this.#buffer[this.#offset++] = 0x81
      this.#buffer[this.#offset++] = len
    } else if (len <= 0xffff) {
      this.#buffer[this.#offset++] = 0x82
      this.#buffer[this.#offset++] = len >> 8
      this.#buffer[this.#offset++] = len
    } else if (len <= 0xffffff) {
      this.#buffer[this.#offset++] = 0x83
      this.#buffer[this.#offset++] = len >> 16
      this.#buffer[this.#offset++] = len >> 8
      this.#buffer[this.#offset++] = len
    } else {
      throw Error('length too long (> 4 bytes)')
    }
  }

  /**
   * Write a NULL tag and value to the buffer.
   */
  writeNull () {
    this.writeByte(types.Null)
    this.writeByte(0x00)
  }

  /**
   * Given an OID string, e.g. `1.2.840.113549.1.1.1`, split it into
   * octets, encode the octets, and write it to the backing buffer.
   *
   * @param {string} oidString
   * @param {number} [tag=0x06] A custom tag to use for the OID.
   *
   * @throws When the parameters are not of the correct types, or if the
   * OID is not in the correct format.
   */
  writeOID (oidString, tag = types.OID) {
    if (typeof oidString !== 'string') {
      throw TypeError('oidString must be a string')
    }
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a Number')
    }

    if (/^([0-9]+\.){3,}[0-9]+$/.test(oidString) === false) {
      throw Error('oidString is not a valid OID string')
    }

    const parts = oidString.split('.')
    const bytes = []
    bytes.push(parseInt(parts[0], 10) * 40 + parseInt(parts[1], 10))
    for (const part of parts.slice(2)) {
      encodeOctet(bytes, parseInt(part, 10))
    }

    this.#ensureBufferCapacity(2 + bytes.length)
    this.writeByte(tag)
    this.writeLength(bytes.length)
    this.appendBuffer(Buffer.from(bytes))

    function encodeOctet (bytes, octet) {
      if (octet < 128) {
        bytes.push(octet)
      } else if (octet < 16_384) {
        bytes.push((octet >>> 7) | 0x80)
        bytes.push(octet & 0x7F)
      } else if (octet < 2_097_152) {
        bytes.push((octet >>> 14) | 0x80)
        bytes.push(((octet >>> 7) | 0x80) & 0xFF)
        bytes.push(octet & 0x7F)
      } else if (octet < 268_435_456) {
        bytes.push((octet >>> 21) | 0x80)
        bytes.push(((octet >>> 14) | 0x80) & 0xFF)
        bytes.push(((octet >>> 7) | 0x80) & 0xFF)
        bytes.push(octet & 0x7F)
      } else {
        bytes.push(((octet >>> 28) | 0x80) & 0xFF)
        bytes.push(((octet >>> 21) | 0x80) & 0xFF)
        bytes.push(((octet >>> 14) | 0x80) & 0xFF)
        bytes.push(((octet >>> 7) | 0x80) & 0xFF)
        bytes.push(octet & 0x7F)
      }
    }
  }

  /**
   * Write a string TLV to the buffer.
   *
   * @param {string} stringToWrite
   * @param {number} [tag=0x04] The tag to use.
   *
   * @throws When either input parameter is of the wrong type.
   */
  writeString (stringToWrite, tag = types.OctetString) {
    if (typeof stringToWrite !== 'string') {
      throw TypeError('stringToWrite must be a string')
    }
    if (typeof tag !== 'number') {
      throw TypeError('tag must be a number')
    }

    const toWriteLength = Buffer.byteLength(stringToWrite)
    this.writeByte(tag)
    this.writeLength(toWriteLength)
    if (toWriteLength > 0) {
      this.#ensureBufferCapacity(toWriteLength)
      this.#buffer.write(stringToWrite, this.#offset)
      this.#offset += toWriteLength
    }
  }

  /**
   * Given a set of strings, write each as a string TLV to the buffer.
   *
   * @param {string[]} strings
   *
   * @throws When the input is not an array.
   */
  writeStringArray (strings) {
    if (Array.isArray(strings) === false) {
      throw TypeError('strings must be an instance of Array')
    }
    for (const string of strings) {
      this.writeString(string)
    }
  }

  /**
   * Given a number of bytes to be written into the buffer, verify the buffer
   * has enough free space. If not, allocate a new buffer, copy the current
   * backing buffer into the new buffer, and promote the new buffer to be the
   * current backing buffer.
   *
   * @param {number} numberOfBytesToWrite How many bytes are required to be
   * available for writing in the backing buffer.
   */
  #ensureBufferCapacity (numberOfBytesToWrite) {
    if (this.#size - this.#offset < numberOfBytesToWrite) {
      let newSize = this.#size * this.#growthFactor
      if (newSize - this.#offset < numberOfBytesToWrite) {
        newSize += numberOfBytesToWrite
      }

      const newBuffer = Buffer.alloc(newSize)

      this.#buffer.copy(newBuffer, 0, 0, this.#offset)
      this.#buffer = newBuffer
      this.#size = newSize
    }
  }

  /**
   * Shift a region of the buffer indicated by `start` and `length` a number
   * of bytes indicated by `shiftAmount`.
   *
   * @param {number} start The starting position in the buffer for the region
   * of bytes to be shifted.
   * @param {number} length The number of bytes that constitutes the region
   * of the buffer to be shifted.
   * @param {number} shiftAmount The number of bytes to shift the region by.
   * This may be negative.
   */
  #shift (start, length, shiftAmount) {
    // TODO: this leaves garbage behind. We should either zero out the bytes
    // left behind, or devise a better algorithm that generates a clean
    // buffer.
    this.#buffer.copy(this.#buffer, start + shiftAmount, start, start + length)
    this.#offset += shiftAmount
  }
}

module.exports = BerWriter
749
node_modules/@ldapjs/asn1/lib/ber/writer.test.js
generated
vendored
Normal file
@ -0,0 +1,749 @@
'use strict'
|
||||
|
||||
const tap = require('tap')
|
||||
const { Writable } = require('stream')
|
||||
const BerWriter = require('./writer')
|
||||
|
||||
tap.test('has toStringTag', async t => {
|
||||
const writer = new BerWriter()
|
||||
t.equal(Object.prototype.toString.call(writer), '[object BerWriter]')
|
||||
})
|
||||
|
||||
tap.test('#ensureBufferCapacity', t => {
|
||||
t.test('does not change buffer size if unnecessary', async t => {
|
||||
const writer = new BerWriter({ size: 1 })
|
||||
t.equal(writer.size, 1)
|
||||
|
||||
writer.writeByte(0x01)
|
||||
t.equal(writer.size, 1)
|
||||
})
|
||||
|
||||
t.test('expands buffer to accomodate write skipping growth factor', async t => {
|
||||
const writer = new BerWriter({ size: 0 })
|
||||
t.equal(writer.size, 0)
|
||||
|
||||
writer.writeByte(0x01)
|
||||
t.equal(writer.size, 1)
|
||||
t.equal(Buffer.compare(writer.buffer, Buffer.from([0x01])), 0)
|
||||
})
|
||||
|
||||
t.test('expands buffer to accomodate write with growth factor', async t => {
|
||||
const writer = new BerWriter({ size: 1 })
|
||||
t.equal(writer.size, 1)
|
||||
|
||||
writer.writeByte(0x01)
|
||||
writer.writeByte(0x02)
|
||||
t.equal(writer.size, 8)
|
||||
t.equal(Buffer.compare(writer.buffer, Buffer.from([0x01, 0x02])), 0)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('appendBuffer', t => {
|
||||
t.test('throws if input not a buffer', async t => {
|
||||
const writer = new BerWriter()
|
||||
t.throws(
|
||||
() => writer.appendBuffer('foo'),
|
||||
Error('buffer must be an instance of Buffer')
|
||||
)
|
||||
})
|
||||
|
||||
t.test('appendBuffer appends a buffer', async t => {
|
||||
const expected = Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f, 0x66, 0x6f, 0x6f])
|
||||
const writer = new BerWriter()
|
||||
writer.writeString('foo')
|
||||
writer.appendBuffer(Buffer.from('foo'))
|
||||
t.equal(Buffer.compare(writer.buffer, expected), 0)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('endSequence', t => {
  t.test('ends a sequence', async t => {
    const writer = new BerWriter({ size: 25 })
    writer.startSequence()
    writer.writeString('hello world')
    writer.endSequence()

    const ber = writer.buffer
    const expected = Buffer.from([
      0x30, 0x0d, // sequence; 13 bytes
      0x04, 0x0b, // string; 11 bytes
      0x68, 0x65, 0x6c, 0x6c, 0x6f, 0x20, // 'hello '
      0x77, 0x6f, 0x72, 0x6c, 0x64 // 'world'
    ])
    t.equal(Buffer.compare(ber, expected), 0)
  })

  t.test('ends sequence of two byte length', async t => {
    const value = Buffer.alloc(0x81, 0x01)
    const writer = new BerWriter()

    writer.startSequence()
    writer.writeBuffer(value, 0x04)
    writer.endSequence()

    const ber = writer.buffer
    t.equal(
      Buffer.from([0x30, 0x81, 0x84, 0x04, 0x81, value.length])
        .compare(ber.subarray(0, 6)),
      0
    )
  })

  t.test('ends sequence of three byte length', async t => {
    const value = Buffer.alloc(0xfe, 0x01)
    const writer = new BerWriter()

    writer.startSequence()
    writer.writeBuffer(value, 0x04)
    writer.endSequence()

    const ber = writer.buffer
    t.equal(
      Buffer.from([0x30, 0x82, 0x01, 0x01, 0x04, 0x81, value.length])
        .compare(ber.subarray(0, 7)),
      0
    )
  })

  t.test('ends sequence of four byte length', async t => {
    const value = Buffer.alloc(0xaaaaaa, 0x01)
    const writer = new BerWriter()

    writer.startSequence()
    writer.writeBuffer(value, 0x04)
    writer.endSequence()

    const ber = writer.buffer
    t.equal(
      Buffer.from([0x30, 0x83, 0xaa, 0xaa, 0xaf, 0x04, 0x83, value.length])
        .compare(ber.subarray(0, 8)),
      0
    )
  })

  t.test('throws if sequence too long', async t => {
    const value = Buffer.alloc(0xaffffff, 0x01)
    const writer = new BerWriter()

    writer.startSequence()
    writer.writeByte(0x04)
    // We can't write the length because it is too long. However, this
    // still gives us enough data to generate the error we want to generate.
    writer.appendBuffer(value)
    t.throws(
      () => writer.endSequence(),
      Error('sequence too long')
    )
  })

  t.end()
})

tap.test('startSequence', t => {
  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.startSequence('30'),
      Error('tag must be a Number')
    )
  })

  t.test('starts a sequence', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.startSequence()
    t.equal(writer.size, 8)

    const expected = Buffer.from([0x30, 0x00, 0x00, 0x00])
    t.equal(Buffer.compare(writer.buffer, expected), 0)
  })

  t.end()
})

tap.test('toHexDump', t => {
  t.test('dumps buffer', t => {
    const writer = new BerWriter()
    writer.appendBuffer(Buffer.from([0x00, 0x01, 0x02, 0x03]))
    const expected = '00010203'

    let found = ''
    const destination = new Writable({
      write (chunk, encoding, callback) {
        found += chunk.toString()
        callback()
      }
    })

    destination.on('finish', () => {
      t.equal(found, expected)
      t.end()
    })

    writer.toHexDump({
      destination,
      closeDestination: true
    })
  })

  t.end()
})

tap.test('writeBoolean', t => {
  t.test('throws if input not a boolean', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeBoolean(1),
      Error('boolValue must be a Boolean')
    )
  })

  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeBoolean(true, '5'),
      Error('tag must be a Number')
    )
  })

  t.test('writes true', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.writeBoolean(true)
    t.equal(writer.size, 8)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x01, 0x01, 0xff])), 0)
  })

  t.test('writes false', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.writeBoolean(false)
    t.equal(writer.size, 8)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x01, 0x01, 0x00])), 0)
  })

  t.test('writes with custom tag', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.writeBoolean(true, 0xff)
    t.equal(writer.size, 8)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0xff, 0x01, 0xff])), 0)
  })

  // Original test
  t.test('write boolean', async t => {
    const writer = new BerWriter()

    writer.writeBoolean(true)
    writer.writeBoolean(false)
    const ber = writer.buffer

    t.equal(ber.length, 6, 'Wrong length')
    t.equal(ber[0], 0x01, 'tag wrong')
    t.equal(ber[1], 0x01, 'length wrong')
    t.equal(ber[2], 0xff, 'value wrong')
    t.equal(ber[3], 0x01, 'tag wrong')
    t.equal(ber[4], 0x01, 'length wrong')
    t.equal(ber[5], 0x00, 'value wrong')
  })

  t.end()
})

tap.test('writeBuffer', t => {
  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeBuffer(Buffer.alloc(0), '1'),
      Error('tag must be a Number')
    )
  })

  t.test('throws if buffer not a Buffer', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeBuffer([0x00], 0x01),
      Error('buffer must be an instance of Buffer')
    )
  })

  t.test('write buffer', async t => {
    const writer = new BerWriter()
    // write some stuff to start with
    writer.writeString('hello world')
    let ber = writer.buffer
    const buf = Buffer.from([0x04, 0x0b, 0x30, 0x09, 0x02, 0x01, 0x0f, 0x01, 0x01,
      0xff, 0x01, 0x01, 0xff])
    writer.writeBuffer(buf.subarray(2, buf.length), 0x04)
    ber = writer.buffer

    t.equal(ber.length, 26, 'wrong length')
    t.equal(ber[0], 0x04, 'wrong tag')
    t.equal(ber[1], 11, 'wrong length')
    t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value')
    t.equal(ber[13], buf[0], 'wrong tag')
    t.equal(ber[14], buf[1], 'wrong length')
    for (let i = 13, j = 0; i < ber.length && j < buf.length; i++, j++) {
      t.equal(ber[i], buf[j], 'buffer contents not identical')
    }
  })

  t.end()
})

tap.test('writeByte', t => {
  t.test('throws if input not a number', async t => {
    const writer = new BerWriter()
    t.equal(writer.size, 1024)

    t.throws(
      () => writer.writeByte('1'),
      Error('argument must be a Number')
    )
  })

  t.test('writes a byte to the backing buffer', async t => {
    const writer = new BerWriter()
    writer.writeByte(0x01)

    const buffer = writer.buffer
    t.equal(buffer.length, 1)
    t.equal(Buffer.compare(buffer, Buffer.from([0x01])), 0)
  })

  t.end()
})

tap.test('writeEnumeration', async t => {
  t.test('throws if value not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeEnumeration('1'),
      Error('value must be a Number')
    )
  })

  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeEnumeration(1, '1'),
      Error('tag must be a Number')
    )
  })

  t.test('writes an enumeration', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.writeEnumeration(0x01)
    t.equal(writer.size, 8)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x0a, 0x01, 0x01])), 0)
  })

  t.test('writes an enumeration with custom tag', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.writeEnumeration(0x01, 0xff)
    t.equal(writer.size, 8)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0xff, 0x01, 0x01])), 0)
  })

  t.end()
})

tap.test('writeInt', t => {
  t.test('throws if int not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeInt('1'),
      Error('intToWrite must be a Number')
    )
  })

  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeInt(1, '1'),
      Error('tag must be a Number')
    )
  })

  t.test('write 1 byte int', async t => {
    const writer = new BerWriter()

    writer.writeInt(0x7f)
    const ber = writer.buffer

    t.equal(ber.length, 3, 'Wrong length for an int: ' + ber.length)
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong (2) -> ' + ber[0])
    t.equal(ber[1], 0x01, 'length wrong(1) -> ' + ber[1])
    t.equal(ber[2], 0x7f, 'value wrong(3) -> ' + ber[2])
  })

  t.test('write 2 byte int', async t => {
    const writer = new BerWriter()

    writer.writeInt(0x7ffe)
    const ber = writer.buffer

    t.equal(ber.length, 4, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x02, 'length wrong')
    t.equal(ber[2], 0x7f, 'value wrong (byte 1)')
    t.equal(ber[3], 0xfe, 'value wrong (byte 2)')
  })

  t.test('write 3 byte int', async t => {
    const writer = new BerWriter()

    writer.writeInt(0x7ffffe)
    const ber = writer.buffer

    t.equal(ber.length, 5, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x03, 'length wrong')
    t.equal(ber[2], 0x7f, 'value wrong (byte 1)')
    t.equal(ber[3], 0xff, 'value wrong (byte 2)')
    t.equal(ber[4], 0xfe, 'value wrong (byte 3)')
  })

  t.test('write 4 byte int', async t => {
    const writer = new BerWriter()

    writer.writeInt(0x7ffffffe)
    const ber = writer.buffer

    t.equal(ber.length, 6, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x04, 'length wrong')
    t.equal(ber[2], 0x7f, 'value wrong (byte 1)')
    t.equal(ber[3], 0xff, 'value wrong (byte 2)')
    t.equal(ber[4], 0xff, 'value wrong (byte 3)')
    t.equal(ber[5], 0xfe, 'value wrong (byte 4)')
  })

  t.test('write 1 byte negative int', async t => {
    const writer = new BerWriter()

    writer.writeInt(-128)
    const ber = writer.buffer

    t.equal(ber.length, 3, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x01, 'length wrong')
    t.equal(ber[2], 0x80, 'value wrong (byte 1)')
  })

  t.test('write 2 byte negative int', async t => {
    const writer = new BerWriter()

    writer.writeInt(-22400)
    const ber = writer.buffer

    t.equal(ber.length, 4, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x02, 'length wrong')
    t.equal(ber[2], 0xa8, 'value wrong (byte 1)')
    t.equal(ber[3], 0x80, 'value wrong (byte 2)')
  })

  t.test('write 3 byte negative int', async t => {
    const writer = new BerWriter()

    writer.writeInt(-481653)
    const ber = writer.buffer

    t.equal(ber.length, 5, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x03, 'length wrong')
    t.equal(ber[2], 0xf8, 'value wrong (byte 1)')
    t.equal(ber[3], 0xa6, 'value wrong (byte 2)')
    t.equal(ber[4], 0x8b, 'value wrong (byte 3)')
  })

  t.test('write 4 byte negative int', async t => {
    const writer = new BerWriter()

    writer.writeInt(-1522904131)
    const ber = writer.buffer

    t.equal(ber.length, 6, 'Wrong length for an int')
    t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
    t.equal(ber[1], 0x04, 'length wrong')
    t.equal(ber[2], 0xa5, 'value wrong (byte 1)')
    t.equal(ber[3], 0x3a, 'value wrong (byte 2)')
    t.equal(ber[4], 0x53, 'value wrong (byte 3)')
    t.equal(ber[5], 0xbd, 'value wrong (byte 4)')
  })

  t.test('throws for > 4 byte integer', { skip: true }, async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeInt(0xffffffffff),
      Error('BER ints cannot be > 0xffffffff')
    )
  })

  t.end()
})

tap.test('writeLength', t => {
  t.test('throws if length not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeLength('1'),
      Error('argument must be a Number')
    )
  })

  t.test('writes a single byte length', async t => {
    const writer = new BerWriter({ size: 4 })
    writer.writeLength(0x7f)
    t.equal(writer.buffer.length, 1)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x7f])), 0)
  })

  t.test('writes a two byte length', async t => {
    const writer = new BerWriter({ size: 4 })
    writer.writeLength(0xff)
    t.equal(writer.buffer.length, 2)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x81, 0xff])), 0)
  })

  t.test('writes a three byte length', async t => {
    const writer = new BerWriter({ size: 4 })
    writer.writeLength(0xffff)
    t.equal(writer.buffer.length, 3)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x82, 0xff, 0xff])), 0)
  })

  t.test('writes a four byte length', async t => {
    const writer = new BerWriter({ size: 4 })
    writer.writeLength(0xffffff)
    t.equal(writer.buffer.length, 4)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x83, 0xff, 0xff, 0xff])), 0)
  })

  t.test('throws if byte length is too long', async t => {
    const writer = new BerWriter({ size: 4 })
    t.throws(
      () => writer.writeLength(0xffffffffff),
      Error('length too long (> 4 bytes)')
    )
  })

  t.end()
})

tap.test('writeNull', t => {
  t.test('writeNull', async t => {
    const writer = new BerWriter({ size: 2 })
    writer.writeNull()
    t.equal(writer.size, 2)
    t.equal(Buffer.compare(writer.buffer, Buffer.from([0x05, 0x00])), 0)
  })

  t.end()
})

tap.test('writeOID', t => {
  t.test('throws if OID not a string', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeOID(42),
      Error('oidString must be a string')
    )
  })

  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeOID('1.2.3', '1'),
      Error('tag must be a Number')
    )
  })

  t.test('throws if OID not a valid OID string', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeOID('foo'),
      Error('oidString is not a valid OID string')
    )
  })

  t.test('writes an OID', async t => {
    const oid = '1.2.840.113549.1.1.1'
    const writer = new BerWriter()
    writer.writeOID(oid)

    const expected = Buffer.from([0x06, 0x09, 0x2a, 0x86,
      0x48, 0x86, 0xf7, 0x0d,
      0x01, 0x01, 0x01])
    const ber = writer.buffer
    t.equal(ber.compare(expected), 0)
  })

  t.test('writes OID covering all octet encodings', async t => {
    const oid = '1.2.200.17000.2100100.270100100'
    const writer = new BerWriter()
    writer.writeOID(oid)

    const expected = Buffer.from([
      0x06, 0x0f,
      0x2a, 0x81, 0x48, 0x81,
      0x84, 0x68, 0x81, 0x80,
      0x97, 0x04, 0x81, 0x80,
      0xe5, 0xcd, 0x04
    ])
    const ber = writer.buffer
    t.equal(ber.compare(expected), 0)
  })

  t.end()
})

tap.test('writeString', t => {
  t.test('throws if non-string supplied', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeString(42),
      Error('stringToWrite must be a string')
    )
  })

  t.test('throws if tag not a number', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeString('foo', '1'),
      Error('tag must be a number')
    )
  })

  t.test('writes an empty string', async t => {
    const writer = new BerWriter()
    writer.writeString('')

    const expected = Buffer.from([0x04, 0x00])
    t.equal(Buffer.compare(writer.buffer, expected), 0)
  })

  t.test('writes a string', async t => {
    const writer = new BerWriter({ size: 1 })
    writer.writeString('foo')

    const expected = Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f])
    t.equal(Buffer.compare(writer.buffer, expected), 0)
    t.equal(writer.size, 8)
  })

  t.end()
})

tap.test('writeStringArray', t => {
  t.test('throws if non-array supplied', async t => {
    const writer = new BerWriter()
    t.throws(
      () => writer.writeStringArray(42),
      Error('strings must be an instance of Array')
    )
  })

  t.test('write string array', async t => {
    const writer = new BerWriter()
    writer.writeStringArray(['hello world', 'fubar!'])
    const ber = writer.buffer

    t.equal(ber.length, 21, 'wrong length')
    t.equal(ber[0], 0x04, 'wrong tag')
    t.equal(ber[1], 11, 'wrong length')
    t.equal(ber.subarray(2, 13).toString('utf8'), 'hello world', 'wrong value')

    t.equal(ber[13], 0x04, 'wrong tag')
    t.equal(ber[14], 6, 'wrong length')
    t.equal(ber.subarray(15).toString('utf8'), 'fubar!', 'wrong value')
  })

  t.end()
})

tap.test('original tests', t => {
  t.test('resize internal buffer', async t => {
    const writer = new BerWriter({ size: 2 })
    writer.writeString('hello world')
    const ber = writer.buffer

    t.equal(ber.length, 13, 'wrong length')
    t.equal(ber[0], 0x04, 'wrong tag')
    t.equal(ber[1], 11, 'wrong length')
    t.equal(ber.subarray(2).toString('utf8'), 'hello world', 'wrong value')
  })

  t.test('sequence', async t => {
    const writer = new BerWriter({ size: 25 })
    writer.startSequence()
    writer.writeString('hello world')
    writer.endSequence()
    const ber = writer.buffer

    t.equal(ber.length, 15, 'wrong length')
    t.equal(ber[0], 0x30, 'wrong tag')
    t.equal(ber[1], 13, 'wrong length')
    t.equal(ber[2], 0x04, 'wrong tag')
    t.equal(ber[3], 11, 'wrong length')
    t.equal(ber.subarray(4).toString('utf8'), 'hello world', 'wrong value')
  })

  t.test('nested sequence', async t => {
    const writer = new BerWriter({ size: 25 })
    writer.startSequence()
    writer.writeString('hello world')
    writer.startSequence()
    writer.writeString('hello world')
    writer.endSequence()
    writer.endSequence()
    const ber = writer.buffer

    t.equal(ber.length, 30, 'wrong length')
    t.equal(ber[0], 0x30, 'wrong tag')
    t.equal(ber[1], 28, 'wrong length')
    t.equal(ber[2], 0x04, 'wrong tag')
    t.equal(ber[3], 11, 'wrong length')
    t.equal(ber.subarray(4, 15).toString('utf8'), 'hello world', 'wrong value')
    t.equal(ber[15], 0x30, 'wrong tag')
    t.equal(ber[16], 13, 'wrong length')
    t.equal(ber[17], 0x04, 'wrong tag')
    t.equal(ber[18], 11, 'wrong length')
    t.equal(ber.subarray(19, 30).toString('utf8'), 'hello world', 'wrong value')
  })

  t.test('LDAP bind message', async t => {
    const dn = 'cn=foo,ou=unit,o=test'
    const writer = new BerWriter()
    writer.startSequence()
    writer.writeInt(3) // msgid = 3
    writer.startSequence(0x60) // ldap bind
    writer.writeInt(3) // ldap v3
    writer.writeString(dn)
    writer.writeByte(0x80)
    writer.writeByte(0x00)
    writer.endSequence()
    writer.endSequence()
    const ber = writer.buffer

    t.equal(ber.length, 35, 'wrong length (buffer)')
    t.equal(ber[0], 0x30, 'wrong tag')
    t.equal(ber[1], 33, 'wrong length')
    t.equal(ber[2], 0x02, 'wrong tag')
    t.equal(ber[3], 1, 'wrong length')
    t.equal(ber[4], 0x03, 'wrong value')
    t.equal(ber[5], 0x60, 'wrong tag')
    t.equal(ber[6], 28, 'wrong length')
    t.equal(ber[7], 0x02, 'wrong tag')
    t.equal(ber[8], 1, 'wrong length')
    t.equal(ber[9], 0x03, 'wrong value')
    t.equal(ber[10], 0x04, 'wrong tag')
    t.equal(ber[11], dn.length, 'wrong length')
    t.equal(ber.subarray(12, 33).toString('utf8'), dn, 'wrong value')
    t.equal(ber[33], 0x80, 'wrong tag')
    t.equal(ber[34], 0x00, 'wrong len')
  })

  t.end()
})
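The byte-level expectations in the tests above all follow BER's tag-length-value (TLV) layout. As an illustration only (this helper is hypothetical and not part of the vendored library), here is how `writeString('foo')` arrives at the expected bytes `[0x04, 0x03, 0x66, 0x6f, 0x6f]`:

```javascript
// Hypothetical sketch of the BER TLV layout the tests assert: an
// OCTET STRING tag (0x04), a one-byte definite length, then the
// UTF-8 bytes of the value. Long-form lengths are out of scope here.
function berOctetString (str) {
  const value = Buffer.from(str, 'utf8')
  if (value.length > 0x7f) throw Error('long-form lengths not sketched here')
  return Buffer.concat([Buffer.from([0x04, value.length]), value])
}

console.log(berOctetString('foo')) // tag, length, then 'f', 'o', 'o'
```

Lengths above `0x7f` switch to the long form (`0x81`, `0x82`, ... followed by length octets), which is what the `endSequence` and `writeLength` suites exercise.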
74
node_modules/@ldapjs/asn1/lib/buffer-to-hex-dump.js
generated
vendored
Normal file
@ -0,0 +1,74 @@
'use strict'

const { createWriteStream } = require('fs')

/**
 * @typedef {object} HexDumpParams
 * @property {Buffer} buffer The buffer instance to serialize into a hex dump.
 * @property {string} [prefix=''] A string to prefix each byte with, e.g.
 * `0x`.
 * @property {string} [separator=''] A string to separate each byte with, e.g.
 * `, '.
 * @property {string[]} [wrapCharacters=[]] A set of characters to wrap the
 * output with. For example, `wrapCharacters=['[', ']']` will start the hex
 * dump with `[` and end it with `]`.
 * @property {number} [width=10] How many bytes to write per line.
 * @property {WriteStream | string} [destination=process.stdout] Where to
 * write the serialized data. If a string is provided, it is assumed to be
 * the path to a file. This file will be completely overwritten.
 * @property {boolean} [closeDestination=false] Indicates whether the
 * `destination` should be closed when done. This _should_ be `true` when the
 * passed in `destination` is a stream that you control. If a string path is
 * supplied for the `destination`, this will automatically be handled.
 */

// We'd like to put this coverage directive after the doc block,
// but that confuses doc tooling (e.g. WebStorm).
/* istanbul ignore next: defaults don't need 100% coverage */
/**
 * Given a buffer of bytes, generate a hex dump that can be loaded later
 * or viewed in a hex editor (e.g. [Hex Fiend](https://hexfiend.com)).
 *
 * @param {HexDumpParams} params
 *
 * @throws When the destination cannot be accessed.
 */
module.exports = function bufferToHexDump ({
  buffer,
  prefix = '',
  separator = '',
  wrapCharacters = [],
  width = 10,
  destination = process.stdout,
  closeDestination = false
}) {
  let closeStream = closeDestination
  if (typeof destination === 'string') {
    destination = createWriteStream(destination)
    closeStream = true
  }

  if (wrapCharacters[0]) {
    destination.write(wrapCharacters[0])
  }

  for (const [i, byte] of buffer.entries()) {
    const outByte = Number(byte).toString(16).padStart(2, '0')
    destination.write(prefix + outByte)
    if (i !== buffer.byteLength - 1) {
      destination.write(separator)
    }
    if ((i + 1) % width === 0) {
      destination.write('\n')
    }
  }

  if (wrapCharacters[1]) {
    destination.write(wrapCharacters[1])
  }

  /* istanbul ignore else */
  if (closeStream === true) {
    destination.end()
  }
}
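The formatting rules documented in `HexDumpParams` (prefix, separator, width) can be seen in a condensed, synchronous paraphrase. Note this `hexDump` helper is a hypothetical name introduced for illustration: it returns a string, whereas the vendored module writes to a stream and also supports file destinations and wrap characters:

```javascript
// Condensed paraphrase of the vendored bufferToHexDump formatting
// logic: prefix each byte, separate all but the last byte, and break
// the line every `width` bytes.
function hexDump (buffer, { prefix = '', separator = '', width = 10 } = {}) {
  let out = ''
  for (const [i, byte] of buffer.entries()) {
    out += prefix + byte.toString(16).padStart(2, '0')
    if (i !== buffer.length - 1) out += separator
    if ((i + 1) % width === 0) out += '\n'
  }
  return out
}

const dump = hexDump(Buffer.from([0x00, 0x01, 0x02, 0x03, 0x04]), {
  prefix: '0x',
  separator: ', ',
  width: 4
})
console.log(dump)
```

Because the separator is written before the width check, a line break lands after the trailing separator, which matches the `'0x00, 0x01, 0x02, 0x03, \n'` lines the test file below asserts.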
75
node_modules/@ldapjs/asn1/lib/buffer-to-hex-dump.test.js
generated
vendored
Normal file
@ -0,0 +1,75 @@
'use strict'

const tap = require('tap')
const path = require('path')
const { Writable } = require('stream')
const { tmpdir } = require('os')
const { randomUUID } = require('crypto')
const { readFile, rm } = require('fs/promises')
const { setTimeout } = require('timers/promises')
const bufferToHexDump = require('./buffer-to-hex-dump')

const input = Buffer.from([
  0x00, 0x01, 0x02, 0x03, 0x04,
  0x05, 0x06, 0x07, 0x08, 0x09,
  0x0a, 0x0b, 0x0c
])

tap.test('writes to stream', t => {
  const expected = [
    '[0x00, 0x01, 0x02, 0x03, \n',
    '0x04, 0x05, 0x06, 0x07, \n',
    '0x08, 0x09, 0x0a, 0x0b, \n',
    '0x0c]'
  ].join('')

  let found = ''
  const destination = new Writable({
    write (chunk, encoding, callback) {
      found += chunk.toString()
      callback()
    }
  })

  destination.on('finish', () => {
    t.equal(found, expected)
    t.end()
  })

  bufferToHexDump({
    buffer: input,
    prefix: '0x',
    separator: ', ',
    wrapCharacters: ['[', ']'],
    width: 4,
    closeDestination: true,
    destination
  })
})

tap.test('writes to file', async t => {
  const expected = [
    '00010203\n',
    '04050607\n',
    '08090a0b\n',
    '0c'
  ].join('')
  const destination = path.join(tmpdir(), randomUUID())

  t.teardown(async () => {
    await rm(destination)
  })

  bufferToHexDump({
    buffer: input,
    width: 4,
    destination
  })

  // Give a little time for the write stream to create and
  // close the file.
  await setTimeout(100)

  const contents = await readFile(destination)
  t.equal(contents.toString(), expected)
})
40
node_modules/@ldapjs/asn1/package.json
generated
vendored
Normal file
@ -0,0 +1,40 @@
{
  "originalAuthor": "Joyent (joyent.com)",
  "contributors": [
    "Mark Cavage <mcavage@gmail.com>",
    "David Gwynne <loki@animata.net>",
    "Yunong Xiao <yunong@joyent.com>",
    "Alex Wilson <alex.wilson@joyent.com>"
  ],
  "name": "@ldapjs/asn1",
  "description": "Contains parsers and serializers for ASN.1 (currently BER only)",
  "version": "2.0.0",
  "repository": {
    "type": "git",
    "url": "git://github.com/ldapjs/asn1.git"
  },
  "main": "index.js",
  "devDependencies": {
    "@fastify/pre-commit": "^2.0.2",
    "eslint": "^8.34.0",
    "eslint-config-standard": "^17.0.0",
    "eslint-plugin-import": "^2.27.5",
    "eslint-plugin-n": "^15.6.1",
    "eslint-plugin-node": "^11.1.0",
    "eslint-plugin-promise": "^6.1.1",
    "tap": "^16.3.4"
  },
  "scripts": {
    "lint": "eslint .",
    "lint:ci": "eslint .",
    "test": "tap --no-coverage-report",
    "test:cov": "tap",
    "test:cov:html": "tap --coverage-report=html",
    "test:watch": "tap -w --no-coverage-report"
  },
  "license": "MIT",
  "pre-commit": [
    "lint",
    "test"
  ]
}
9
node_modules/@ldapjs/attribute/.eslintrc
generated
vendored
Normal file
@ -0,0 +1,9 @@
{
  "parserOptions": {
    "ecmaVersion": "latest"
  },

  "extends": [
    "standard"
  ]
}
10
node_modules/@ldapjs/attribute/.github/workflows/main.yml
generated
vendored
Normal file
@ -0,0 +1,10 @@
name: "CI"
on:
  pull_request:
  push:
    branches:
      - master

jobs:
  call-core-ci:
    uses: ldapjs/.github/.github/workflows/node-ci.yml@main
5
node_modules/@ldapjs/attribute/.taprc.yaml
generated
vendored
Normal file
@ -0,0 +1,5 @@
reporter: terse
coverage-map: coverage-map.js

files:
  - 'index.test.js'
54
node_modules/@ldapjs/attribute/CHANGES.md
generated
vendored
Normal file
@ -0,0 +1,54 @@
# ldap-filter

> ### Important
> This file is no longer maintained. For changes, please read
> the releases page: https://github.com/ldapjs/filter/releases

## 0.3.3

- Assert that NOT filters are closed by a parenthesis

## 0.3.2

- Perform better checks for trailing characters
- Improve test coverage
- Change \*Filter.json to work recursively for child filters
- Bump assert-plus dependency to 1.0.0

## 0.3.1

- Tolerate underscores in attribute names

## 0.3.0

- Enforce stricter output escaping for buffer values
- **BREAKING** Rename `NotFilter.addfilter` to `NotFilter.setFilter`
- **BREAKING** Rewrite filter parser to be more strict about input.
  This _significantly_ changes the sort of filters which the parser finds
  acceptable. While the old parser would tolerate unescaped characters in
  the `()\*` set, the new parser requires them to be escaped via the `\XX`
  hex notation. This is in keeping with
  [RFC 4515](http://tools.ietf.org/search/rfc4515)
- Perform better escaping for values which are not UTF-8

## 0.2.3

- Update dev dependencies
- Clean up asserts and prototypes

## 0.2.2

- Fix nested paren handling in parser

## 0.2.1

- Fix AndFilter per RFC 4526

## 0.2.0

- Add 'attribute' accessor for ExtFilter matchType
- Improve API for custom match functions
- Support other value types in EqualityFilter

## 0.1.0

- Initial import from ldapjs
21
node_modules/@ldapjs/attribute/LICENSE
generated
vendored
Normal file
@ -0,0 +1,21 @@
Copyright (c) 2014 Patrick Mooney. All rights reserved.
Copyright (c) 2014 Mark Cavage, Inc. All rights reserved.
Copyright (c) 2022 The LDAPJS Collaborators.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE
8
node_modules/@ldapjs/attribute/README.md
generated
vendored
Normal file
@ -0,0 +1,8 @@
# @ldapjs/attribute

Provides a class for representing LDAP entry attributes as described in
[RFC 4512 §2.5](https://www.rfc-editor.org/rfc/rfc4512#section-2.5).

## License

MIT.
3
node_modules/@ldapjs/attribute/coverage-map.js
generated
vendored
Normal file
@ -0,0 +1,3 @@
'use strict'

module.exports = testFile => testFile.replace(/\.test\.js$/, '.js')
344
node_modules/@ldapjs/attribute/index.js
generated
vendored
Normal file
@ -0,0 +1,344 @@
|
||||
'use strict'
|
||||
|
||||
const { core: { LBER_SET } } = require('@ldapjs/protocol')
|
||||
const {
|
||||
BerTypes,
|
||||
BerReader,
|
||||
BerWriter
|
||||
} = require('@ldapjs/asn1')
|
||||
const warning = require('./lib/deprecations')
|
||||
|
||||
/**
|
||||
* Represents an LDAP attribute and its associated values as defined by
|
||||
* https://www.rfc-editor.org/rfc/rfc4512#section-2.5.
|
||||
*/
|
||||
class Attribute {
|
||||
#buffers = []
|
||||
#type
|
||||
|
||||
/**
|
||||
* @param {object} options
|
||||
* @param {string} [options.type=''] The name of the attribute, e.g. "cn" for
|
||||
* the common name attribute. For binary attributes, include the `;binary`
|
||||
* option, e.g. `foo;binary`.
|
||||
* @param {string|string[]} [options.values] Either a single value for the
|
||||
* attribute, or a set of values for the attribute.
|
||||
*/
|
||||
constructor (options = {}) {
|
||||
if (options.type && typeof (options.type) !== 'string') {
|
||||
throw TypeError('options.type must be a string')
|
||||
}
|
||||
this.type = options.type || ''
|
||||
|
||||
const values = options.values || options.vals || []
|
||||
if (options.vals) {
|
||||
warning.emit('LDAP_ATTRIBUTE_DEP_001')
|
||||
}
|
||||
this.values = values
|
||||
}
|
||||
|
||||
get [Symbol.toStringTag] () {
|
||||
return 'LdapAttribute'
|
||||
}
|
||||
|
||||
/**
|
||||
* A copy of the buffers that represent the values for the attribute.
|
||||
*
|
||||
* @returns {Buffer[]}
|
||||
*/
|
||||
get buffers () {
|
||||
return this.#buffers.slice(0)
|
||||
}
|
||||
|
||||
/**
|
||||
* Serializes the attribute to a plain JavaScript object representation.
|
||||
*
|
||||
* @returns {object}
|
||||
*/
|
||||
get pojo () {
|
||||
return {
|
||||
type: this.type,
|
||||
values: this.values
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* The attribute name as provided during construction.
|
||||
*
|
||||
* @returns {string}
|
||||
*/
|
||||
get type () {
|
||||
return this.#type
|
||||
}
|
||||
|
||||
/**
|
||||
* Set the attribute name.
|
||||
*
|
||||
* @param {string} name
|
||||
*/
|
||||
set type (name) {
|
||||
this.#type = name
|
||||
}
|
||||
|
||||
/**
|
||||
* The set of attribute values as strings.
|
||||
*
|
||||
* @returns {string[]}
|
||||
*/
|
||||
get values () {
|
||||
const encoding = _bufferEncoding(this.#type)
|
||||
return this.#buffers.map(function (v) {
|
||||
return v.toString(encoding)
|
||||
})
|
||||
}
|
||||
|
||||
/**
|
||||
* Set the attribute's associated values. This will replace any values set
|
||||
* at construction time.
|
||||
*
|
||||
* @param {string|string[]} vals
|
||||
*/
|
||||
set values (vals) {
|
||||
if (Array.isArray(vals) === false) {
|
||||
return this.addValue(vals)
|
||||
}
|
||||
for (const value of vals) {
|
||||
this.addValue(value)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Use {@link values} instead.
|
||||
*
|
||||
* @deprecated
|
||||
* @returns {string[]}
|
||||
*/
|
||||
get vals () {
|
||||
warning.emit('LDAP_ATTRIBUTE_DEP_003')
|
||||
return this.values
|
||||
}
|
||||
|
||||
/**
|
||||
* Use {@link values} instead.
|
||||
*
|
||||
* @deprecated
|
||||
* @param {string|string[]} values
|
||||
*/
|
||||
set vals (values) {
|
||||
warning.emit('LDAP_ATTRIBUTE_DEP_003')
|
||||
this.values = values
|
||||
}
|
||||
|
||||
/**
|
||||
* Append a new value, or set of values, to the current set of values
|
||||
* associated with the attributes.
|
||||
*
|
||||
* @param {string|string[]} value
|
||||
*/
|
||||
addValue (value) {
|
||||
if (Buffer.isBuffer(value)) {
|
||||
this.#buffers.push(value)
|
||||
} else {
|
||||
this.#buffers.push(
|
||||
Buffer.from(value + '', _bufferEncoding(this.#type))
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Replaces instance properties with those found in a given BER.
|
||||
*
|
||||
* @param {import('@ldapjs/asn1').BerReader} ber
|
||||
*
|
||||
* @deprecated Use {@link fromBer} instead.
|
||||
*/
|
||||
parse (ber) {
|
||||
const attr = Attribute.fromBer(ber)
|
||||
this.#type = attr.type
|
||||
this.values = attr.values
|
||||
}
|
||||
|
||||
/**
|
||||
* Convert the {@link Attribute} instance to a {@link BerReader} capable of
|
||||
* being used in an LDAP message.
|
||||
*
|
||||
* @returns {BerReader}
|
||||
*/
|
||||
toBer () {
|
||||
const ber = new BerWriter()
|
||||
|
||||
ber.startSequence()
|
||||
ber.writeString(this.type)
|
||||
ber.startSequence(LBER_SET)
|
||||
|
||||
if (this.#buffers.length > 0) {
|
||||
for (const buffer of this.#buffers) {
|
||||
ber.writeByte(BerTypes.OctetString)
|
||||
ber.writeLength(buffer.length)
|
||||
ber.appendBuffer(buffer)
|
||||
}
|
||||
} else {
|
||||
ber.writeStringArray([])
|
||||
}
|
||||
ber.endSequence()
|
||||
ber.endSequence()
|
||||
|
||||
return new BerReader(ber.buffer)
|
||||
}
|
||||
|
||||
toJSON () {
|
||||
return this.pojo
|
||||
}
|
||||
|
||||
/**
|
||||
* Given two {@link Attribute} instances, determine if they are equal or
|
||||
* different.
|
||||
*
|
||||
* @param {Attribute} attr1 The first object to compare.
|
||||
* @param {Attribute} attr2 The second object to compare.
|
||||
*
|
||||
* @returns {number} `0` if the attributes are equal in value, `-1` if
|
||||
* `attr1` should come before `attr2` when sorted, and `1` if `attr2` should
|
||||
* come before `attr1` when sorted.
|
||||
*
|
||||
* @throws When either input object is not an {@link Attribute}.
|
||||
*/
|
||||
static compare (attr1, attr2) {
|
||||
if (Attribute.isAttribute(attr1) === false || Attribute.isAttribute(attr2) === false) {
|
||||
throw TypeError('can only compare Attribute instances')
|
||||
}
|
||||
|
||||
if (attr1.type < attr2.type) return -1
|
||||
if (attr1.type > attr2.type) return 1
|
||||
|
||||
const aValues = attr1.values
|
||||
const bValues = attr2.values
|
||||
if (aValues.length < bValues.length) return -1
|
||||
if (aValues.length > bValues.length) return 1
|
||||
|
||||
for (let i = 0; i < aValues.length; i++) {
|
||||
if (aValues[i] < bValues[i]) return -1
|
||||
if (aValues[i] > bValues[i]) return 1
|
||||
}
|
||||
|
||||
return 0
|
||||
}
|
||||
|
||||
/**
|
||||
* Read a BER representation of an attribute, and its values, and
|
||||
* create a new {@link Attribute} instance. The BER must start
|
||||
* at the beginning of a sequence.
|
||||
*
|
||||
* @param {import('@ldapjs/asn1').BerReader} ber
|
||||
*
|
||||
* @returns {Attribute}
|
||||
*/
|
||||
static fromBer (ber) {
|
||||
ber.readSequence()
|
||||
|
||||
const type = ber.readString()
|
||||
const values = []
|
||||
|
||||
// If the next byte represents a BER "SET" sequence...
|
||||
if (ber.peek() === LBER_SET) {
|
||||
// .. read that sequence ...
|
||||
/* istanbul ignore else */
|
||||
if (ber.readSequence(LBER_SET)) {
|
||||
const end = ber.offset + ber.length
|
||||
// ... and read all values in that set.
|
||||
while (ber.offset < end) {
|
||||
values.push(
|
||||
ber.readString(BerTypes.OctetString, true)
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const result = new Attribute({
|
||||
type,
|
||||
values
|
||||
})
|
||||
return result
|
||||
}
|
||||
|
||||
/**
|
||||
* Given an object of attribute types mapping to attribute values, construct
|
||||
* a set of Attributes.
|
||||
*
|
||||
* @param {object} obj Each key is an attribute type, and each value is an
|
||||
* attribute value or set of values.
|
||||
*
|
||||
* @returns {Attribute[]}
|
||||
*
|
||||
* @throws If an attribute cannot be constructed correctly.
|
||||
*/
|
||||
static fromObject (obj) {
|
||||
const attributes = []
|
||||
for (const [key, value] of Object.entries(obj)) {
|
||||
if (Array.isArray(value) === true) {
|
||||
attributes.push(new Attribute({
|
||||
type: key,
|
||||
values: value
|
||||
}))
|
||||
} else {
|
||||
attributes.push(new Attribute({
|
||||
type: key,
|
||||
values: [value]
|
||||
}))
|
||||
}
|
||||
}
|
||||
return attributes
|
||||
}
|
||||
|
||||
/**
|
||||
* Determine if an object represents an {@link Attribute}.
|
||||
*
|
||||
* @param {object} attr The object to check. It can be an instance of
|
||||
* {@link Attribute} or a plain JavaScript object that looks like an
|
||||
* {@link Attribute} and can be passed to the constructor to create one.
|
||||
*
|
||||
* @returns {boolean}
|
||||
*/
|
||||
static isAttribute (attr) {
|
||||
if (typeof attr !== 'object') {
|
||||
return false
|
||||
}
|
||||
|
||||
if (Object.prototype.toString.call(attr) === '[object LdapAttribute]') {
|
||||
return true
|
||||
}
|
||||
|
||||
const typeOk = typeof attr.type === 'string'
|
||||
let valuesOk = Array.isArray(attr.values)
|
||||
if (valuesOk === true) {
|
||||
for (const val of attr.values) {
|
||||
if (typeof val !== 'string' && Buffer.isBuffer(val) === false) {
|
||||
valuesOk = false
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
if (typeOk === true && valuesOk === true) {
|
||||
return true
|
||||
}
|
||||
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = Attribute
|
||||
|
||||
/**
|
||||
* Determine the encoding for values based upon whether the binary
|
||||
* option is set on the attribute.
|
||||
*
|
||||
* @param {string} type
|
||||
*
|
||||
* @returns {string} Either "utf8" for a plain string value, or "base64" for
|
||||
* a binary attribute.
|
||||
*
|
||||
* @private
|
||||
*/
|
||||
function _bufferEncoding (type) {
|
||||
return /;binary$/.test(type) ? 'base64' : 'utf8'
|
||||
}
|
||||
403
node_modules/@ldapjs/attribute/index.test.js
generated
vendored
Normal file
@@ -0,0 +1,403 @@
'use strict'

const tap = require('tap')
const {
  BerReader,
  BerWriter
} = require('@ldapjs/asn1')
const { core: { LBER_SET } } = require('@ldapjs/protocol')
const warning = require('./lib/deprecations')
const Attribute = require('./')

// Silence the standard warning logs. We will test the messages explicitly.
process.removeAllListeners('warning')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    t.ok(new Attribute())
    // TODO: verify attributes
  })

  t.test('new with args', async t => {
    let attr = new Attribute({
      type: 'cn',
      values: ['foo', 'bar']
    })

    t.ok(attr)

    attr.addValue('baz')
    t.equal(attr.type, 'cn')
    const values = attr.values
    t.equal(values.length, 3)
    t.equal(values[0], 'foo')
    t.equal(values[1], 'bar')
    t.equal(values[2], 'baz')

    t.throws(function () {
      const typeThatIsNotAString = 1
      attr = new Attribute({
        type: typeThatIsNotAString
      })
    })
  })

  t.test('supports binary attributes', async t => {
    const attr = new Attribute({
      type: 'foo;binary',
      values: ['bar']
    })
    t.strictSame(attr.pojo, {
      type: 'foo;binary',
      values: ['bao=']
    })
  })

  t.test('warns for vals', t => {
    process.on('warning', handler)
    t.teardown(async () => {
      process.removeListener('warning', handler)
      warning.emitted.set('LDAP_MESSAGE_DEP_001', false)
    })

    const attr = new Attribute({
      type: 'foo',
      vals: ['bar']
    })
    t.ok(attr)

    function handler (error) {
      t.equal(
        error.message,
        'options.vals is deprecated. Use options.values instead.'
      )
      t.end()
    }
  })

  t.end()
})

tap.test('.values', t => {
  t.test('adds an array of strings', async t => {
    const attr = new Attribute({ type: 'foo' })
    attr.values = ['bar', 'baz']
    t.strictSame(attr.pojo, {
      type: 'foo',
      values: ['bar', 'baz']
    })
  })

  t.test('adds a single string', async t => {
    const attr = new Attribute({ type: 'foo' })
    attr.values = 'bar'
    t.strictSame(attr.pojo, {
      type: 'foo',
      values: ['bar']
    })
  })

  t.end()
})

tap.test('.vals', t => {
  t.beforeEach(async t => {
    process.on('warning', handler)
    t.context.handler = handler

    function handler (error) {
      t.equal(
        error.message,
        'Instance property .vals is deprecated. Use property .values instead.'
      )
      t.end()
    }
  })

  t.afterEach(async (t) => {
    process.removeListener('warning', t.context.handler)
    warning.emitted.set('LDAP_ATTRIBUTE_DEP_003', false)
  })

  t.test('adds an array of strings', async t => {
    const attr = new Attribute({ type: 'foo' })
    attr.vals = ['bar', 'baz']
    t.strictSame(attr.pojo, {
      type: 'foo',
      values: ['bar', 'baz']
    })
  })

  t.test('adds a single string', async t => {
    const attr = new Attribute({ type: 'foo' })
    attr.vals = 'bar'
    t.strictSame(attr.pojo, {
      type: 'foo',
      values: ['bar']
    })
  })

  t.end()
})

tap.test('.buffers', t => {
  t.test('returns underlying buffers', async t => {
    const attr = new Attribute({
      type: 'foo',
      values: ['bar', 'baz']
    })
    const buffers = attr.buffers

    t.equal(buffers.length, 2)

    let expected = Buffer.from('bar', 'utf8')
    t.equal(expected.compare(buffers[0]), 0)

    expected = Buffer.from('baz', 'utf8')
    t.equal(expected.compare(buffers[1]), 0)
  })

  t.end()
})

tap.test('.type', t => {
  t.test('gets and sets', async t => {
    const attr = new Attribute(({
      type: 'foo',
      values: ['bar']
    }))

    t.equal(attr.type, 'foo')
    attr.type = 'bar'
    t.equal(attr.type, 'bar')
  })

  t.end()
})

tap.test('toBer', async t => {
  t.test('renders type with values', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['foo', 'bar']
    })
    const reader = attr.toBer()
    t.ok(reader.readSequence())
    t.equal(reader.readString(), 'cn')
    t.equal(reader.readSequence(LBER_SET), LBER_SET)
    t.equal(reader.readString(), 'foo')
    t.equal(reader.readString(), 'bar')
  })

  t.test('renders type without values', async t => {
    const attr = new Attribute({ type: 'cn' })
    const reader = attr.toBer()
    t.ok(reader.readSequence())
    t.equal(reader.readString(), 'cn')
    t.equal(reader.readSequence(LBER_SET), LBER_SET)
    t.equal(reader.remain, 0)
  })
})

tap.test('parse', t => {
  t.beforeEach(async t => {
    process.on('warning', handler)
    t.teardown(async () => {
      process.removeListener('warning', handler)
      warning.emitted.set('LDAP_MESSAGE_DEP_002', false)
    })

    function handler (error) {
      t.equal(
        error.message,
        'Instance method .parse is deprecated. Use static .fromBer instead.'
      )
      t.end()
    }
  })

  t.test('parse', async t => {
    const ber = new BerWriter()
    ber.startSequence()
    ber.writeString('cn')
    ber.startSequence(0x31)
    ber.writeStringArray(['foo', 'bar'])
    ber.endSequence()
    ber.endSequence()

    const attr = new Attribute()
    attr.parse(new BerReader(ber.buffer))

    t.equal(attr.type, 'cn')
    t.equal(attr.vals.length, 2)
    t.equal(attr.vals[0], 'foo')
    t.equal(attr.vals[1], 'bar')
  })

  t.test('parse - without 0x31', async t => {
    const ber = new BerWriter()
    ber.startSequence()
    ber.writeString('sn')
    ber.endSequence()

    const attr = new Attribute()
    attr.parse(new BerReader(ber.buffer))

    t.equal(attr.type, 'sn')
    t.equal(attr.vals.length, 0)
  })

  t.end()
})

tap.test('pojo / toJSON', t => {
  t.test('returns an object', async t => {
    const expected = {
      type: 'foo',
      values: ['bar']
    }
    const attr = new Attribute(expected)

    t.strictSame(attr.pojo, expected)
    t.strictSame(JSON.stringify(attr), JSON.stringify(expected))
  })

  t.end()
})

tap.test('#fromBer', t => {
  const attributeWithValuesBytes = [
    0x30, 0x1c, // start first attribute sequence, 28 bytes

    0x04, 0x0b, // string, 11 bytes
    0x6f, 0x62, 0x6a, 0x65, // "objectClass"
    0x63, 0x74, 0x43, 0x6c,
    0x61, 0x73, 0x73,
    0x31, 0x0d, // start value sequence, 13 bytes
    0x04, 0x03, 0x74, 0x6f, 0x70, // string: "top"
    0x04, 0x06, 0x64, 0x6f, 0x6d, 0x61, 0x69, 0x6e // string: "domain"
  ]

  t.test('parses an attribute with values', async t => {
    const ber = new BerReader(Buffer.from(attributeWithValuesBytes))
    const attr = Attribute.fromBer(ber)

    t.equal(attr.type, 'objectClass')
    t.equal(attr.vals[0], 'top')
    t.equal(attr.vals[1], 'domain')
  })

  t.test('parses an attribute without values', async t => {
    const ber = new BerWriter()
    ber.startSequence()
    ber.writeString('sn')
    ber.endSequence()

    const attr = Attribute.fromBer(new BerReader(ber.buffer))
    t.equal(attr.type, 'sn')
    t.strictSame(attr.vals, [])
  })

  t.end()
})

tap.test('#fromObject', t => {
  t.test('handles basic object', async t => {
    const attrs = Attribute.fromObject({
      foo: ['foo'],
      bar: 'bar',
      'baz;binary': Buffer.from([0x00])
    })
    for (const attr of attrs) {
      t.equal(Object.prototype.toString.call(attr), '[object LdapAttribute]')
    }
  })

  t.end()
})

tap.test('#isAttribute', t => {
  t.test('rejects non-object', async t => {
    t.equal(Attribute.isAttribute(42), false)
  })

  t.test('accepts Attribute instances', async t => {
    const input = new Attribute({
      type: 'cn',
      values: ['foo']
    })
    t.equal(Attribute.isAttribute(input), true)
  })

  t.test('accepts attribute-like objects', async t => {
    const input = {
      type: 'cn',
      values: [
        'foo',
        Buffer.from('bar')
      ]
    }
    t.equal(Attribute.isAttribute(input), true)
  })

  t.test('rejects non-attribute-like objects', async t => {
    let input = {
      foo: 'foo',
      values: 'bar'
    }
    t.equal(Attribute.isAttribute(input), false)

    input = {
      type: 'cn',
      values: [42]
    }
    t.equal(Attribute.isAttribute(input), false)
  })

  t.end()
})

tap.test('compare', async t => {
  const comp = Attribute.compare
  let a = new Attribute({
    type: 'foo',
    values: ['bar']
  })
  const b = new Attribute({
    type: 'foo',
    values: ['bar']
  })
  const notAnAttribute = 'this is not an attribute'

  t.throws(
    () => comp(a, notAnAttribute),
    Error('can only compare Attribute instances')
  )
  t.throws(
    () => comp(notAnAttribute, b),
    Error('can only compare Attribute instances')
  )

  t.equal(comp(a, b), 0)

  // Different types
  a = new Attribute({ type: 'boo' })
  t.equal(comp(a, b), -1)
  t.equal(comp(b, a), 1)

  // Different value counts
  a = new Attribute({
    type: 'foo',
    values: ['bar', 'bar']
  })
  t.equal(comp(a, b), 1)
  t.equal(comp(b, a), -1)

  // Different value contents (same count)
  a = new Attribute({
    type: 'foo',
    values: ['baz']
  })
  t.equal(comp(a, b), 1)
  t.equal(comp(b, a), -1)
})
10
node_modules/@ldapjs/attribute/lib/deprecations.js
generated
vendored
Normal file
@@ -0,0 +1,10 @@
'use strict'

const warning = require('process-warning')()
const clazz = 'LdapjsAttributeWarning'

warning.create(clazz, 'LDAP_ATTRIBUTE_DEP_001', 'options.vals is deprecated. Use options.values instead.')
warning.create(clazz, 'LDAP_ATTRIBUTE_DEP_002', 'Instance method .parse is deprecated. Use static .fromBer instead.')
warning.create(clazz, 'LDAP_ATTRIBUTE_DEP_003', 'Instance property .vals is deprecated. Use property .values instead.')

module.exports = warning
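These codes rely on `process-warning`'s once-per-process emission, which is why the test file resets `warning.emitted` between cases. A minimal stand-in for that once-only pattern, using only Node built-ins (this is not the real `process-warning` API):

```javascript
// Track which codes have fired so each warning is emitted at most once.
const emitted = new Map()

function emitOnce (code, message) {
  if (emitted.get(code) === true) return false
  emitted.set(code, true)
  process.emitWarning(message, { code })
  return true
}

console.log(emitOnce('LDAP_ATTRIBUTE_DEP_001', 'options.vals is deprecated.')) // true
console.log(emitOnce('LDAP_ATTRIBUTE_DEP_001', 'options.vals is deprecated.')) // false
```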
47
node_modules/@ldapjs/attribute/package.json
generated
vendored
Normal file
@@ -0,0 +1,47 @@
{
  "originalAuthor": "Patrick Mooney",
  "originalContributors": [
    "Mark Cavage <mcavage@gmail.com>",
    "Cody Peter Mello <cody.mello@joyent.com>"
  ],
  "name": "@ldapjs/attribute",
  "homepage": "https://github.com/ldapjs/attribute",
  "description": "API for handling LDAP entry attributes",
  "version": "1.0.0",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "git@github.com:ldapjs/attribute.git"
  },
  "main": "index.js",
  "directories": {
    "lib": "./lib"
  },
  "dependencies": {
    "@ldapjs/asn1": "2.0.0",
    "@ldapjs/protocol": "^1.2.1",
    "process-warning": "^2.1.0"
  },
  "devDependencies": {
    "@fastify/pre-commit": "^2.0.2",
    "eslint": "^8.34.0",
    "eslint-config-standard": "^17.0.0",
    "eslint-plugin-import": "^2.27.5",
    "eslint-plugin-n": "^15.6.1",
    "eslint-plugin-node": "^11.1.0",
    "eslint-plugin-promise": "^6.1.1",
    "tap": "^16.3.4"
  },
  "scripts": {
    "lint": "eslint .",
    "lint:ci": "eslint .",
    "test": "tap --no-coverage-report",
    "test:cov": "tap",
    "test:cov:html": "tap --coverage-report=html",
    "test:watch": "tap -w --no-coverage-report"
  },
  "precommit": [
    "lint",
    "test"
  ]
}
9
node_modules/@ldapjs/change/.eslintrc
generated
vendored
Normal file
@@ -0,0 +1,9 @@
{
  "parserOptions": {
    "ecmaVersion": "latest"
  },

  "extends": [
    "standard"
  ]
}
10
node_modules/@ldapjs/change/.github/workflows/main.yml
generated
vendored
Normal file
@@ -0,0 +1,10 @@
name: "CI"
on:
  pull_request:
  push:
    branches:
      - master

jobs:
  call-core-ci:
    uses: ldapjs/.github/.github/workflows/node-ci.yml@main
6
node_modules/@ldapjs/change/.taprc.yaml
generated
vendored
Normal file
@@ -0,0 +1,6 @@
reporter: terse
coverage-map: coverage-map.js

files:
  - 'index.test.js'
  # - 'lib/**/*.test.js'
21
node_modules/@ldapjs/change/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
Copyright (c) 2014 Patrick Mooney. All rights reserved.
Copyright (c) 2014 Mark Cavage, Inc. All rights reserved.
Copyright (c) 2022 The LDAPJS Collaborators.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE
8
node_modules/@ldapjs/change/README.md
generated
vendored
Normal file
@@ -0,0 +1,8 @@
# change

Provides objects for managing changes as described in
[RFC 4511 §4.6](https://www.rfc-editor.org/rfc/rfc4511.html#section-4.6).

## License

MIT.
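The core of this package is `Change.apply`, which applies RFC 4511 modify semantics (`add`, `delete`, `replace`) to a plain object. A self-contained sketch of those semantics, mirroring the vendored `index.js` without depending on it (the `scalar` option is omitted for brevity):

```javascript
// Apply one modification to `target`, following the same rules as Change.apply:
// add = union of values, delete = remove listed values (drop the key when the
// result is empty), replace = overwrite (an empty replacement drops the key).
function applyChange (operation, type, values, target) {
  let data = target[type] === undefined
    ? []
    : [].concat(target[type])

  switch (operation) {
    case 'add':
      data = data.concat(values.filter(v => data.includes(v) === false))
      break
    case 'delete':
      data = data.filter(v => values.includes(v) === false)
      if (data.length === 0) {
        delete target[type]
        return target
      }
      break
    case 'replace':
      if (values.length === 0) {
        delete target[type]
        return target
      }
      data = values
      break
  }

  target[type] = data
  return target
}

console.log(applyChange('add', 'cn', ['new'], { cn: ['old'] })) // { cn: [ 'old', 'new' ] }
```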
3
node_modules/@ldapjs/change/coverage-map.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
'use strict'

module.exports = testFile => testFile.replace(/\.test\.js$/, '.js')
320
node_modules/@ldapjs/change/index.js
generated
vendored
Normal file
@@ -0,0 +1,320 @@
'use strict'

const { BerReader, BerWriter } = require('@ldapjs/asn1')
const Attribute = require('@ldapjs/attribute')

/**
 * Implements an LDAP CHANGE sequence as described in
 * https://www.rfc-editor.org/rfc/rfc4511.html#section-4.6.
 */
class Change {
  #operation
  #modification

  /**
   * @typedef {object} ChangeParameters
   * @property {string | number} operation One of `add` (0), `delete` (1), or
   * `replace` (2). Default: `add`.
   * @property {object | import('@ldapjs/attribute')} modification An attribute
   * instance or an object that is shaped like an attribute.
   */

  /**
   * @param {ChangeParameters} input
   *
   * @throws When the `modification` parameter is invalid.
   */
  constructor ({ operation = 'add', modification }) {
    this.operation = operation
    this.modification = modification
  }

  get [Symbol.toStringTag] () {
    return 'LdapChange'
  }

  /**
   * The attribute that will be modified by the {@link Change}.
   *
   * @returns {import('@ldapjs/attribute')}
   */
  get modification () {
    return this.#modification
  }

  /**
   * Define the attribute to be modified by the {@link Change}.
   *
   * @param {object|import('@ldapjs/attribute')} mod
   *
   * @throws When `mod` is not an instance of `Attribute` or is not an
   * `Attribute` shaped object.
   */
  set modification (mod) {
    if (Attribute.isAttribute(mod) === false) {
      throw Error('modification must be an Attribute')
    }
    if (Object.prototype.toString.call(mod) !== '[object LdapAttribute]') {
      mod = new Attribute(mod)
    }
    this.#modification = mod
  }

  /**
   * Get a plain JavaScript object representation of the change.
   *
   * @returns {object}
   */
  get pojo () {
    return {
      operation: this.operation,
      modification: this.modification.pojo
    }
  }

  /**
   * The string name of the operation that will be performed.
   *
   * @returns {string} One of `add`, `delete`, or `replace`.
   */
  get operation () {
    switch (this.#operation) {
      case 0x00: {
        return 'add'
      }

      case 0x01: {
        return 'delete'
      }

      case 0x02: {
        return 'replace'
      }
    }
  }

  /**
   * Define the operation that the {@link Change} represents.
   *
   * @param {string|number} op May be one of `add` (0), `delete` (1),
   * or `replace` (2).
   *
   * @throws When the `op` is not recognized.
   */
  set operation (op) {
    if (typeof op === 'string') {
      op = op.toLowerCase()
    }

    switch (op) {
      case 0x00:
      case 'add': {
        this.#operation = 0x00
        break
      }

      case 0x01:
      case 'delete': {
        this.#operation = 0x01
        break
      }

      case 0x02:
      case 'replace': {
        this.#operation = 0x02
        break
      }

      default: {
        const type = Number.isInteger(op)
          ? '0x' + Number(op).toString(16)
          : op
        throw Error(`invalid operation type: ${type}`)
      }
    }
  }

  /**
   * Serialize the instance to a BER.
   *
   * @returns {import('@ldapjs/asn1').BerReader}
   */
  toBer () {
    const writer = new BerWriter()
    writer.startSequence()
    writer.writeEnumeration(this.#operation)

    const attrBer = this.#modification.toBer()
    writer.appendBuffer(attrBer.buffer)
    writer.endSequence()

    return new BerReader(writer.buffer)
  }

  /**
   * See {@link pojo}.
   *
   * @returns {object}
   */
  toJSON () {
    return this.pojo
  }

  /**
   * Applies a {@link Change} to a `target` object.
   *
   * @example
   * const change = new Change({
   *   operation: 'add',
   *   modification: {
   *     type: 'cn',
   *     values: ['new']
   *   }
   * })
   * const target = {
   *   cn: ['old']
   * }
   * Change.apply(change, target)
   * // target = { cn: ['old', 'new'] }
   *
   * @param {Change} change The change to apply.
   * @param {object} target The object to modify. This object will be mutated
   * by the function. It should have properties that match the `modification`
   * of the change.
   * @param {boolean} scalar When `true`, will convert single-item arrays
   * to scalar values. Default: `false`.
   *
   * @returns {object} The mutated `target`.
   *
   * @throws When the `change` is not an instance of {@link Change}.
   */
  static apply (change, target, scalar = false) {
    if (Change.isChange(change) === false) {
      throw Error('change must be an instance of Change')
    }

    const type = change.modification.type
    const values = change.modification.values

    let data = target[type]
    if (data === undefined) {
      data = []
    } else if (Array.isArray(data) === false) {
      data = [data]
    }

    switch (change.operation) {
      case 'add': {
        // Add only new unique entries.
        const newValues = values.filter(v => data.indexOf(v) === -1)
        Array.prototype.push.apply(data, newValues)
        break
      }

      case 'delete': {
        data = data.filter(v => values.indexOf(v) === -1)
        if (data.length === 0) {
          // An empty list indicates the attribute should be removed
          // completely.
          delete target[type]
          return target
        }
        break
      }

      case 'replace': {
        if (values.length === 0) {
          // A new value set that is empty is a delete.
          delete target[type]
          return target
        }
        data = values
        break
      }
    }

    if (scalar === true && data.length === 1) {
      // Replace array value with a scalar value if the modified set is
      // single valued and the operation calls for a scalar.
      target[type] = data[0]
    } else {
      target[type] = data
    }

    return target
  }

  /**
   * Determines if an object is an instance of {@link Change}, or at least
   * resembles the shape of a {@link Change} object. A plain object will match
   * if it has a `modification` property that matches an `Attribute`,
   * an `operation` property that is a string or number, and has a `toBer`
   * method. An object that resembles a {@link Change} does not guarantee
   * compatibility. A `toString` check is much more accurate.
   *
   * @param {Change|object} change
   *
   * @returns {boolean}
   */
  static isChange (change) {
    if (Object.prototype.toString.call(change) === '[object LdapChange]') {
      return true
    }
    if (Object.prototype.toString.call(change) !== '[object Object]') {
      return false
    }
    if (
      Attribute.isAttribute(change.modification) === true &&
      (typeof change.operation === 'string' || typeof change.operation === 'number')
    ) {
      return true
    }
    return false
  }

  /**
   * Compares two {@link Change} instance to determine the priority of the
   * changes relative to each other.
   *
   * @param {Change} change1
   * @param {Change} change2
   *
   * @returns {number} -1 for lower priority, 1 for higher priority, and 0
   * for equal priority in relation to `change1`, e.g. -1 would mean `change`
   * has lower priority than `change2`.
   *
   * @throws When neither parameter resembles a {@link Change} object.
   */
  static compare (change1, change2) {
    if (Change.isChange(change1) === false || Change.isChange(change2) === false) {
|
||||
throw Error('can only compare Change instances')
|
||||
}
|
||||
if (change1.operation < change2.operation) {
|
||||
return -1
|
||||
}
|
||||
if (change1.operation > change2.operation) {
|
||||
return 1
|
||||
}
|
||||
return Attribute.compare(change1.modification, change2.modification)
|
||||
}
|
||||
|
||||
/**
|
||||
* Parse a BER into a new {@link Change} object.
|
||||
*
|
||||
* @param {import('@ldapjs/asn1').BerReader} ber The BER to process. It must
|
||||
* be at an offset that starts a new change sequence. The reader will be
|
||||
* advanced to the end of the change sequence by this method.
|
||||
*
|
||||
* @returns {Change}
|
||||
*
|
||||
* @throws When there is an error processing the BER.
|
||||
*/
|
||||
static fromBer (ber) {
|
||||
ber.readSequence()
|
||||
const operation = ber.readEnumeration()
|
||||
const modification = Attribute.fromBer(ber)
|
||||
return new Change({ operation, modification })
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = Change
|
||||
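The add/delete/replace semantics of `Change.apply` can be exercised without the surrounding class. This is a minimal standalone sketch mirroring that logic on a plain object; the helper name `applyModification` is hypothetical and is not exported by `@ldapjs/change`.

```javascript
'use strict'

// Standalone sketch mirroring the add/delete/replace semantics of
// Change.apply. The helper name `applyModification` is hypothetical;
// it is not part of @ldapjs/change.
function applyModification (operation, type, values, target) {
  // Normalize the current attribute value to an array, as Change.apply does.
  let data = target[type] === undefined ? [] : [].concat(target[type])

  switch (operation) {
    case 'add':
      // Append only values that are not already present.
      data = data.concat(values.filter(v => data.includes(v) === false))
      break
    case 'delete':
      data = data.filter(v => values.includes(v) === false)
      break
    case 'replace':
      data = values
      break
  }

  if (operation !== 'add' && data.length === 0) {
    // An empty result for delete/replace removes the attribute entirely.
    delete target[type]
  } else {
    target[type] = data
  }
  return target
}

console.log(applyModification('add', 'cn', ['new'], { cn: ['old'] }))
// { cn: [ 'old', 'new' ] }
console.log(applyModification('delete', 'cn', ['old'], { cn: ['old', 'keep'] }))
// { cn: [ 'keep' ] }
```

Note how the scalar target normalization (`[].concat(target[type])`) matches the library's handling of targets like `{ cn: 'old' }` in the tests below.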
422  node_modules/@ldapjs/change/index.test.js  (generated, vendored, new file)
@@ -0,0 +1,422 @@
'use strict'

const tap = require('tap')
const { BerReader } = require('@ldapjs/asn1')
const Attribute = require('@ldapjs/attribute')
const Change = require('./index')

tap.test('constructor', t => {
  t.test('throws for bad operation', async t => {
    t.throws(
      () => new Change({ operation: 'bad' }),
      Error('invalid operation type: bad')
    )
  })

  t.test('throws for bad modification', async t => {
    t.throws(
      () => new Change({ modification: 'bad' }),
      Error('modification must be an Attribute')
    )
  })

  t.test('creates an instance', async t => {
    const change = new Change({
      modification: new Attribute()
    })
    t.equal(change.operation, 'add')
    t.type(change.modification, Attribute)
    t.equal(Object.prototype.toString.call(change), '[object LdapChange]')
  })

  t.end()
})

tap.test('modification', t => {
  t.test('gets', async t => {
    const attr = new Attribute()
    const change = new Change({ modification: attr })
    t.equal(change.modification, attr)
  })

  t.test('sets', async t => {
    const attr1 = new Attribute()
    const attr2 = new Attribute()
    const change = new Change({ modification: attr1 })
    t.equal(change.modification, attr1)
    change.modification = attr2
    t.equal(change.modification, attr2)
    t.not(attr1, attr2)
  })

  t.test('throws if value is not attribute-like', async t => {
    const change = new Change({ modification: new Attribute() })
    t.throws(
      () => { change.modification = { foo: 'foo' } },
      Error('modification must be an Attribute')
    )
  })

  t.test('converts attribute-like to Attribute', async t => {
    const change = new Change({
      modification: {
        type: 'dn=foo,dc=example,dc=com',
        values: []
      }
    })
    t.equal(
      Object.prototype.toString.call(change.modification),
      '[object LdapAttribute]'
    )
  })

  t.end()
})

tap.test('.operation', t => {
  const attr = new Attribute()
  const change = new Change({ modification: attr })

  t.test('throws for unrecognized operation', async t => {
    t.throws(
      () => { change.operation = 'bad' },
      Error('invalid operation type: bad')
    )
    t.throws(
      () => { change.operation = 0xff },
      Error('invalid operation type: 0xff')
    )
  })

  t.test('sets and gets', async t => {
    change.operation = 0
    t.equal(change.operation, 'add')
    change.operation = 'add'
    t.equal(change.operation, 'add')

    change.operation = 1
    t.equal(change.operation, 'delete')
    change.operation = 'delete'
    t.equal(change.operation, 'delete')

    change.operation = 2
    t.equal(change.operation, 'replace')
    change.operation = 'replace'
    t.equal(change.operation, 'replace')

    change.operation = 'Replace'
    t.equal(change.operation, 'replace')
  })

  t.end()
})

tap.test('.pojo', t => {
  t.test('returns a plain object', async t => {
    const change = new Change({
      modification: new Attribute()
    })
    const expected = {
      operation: 'add',
      modification: {
        type: '',
        values: []
      }
    }
    t.strictSame(change.pojo, expected)
    t.strictSame(change.toJSON(), expected)
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('serializes to ber', async t => {
    const expected = Buffer.from([
      0x30, 0x15, // sequence, 21 bytes
      0x0a, 0x01, 0x00, // enumerated value 0
      0x30, 0x10, // sequence, 16 bytes
      0x04, 0x02, // string, 2 bytes
      0x63, 0x6e, // 'cn'
      0x31, 0x0a, // sequence of strings, 10 bytes
      0x04, 0x03, // string, 3 bytes
      0x66, 0x6f, 0x6f, // 'foo'
      0x04, 0x03, // string, 3 bytes
      0x62, 0x61, 0x72 // 'bar'
    ])
    const change = new Change({
      modification: {
        type: 'cn',
        values: ['foo', 'bar']
      }
    })
    const ber = change.toBer()
    t.equal(expected.compare(ber.buffer), 0)
  })

  t.end()
})

tap.test('#apply', t => {
  t.test('throws if change is not a Change', async t => {
    t.throws(
      () => Change.apply({}, {}),
      Error('change must be an instance of Change')
    )
  })

  t.test('applies to a target with no type', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new']
    })
    const change = new Change({ modification: attr })
    const target = {}
    Change.apply(change, target)
    t.strictSame(target, {
      cn: ['new']
    })
  })

  t.test('applies to a target with a scalar type', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new']
    })
    const change = new Change({ modification: attr })
    const target = { cn: 'old' }
    Change.apply(change, target)
    t.strictSame(target, {
      cn: ['old', 'new']
    })
  })

  t.test('applies to a target with an array type', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new']
    })
    const change = new Change({ modification: attr })
    const target = { cn: ['old'] }
    Change.apply(change, target)
    t.strictSame(target, {
      cn: ['old', 'new']
    })
  })

  t.test('add operation adds only new values', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new', 'foo']
    })
    const change = new Change({ modification: attr })
    const target = { cn: ['old', 'new'] }
    Change.apply(change, target)
    t.strictSame(target, {
      cn: ['old', 'new', 'foo']
    })
  })

  t.test('delete operation removes property', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new']
    })
    const change = new Change({
      operation: 'delete',
      modification: attr
    })
    const target = { cn: ['new'] }
    Change.apply(change, target)
    t.strictSame(target, {})
  })

  t.test('delete operation removes values', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['remove_me']
    })
    const change = new Change({
      operation: 'delete',
      modification: attr
    })
    const target = { cn: ['remove_me', 'keep_me'] }
    Change.apply(change, target)
    t.strictSame(target, {
      cn: ['keep_me']
    })
  })

  t.test('replace removes empty set', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: []
    })
    const change = new Change({
      operation: 'replace',
      modification: attr
    })
    const target = { cn: ['old'] }
    Change.apply(change, target)
    t.strictSame(target, {})
  })

  t.test('replace removes values', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new_set']
    })
    const change = new Change({
      operation: 'replace',
      modification: attr
    })
    const target = { cn: ['old_set'] }
    Change.apply(change, target)
    t.strictSame(target, {
      cn: ['new_set']
    })
  })

  t.test('scalar option works for new single values', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new']
    })
    const change = new Change({ modification: attr })
    const target = {}
    Change.apply(change, target, true)
    t.strictSame(target, {
      cn: 'new'
    })
  })

  t.test('scalar option is ignored for multiple values', async t => {
    const attr = new Attribute({
      type: 'cn',
      values: ['new']
    })
    const change = new Change({ modification: attr })
    const target = {
      cn: ['old']
    }
    Change.apply(change, target, true)
    t.strictSame(target, {
      cn: ['old', 'new']
    })
  })

  t.end()
})

tap.test('#isChange', t => {
  t.test('true for instance', async t => {
    const change = new Change({ modification: new Attribute() })
    t.equal(Change.isChange(change), true)
  })

  t.test('false for non-object', async t => {
    t.equal(Change.isChange([]), false)
  })

  t.test('true for shape match', async t => {
    const change = {
      operation: 'add',
      modification: {
        type: '',
        values: []
      }
    }
    t.equal(Change.isChange(change), true)

    change.operation = 0
    change.modification = new Attribute()
    t.equal(Change.isChange(change), true)
  })

  t.test('false for shape mis-match', async t => {
    const change = {
      operation: 'add',
      mod: {
        type: '',
        values: []
      }
    }
    t.equal(Change.isChange(change), false)
  })

  t.end()
})

tap.test('#compare', t => {
  t.test('throws if params are not changes', async t => {
    const change = new Change({ modification: new Attribute() })
    const expected = Error('can only compare Change instances')
    t.throws(
      () => Change.compare({}, change),
      expected
    )
    t.throws(
      () => Change.compare(change, {}),
      expected
    )
  })

  t.test('orders add first', async t => {
    const change1 = new Change({ modification: new Attribute() })
    const change2 = new Change({
      operation: 'delete',
      modification: new Attribute()
    })

    t.equal(Change.compare(change1, change2), -1)

    change2.operation = 'replace'
    t.equal(Change.compare(change1, change2), -1)
  })

  t.test('orders delete above add', async t => {
    const change1 = new Change({ modification: new Attribute() })
    const change2 = new Change({
      operation: 'delete',
      modification: new Attribute()
    })

    t.equal(Change.compare(change2, change1), 1)
  })

  t.test('orders by attribute for same operation', async t => {
    const change1 = new Change({ modification: new Attribute() })
    const change2 = new Change({ modification: new Attribute() })
    t.equal(Change.compare(change1, change2), 0)
  })

  t.end()
})

tap.test('#fromBer', t => {
  t.test('creates instance', async t => {
    const bytes = [
      0x30, 0x15, // sequence, 21 bytes
      0x0a, 0x01, 0x00, // enumerated value 0
      0x30, 0x10, // sequence, 16 bytes
      0x04, 0x02, // string, 2 bytes
      0x63, 0x6e, // 'cn'
      0x31, 0x0a, // sequence of strings, 10 bytes
      0x04, 0x03, // string, 3 bytes
      0x66, 0x6f, 0x6f, // 'foo'
      0x04, 0x03, // string, 3 bytes
      0x62, 0x61, 0x72 // 'bar'
    ]
    const reader = new BerReader(Buffer.from(bytes))
    const change = Change.fromBer(reader)
    t.strictSame(change.pojo, {
      operation: 'add',
      modification: {
        type: 'cn',
        values: ['foo', 'bar']
      }
    })
  })

  t.end()
})
43  node_modules/@ldapjs/change/package.json  (generated, vendored, new file)
@@ -0,0 +1,43 @@
{
  "originalAuthor": "Patrick Mooney",
  "originalContributors": [
    "Mark Cavage <mcavage@gmail.com>",
    "Cody Peter Mello <cody.mello@joyent.com>"
  ],
  "name": "@ldapjs/change",
  "homepage": "https://github.com/ldapjs/change",
  "description": "API for handling LDAP change objects",
  "version": "1.0.0",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "git@github.com:ldapjs/change.git"
  },
  "main": "index.js",
  "dependencies": {
    "@ldapjs/asn1": "2.0.0",
    "@ldapjs/attribute": "1.0.0"
  },
  "devDependencies": {
    "@fastify/pre-commit": "^2.0.2",
    "eslint": "^8.34.0",
    "eslint-config-standard": "^17.0.0",
    "eslint-plugin-import": "^2.27.5",
    "eslint-plugin-n": "^15.6.1",
    "eslint-plugin-node": "^11.1.0",
    "eslint-plugin-promise": "^6.1.1",
    "tap": "^16.3.4"
  },
  "scripts": {
    "lint": "eslint .",
    "lint:ci": "eslint .",
    "test": "tap --no-coverage-report",
    "test:cov": "tap",
    "test:cov:html": "tap --coverage-report=html",
    "test:watch": "tap -w --no-coverage-report"
  },
  "precommit": [
    "lint",
    "test"
  ]
}
9  node_modules/@ldapjs/controls/.eslintrc  (generated, vendored, new file)
@@ -0,0 +1,9 @@
{
  "parserOptions": {
    "ecmaVersion": "latest"
  },

  "extends": [
    "standard"
  ]
}
10  node_modules/@ldapjs/controls/.github/workflows/main.yml  (generated, vendored, new file)
@@ -0,0 +1,10 @@
name: "CI"
on:
  pull_request:
  push:
    branches:
      - master

jobs:
  call-core-ci:
    uses: ldapjs/.github/.github/workflows/node-ci.yml@main
5  node_modules/@ldapjs/controls/.taprc.yml  (generated, vendored, new file)
@@ -0,0 +1,5 @@
files:
  - 'index.test.js'
  - 'lib/**/*.test.js'

coverage-map: coverage-map.js
22  node_modules/@ldapjs/controls/LICENSE  (generated, vendored, new file)
@@ -0,0 +1,22 @@
The MIT License (MIT)

Copyright (c) 2011 Mark Cavage, All rights reserved.
Copyright (c) 2022 The LDAPJS Collaborators.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE
12  node_modules/@ldapjs/controls/Readme.md  (generated, vendored, new file)
@@ -0,0 +1,12 @@
# @ldapjs/controls

This package provides implementations of [LDAP controls][controls]. The
primary purpose of this library is to facilitate client and server
implementations in the [`ldapjs`](https://npm.im/ldapjs) package.

## Docs

At this time, one must reference the code to learn about the available
controls and their methods.

[controls]: https://datatracker.ietf.org/doc/html/rfc4511#section-4.1.11
13  node_modules/@ldapjs/controls/coverage-map.js  (generated, vendored, new file)
@@ -0,0 +1,13 @@
module.exports = testFile => {
  if (testFile.startsWith('virtual-list-view-request-control') === true) {
    // Do not count towards coverage as it is disabled.
    return false
  }

  if (testFile.startsWith('virtual-list-view-response-control') === true) {
    // Do not count towards coverage as it is disabled.
    return false
  }

  return testFile.replace(/\.test\.js$/, '.js')
}
103  node_modules/@ldapjs/controls/index.js  (generated, vendored, new file)
@@ -0,0 +1,103 @@
'use strict'

const { Ber } = require('@ldapjs/asn1')

const Control = require('./lib/control')
const EntryChangeNotificationControl = require('./lib/controls/entry-change-notification-control')
const PagedResultsControl = require('./lib/controls/paged-results-control')
const PasswordPolicyControl = require('./lib/controls/password-policy-control')
const PersistentSearchControl = require('./lib/controls/persistent-search-control')
const ServerSideSortingRequestControl = require('./lib/controls/server-side-sorting-request-control')
const ServerSideSortingResponseControl = require('./lib/controls/server-side-sorting-response-control')
const VirtualListViewRequestControl = require('./lib/controls/virtual-list-view-request-control')
const VirtualListViewResponseControl = require('./lib/controls/virtual-list-view-response-control')

module.exports = {

  getControl: function getControl (ber) {
    if (!ber) throw TypeError('ber must be provided')

    if (ber.readSequence() === null) { return null }

    let type
    const opts = {
      criticality: false,
      value: null
    }

    /* istanbul ignore else */
    if (ber.length) {
      const end = ber.offset + ber.length

      type = ber.readString()
      /* istanbul ignore else */
      if (ber.offset < end) {
        /* istanbul ignore else */
        if (ber.peek() === Ber.Boolean) { opts.criticality = ber.readBoolean() }
      }

      if (ber.offset < end) { opts.value = ber.readString(Ber.OctetString, true) }
    }

    let control
    switch (type) {
      case EntryChangeNotificationControl.OID: {
        control = new EntryChangeNotificationControl(opts)
        break
      }

      case PagedResultsControl.OID: {
        control = new PagedResultsControl(opts)
        break
      }

      case PasswordPolicyControl.OID: {
        control = new PasswordPolicyControl(opts)
        break
      }

      case PersistentSearchControl.OID: {
        control = new PersistentSearchControl(opts)
        break
      }

      case ServerSideSortingRequestControl.OID: {
        control = new ServerSideSortingRequestControl(opts)
        break
      }

      case ServerSideSortingResponseControl.OID: {
        control = new ServerSideSortingResponseControl(opts)
        break
      }

      case VirtualListViewRequestControl.OID: {
        control = new VirtualListViewRequestControl(opts)
        break
      }

      case VirtualListViewResponseControl.OID: {
        control = new VirtualListViewResponseControl(opts)
        break
      }

      default: {
        opts.type = type
        control = new Control(opts)
        break
      }
    }

    return control
  },

  Control,
  EntryChangeNotificationControl,
  PagedResultsControl,
  PasswordPolicyControl,
  PersistentSearchControl,
  ServerSideSortingRequestControl,
  ServerSideSortingResponseControl,
  VirtualListViewRequestControl,
  VirtualListViewResponseControl
}
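`getControl` dispatches on the control's dotted-decimal OID string and falls back to the generic `Control` for unknown types, so unrecognized controls still round-trip. Here is a minimal standalone sketch of that dispatch pattern; the classes, registry, and `controlForType` helper are illustrative stand-ins, not the `@ldapjs/controls` implementations.

```javascript
'use strict'

// Illustrative sketch of the OID-dispatch pattern used by getControl:
// look the parsed OID up in a registry of concrete classes and fall back
// to a generic control. These classes are hypothetical stand-ins.
class GenericControl {
  constructor (opts = {}) {
    this.type = opts.type || ''
    this.criticality = opts.criticality || false
    this.value = opts.value || null
  }
}

class PagedResults extends GenericControl {}
// Real OID for the LDAP paged-results control (RFC 2696).
PagedResults.OID = '1.2.840.113556.1.4.319'

const registry = new Map([[PagedResults.OID, PagedResults]])

function controlForType (type, opts = {}) {
  const Ctor = registry.get(type)
  // Unknown OIDs still round-trip as a generic control carrying the type.
  if (Ctor === undefined) return new GenericControl({ ...opts, type })
  return new Ctor({ ...opts, type })
}

console.log(controlForType(PagedResults.OID) instanceof PagedResults) // true
console.log(controlForType('9.9.9').type) // 9.9.9
```

The fallback-to-generic default case is the design choice that lets clients pass through controls this library does not model.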
161  node_modules/@ldapjs/controls/index.test.js  (generated, vendored, new file)
@@ -0,0 +1,161 @@
'use strict'

const tap = require('tap')
const { BerReader, BerWriter } = require('@ldapjs/asn1')
const controls = require('.')

tap.test('#getControl', t => {
  t.test('requires a BER to parse', async t => {
    try {
      controls.getControl()
      t.fail('should throw exception')
    } catch (error) {
      t.match(error, /ber must be provided/)
    }
  })

  t.test('returns null for empty BER', async t => {
    const result = controls.getControl(new BerReader(Buffer.alloc(0)))
    t.equal(result, null)
  })

  t.test('parses a BER (control)', async t => {
    const ber = new BerWriter()
    ber.startSequence()
    ber.writeString('2.16.840.1.113730.3.4.2')
    ber.writeBoolean(true)
    ber.writeString('foo')
    ber.endSequence()

    const control = controls.getControl(new BerReader(ber.buffer))

    t.ok(control)
    t.equal(control.type, '2.16.840.1.113730.3.4.2')
    t.ok(control.criticality)
    t.equal(control.value.toString('utf8'), 'foo')
    t.end()
  })

  t.test('parses BER with no value', function (t) {
    const ber = new BerWriter()
    ber.startSequence()
    ber.writeString('2.16.840.1.113730.3.4.2')
    ber.endSequence()

    const control = controls.getControl(new BerReader(ber.buffer))

    t.ok(control)
    t.equal(control.type, '2.16.840.1.113730.3.4.2')
    t.equal(control.criticality, false)
    t.notOk(control.value, null)
    t.end()
  })

  t.test('returns a EntryChangeNotificationControl', async t => {
    const ecnc = new controls.EntryChangeNotificationControl({
      type: controls.EntryChangeNotificationControl.OID,
      criticality: true,
      value: {
        changeType: 8,
        previousDN: 'cn=foobarbazcar',
        changeNumber: 123456789
      }
    })

    const ber = new BerWriter()
    ecnc.toBer(ber)

    const c = controls.getControl(new BerReader(ber.buffer))
    t.ok(c)
    t.equal(c.type, controls.EntryChangeNotificationControl.OID)
    t.ok(c.criticality)
    t.equal(c.value.changeType, 8)
    t.equal(c.value.previousDN, 'cn=foobarbazcar')
    t.equal(c.value.changeNumber, 123456789)
  })

  t.test('returns a PagedResultsControl', async t => {
    const prc = new controls.PagedResultsControl({
      type: controls.PagedResultsControl.OID,
      criticality: true,
      value: {
        size: 20,
        cookie: Buffer.alloc(0)
      }
    })

    const ber = new BerWriter()
    prc.toBer(ber)

    const c = controls.getControl(new BerReader(ber.buffer))
    t.ok(c)
    t.equal(c.type, controls.PagedResultsControl.OID)
    t.ok(c.criticality)
    t.equal(c.value.size, 20)
    t.equal(Buffer.compare(c.value.cookie, Buffer.alloc(0)), 0)
  })

  t.test('returns a PasswordPolicyControl', async t => {
    const ppc = new controls.PasswordPolicyControl({
      type: controls.PasswordPolicyControl.OID,
      criticality: true,
      value: {
        error: 1,
        timeBeforeExpiration: 2
      }
    })

    const ber = new BerWriter()
    ppc.toBer(ber)

    const c = controls.getControl(new BerReader(ber.buffer))
    t.ok(c)
    t.equal(c.type, controls.PasswordPolicyControl.OID)
    t.ok(c.criticality)
    t.equal(c.value.error, 1)
    t.equal(c.value.timeBeforeExpiration, 2)
  })

  t.test('returns a PersistentSearchControl', async t => {
    const buf = Buffer.from([
      0x30, 0x26, 0x04, 0x17, 0x32, 0x2e, 0x31, 0x36, 0x2e, 0x38, 0x34, 0x30,
      0x2e, 0x31, 0x2e, 0x31, 0x31, 0x33, 0x37, 0x33, 0x30, 0x2e, 0x33, 0x2e,
      0x34, 0x2e, 0x33, 0x04, 0x0b, 0x30, 0x09, 0x02, 0x01, 0x0f, 0x01, 0x01,
      0xff, 0x01, 0x01, 0xff])

    const ber = new BerReader(buf)
    const psc = controls.getControl(ber)
    t.ok(psc)
    t.equal(psc.type, controls.PersistentSearchControl.OID)
    t.equal(psc.criticality, false)
    t.equal(psc.value.changeTypes, 15)
    t.equal(psc.value.changesOnly, true)
    t.equal(psc.value.returnECs, true)
  })

  t.test('returns a ServerSideSortingRequestControl', async t => {
    const sssc = new controls.ServerSideSortingRequestControl()
    const ber = new BerWriter()
    sssc.toBer(ber)

    const c = controls.getControl(new BerReader(ber.buffer))
    t.ok(c)
    t.equal(c.type, controls.ServerSideSortingRequestControl.OID)
    t.equal(c.value.length, 0)
  })

  t.test('returns a ServerSideSortingResponseControl', async t => {
    const sssc = new controls.ServerSideSortingResponseControl()
    const ber = new BerWriter()
    sssc.toBer(ber)

    const c = controls.getControl(new BerReader(ber.buffer))
    t.ok(c)
    t.equal(c.type, controls.ServerSideSortingResponseControl.OID)
    t.equal(c.criticality, false)
    t.notOk(c.value.result)
    t.notOk(c.value.failedAttribute)
  })

  t.end()
})
88  node_modules/@ldapjs/controls/lib/control.js  (generated, vendored, new file)
@@ -0,0 +1,88 @@
'use strict'

const { BerWriter } = require('@ldapjs/asn1')

/**
 * Baseline LDAP control object. Implements
 * https://tools.ietf.org/html/rfc4511#section-4.1.11
 *
 * @class
 */
class Control {
  /**
   * @typedef {object} ControlParams
   * @property {string} [type=''] The dotted decimal control type value.
   * @property {boolean} [criticality=false] Criticality value for the control.
   * @property {string|Buffer} [value] The value for the control. If this is
   * a `string` then it will be written as-is. If it is an instance of `Buffer`
   * then it will be written by `value.toString()` when generating a BER
   * instance.
   */

  /**
   * Create a new baseline LDAP control.
   *
   * @param {ControlParams} [options]
   */
  constructor (options = {}) {
    const opts = Object.assign({ type: '', criticality: false, value: null }, options)
    this.type = opts.type
    this.criticality = opts.criticality
    this.value = opts.value
  }

  get [Symbol.toStringTag] () {
    return 'LdapControl'
  }

  /**
   * Serializes the control into a plain JavaScript object that can be passed
   * to the constructor as an options object. If an instance has a `_pojo(obj)`
   * method then the built object will be sent to that method and the resulting
   * mutated object returned.
   *
   * @returns {object} A plain JavaScript object that represents an LDAP control.
   */
  get pojo () {
    const obj = {
      type: this.type,
      value: this.value,
      criticality: this.criticality
    }

    if (typeof this._pojo === 'function') {
      this._pojo(obj)
    }

    return obj
  }

  /**
   * Converts the instance into a [BER](http://luca.ntop.org/Teaching/Appunti/asn1.html)
   * representation.
   *
   * @param {BerWriter} [ber] An empty `BerWriter` instance to populate.
   *
   * @returns {object} A BER object.
   */
  toBer (ber = new BerWriter()) {
    ber.startSequence()
    ber.writeString(this.type || '')
    ber.writeBoolean(this.criticality)

    /* istanbul ignore else */
    if (typeof (this._toBer) === 'function') {
      this._toBer(ber)
    } else if (this.value !== undefined) {
      if (typeof this.value === 'string') {
        ber.writeString(this.value)
      } else if (Buffer.isBuffer(this.value)) {
        ber.writeString(this.value.toString())
      }
    }

    ber.endSequence()
    return ber
  }
}
module.exports = Control
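`Control` brands its instances via a `Symbol.toStringTag` getter, so `Object.prototype.toString` reports `[object LdapControl]`; the same pattern is how `Change.isChange` earlier in this diff recognizes `[object LdapChange]` without `instanceof`. A small self-contained illustration of the pattern (the class name `BrandedControl` is hypothetical):

```javascript
'use strict'

// Self-contained illustration of the Symbol.toStringTag branding used by
// Control: Object.prototype.toString reports the custom tag, so other code
// can recognize instances without sharing the class via instanceof.
class BrandedControl {
  get [Symbol.toStringTag] () {
    return 'LdapControl'
  }
}

const tag = Object.prototype.toString.call(new BrandedControl())
console.log(tag) // [object LdapControl]
console.log(tag === '[object LdapControl]') // true
```

This makes type checks robust across package versions and realms, where `instanceof` would fail because two copies of the class are distinct constructors.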
159
node_modules/@ldapjs/controls/lib/control.test.js
generated
vendored
Normal file
@@ -0,0 +1,159 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const Control = require('./control')

tap.test('constructor', t => {
  t.test('new no args', function (t) {
    t.ok(new Control())
    t.equal(Object.prototype.toString.call(new Control()), '[object LdapControl]')
    t.end()
  })

  t.test('new with args', function (t) {
    const c = new Control({
      type: '2.16.840.1.113730.3.4.2',
      criticality: true
    })
    t.ok(c)
    t.equal(c.type, '2.16.840.1.113730.3.4.2')
    t.ok(c.criticality)
    t.end()
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('passes through _pojo', async t => {
    class Foo extends Control {
      _pojo (obj) {
        obj.foo = 'foo'
      }
    }
    const control = new Foo()
    t.strictSame(control.pojo, {
      type: '',
      value: null,
      criticality: false,
      foo: 'foo'
    })
  })

  t.test('returns basic object', async t => {
    const control = new Control({ type: '1.2.3', criticality: false, value: 'foo' })
    t.strictSame(control.pojo, {
      type: '1.2.3',
      value: 'foo',
      criticality: false
    })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString('')
    target.writeBoolean(false)
    target.endSequence()

    const control = new Control()
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString('2.16.840.1.113730.3.4.2')
    target.writeBoolean(true)
    target.writeString('foo')
    target.endSequence()

    const control = new Control({
      type: '2.16.840.1.113730.3.4.2',
      criticality: true,
      value: Buffer.from('foo', 'utf8')
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts instance to BER (side effect manner)', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString('2.16.840.1.113730.3.4.2')
    target.writeBoolean(true)
    target.writeString('foo')
    target.endSequence()

    const control = new Control({
      type: '2.16.840.1.113730.3.4.2',
      criticality: true,
      value: Buffer.from('foo', 'utf8')
    })
    const ber = new BerWriter()
    control.toBer(ber)

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts instance to BER with string value', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString('2.16.840.1.113730.3.4.2')
    target.writeBoolean(true)
    target.writeString('foo')
    target.endSequence()

    const control = new Control({
      type: '2.16.840.1.113730.3.4.2',
      criticality: true,
      value: 'foo'
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('ignores unrecognized value', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString('2.16.840.1.113730.3.4.2')
    target.writeBoolean(true)
    target.writeBoolean(false)
    target.endSequence()

    const control = new Control({
      type: '2.16.840.1.113730.3.4.2',
      criticality: true,
      value: false
    })
    const ber = control.toBer()

    t.not(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('passes through _toBer', async t => {
    t.plan(2)
    const target = new BerWriter()
    target.startSequence()
    target.writeString('')
    target.writeBoolean(false)
    target.endSequence()

    const control = new Control()
    control._toBer = (ber) => t.ok(ber)
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.end()
})
107
node_modules/@ldapjs/controls/lib/controls/entry-change-notification-control.js
generated
vendored
Normal file
@@ -0,0 +1,107 @@
'use strict'

const { BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')

/**
 * @typedef {object} EntryChangeNotificationControlValue
 * @property {number} changeType One of 1 (add), 2 (delete), 4 (modify),
 * or 8 (modifyDN).
 * @property {string} previousDN Only set when operation is a modifyDN op.
 * @property {number} changeNumber
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/draft-ietf-ldapext-psearch-03.txt#section-5
 *
 * @extends Control
 */
class EntryChangeNotificationControl extends Control {
  static OID = '2.16.840.1.113730.3.4.7'

  /**
   * @typedef {ControlParams} EntryChangeNotificationParams
   * @property {EntryChangeNotificationControlValue | Buffer} [value]
   */

  /**
   * Creates a new entry change notification control.
   *
   * @param {EntryChangeNotificationParams} [options]
   */
  constructor (options = {}) {
    options.type = EntryChangeNotificationControl.OID
    super(options)

    this._value = {
      changeType: 4
    }

    if (hasOwn(options, 'value') === false) {
      return
    }

    if (Buffer.isBuffer(options.value)) {
      this.#parse(options.value)
    } else if (isObject(options.value)) {
      this._value = options.value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }
  }

  get value () {
    return this._value
  }

  set value (obj) {
    this._value = Object.assign({}, this._value, obj)
  }

  /**
   * Given a BER buffer that represents a
   * {@link EntryChangeNotificationControlValue}, read that buffer into the
   * current instance.
   */
  #parse (buffer) {
    const ber = new BerReader(buffer)
    /* istanbul ignore else */
    if (ber.readSequence()) {
      this._value = {
        changeType: ber.readInt()
      }

      /* istanbul ignore else */
      if (this._value.changeType === 8) {
        // If the operation was moddn, then parse the optional previousDN attr.
        this._value.previousDN = ber.readString()
      }

      this._value.changeNumber = ber.readInt()
    }
  }

  _toBer (ber) {
    const writer = new BerWriter()
    writer.startSequence()
    writer.writeInt(this._value.changeType)
    if (this._value.previousDN) { writer.writeString(this._value.previousDN) }

    if (Object.prototype.hasOwnProperty.call(this._value, 'changeNumber')) {
      writer.writeInt(parseInt(this._value.changeNumber, 10))
    }
    writer.endSequence()

    ber.writeBuffer(writer.buffer, 0x04)
    return ber
  }

  _updatePlainObject (obj) {
    obj.controlValue = this.value
    return obj
  }
}
module.exports = EntryChangeNotificationControl
133
node_modules/@ldapjs/controls/lib/controls/entry-change-notification-control.test.js
generated
vendored
Normal file
@@ -0,0 +1,133 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const ECNC = require('./entry-change-notification-control')
const Control = require('../control')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    const control = new ECNC()
    t.ok(control)
    t.type(control, ECNC)
    t.type(control, Control)
    t.equal(control.type, ECNC.OID)
    t.same(control.value, {
      changeType: 4
    })
  })

  t.test('new with args', async t => {
    const control = new ECNC({
      type: '2.16.840.1.113730.3.4.7',
      criticality: true,
      value: {
        changeType: 1
      }
    })
    t.ok(control)
    t.equal(control.type, '2.16.840.1.113730.3.4.7')
    t.ok(control.criticality)
    t.same(control.value, {
      changeType: 1
    })
  })

  t.test('with value buffer', async t => {
    const value = new BerWriter()
    value.startSequence()
    value.writeInt(8)
    value.writeString('dn=foo')
    value.writeInt(42)
    value.endSequence()

    const control = new ECNC({ value: value.buffer })
    t.same(control.value, {
      changeType: 8,
      previousDN: 'dn=foo',
      changeNumber: 42
    })
  })

  t.test('throws for bad value', async t => {
    t.throws(() => new ECNC({ value: 42 }))
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('adds control value', async t => {
    const control = new ECNC({
      value: {
        changeType: 8,
        previousDN: 'dn=foo',
        changeNumber: 42
      }
    })
    t.strictSame(control.pojo, {
      type: ECNC.OID,
      criticality: false,
      value: {
        changeType: 8,
        previousDN: 'dn=foo',
        changeNumber: 42
      }
    })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(ECNC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence()
    value.writeInt(4)
    // value.writeInt(0)
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new ECNC()
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts instance with full values to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(ECNC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence()
    value.writeInt(8)
    value.writeString('dn=foo')
    value.writeInt(42)
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new ECNC({
      value: {
        changeType: 8,
        previousDN: 'dn=foo',
        changeNumber: 42
      }
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.end()
})
103
node_modules/@ldapjs/controls/lib/controls/paged-results-control.js
generated
vendored
Normal file
@@ -0,0 +1,103 @@
'use strict'

const { Ber, BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')

/**
 * @typedef {object} PagedResultsControlValue
 * @property {number} size The requested page size from a client, or the result
 * set size estimate from the server.
 * @property {Buffer} cookie Identifier for the result set.
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/rfc2696#section-2
 *
 * @extends Control
 */
class PagedResultsControl extends Control {
  static OID = '1.2.840.113556.1.4.319'

  /**
   * @typedef {ControlParams} PagedResultsParams
   * @property {PagedResultsControlValue | Buffer} [value]
   */

  /**
   * Creates a new paged results control.
   *
   * @param {PagedResultsParams} [options]
   */
  constructor (options = {}) {
    options.type = PagedResultsControl.OID
    super(options)

    this._value = {
      size: 0,
      cookie: Buffer.alloc(0)
    }

    if (hasOwn(options, 'value') === false) {
      return
    }

    if (Buffer.isBuffer(options.value)) {
      this.#parse(options.value)
    } else if (isObject(options.value)) {
      this.value = options.value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }
  }

  get value () {
    return this._value
  }

  set value (obj) {
    this._value = Object.assign({}, this._value, obj)
    if (typeof this._value.cookie === 'string') {
      this._value.cookie = Buffer.from(this._value.cookie)
    }
  }

  #parse (buffer) {
    const ber = new BerReader(buffer)

    /* istanbul ignore else */
    if (ber.readSequence()) {
      this._value = {}
      this._value.size = ber.readInt()
      this._value.cookie = ber.readString(Ber.OctetString, true)
      // readString returns '' instead of a zero-length buffer
      if (!this._value.cookie) {
        this._value.cookie = Buffer.alloc(0)
      }
    }
  }

  _toBer (ber) {
    const writer = new BerWriter()
    writer.startSequence()
    writer.writeInt(this._value.size)
    if (this._value.cookie && this._value.cookie.length > 0) {
      writer.writeBuffer(this._value.cookie, Ber.OctetString)
    } else {
      // writeBuffer rejects zero-length buffers
      writer.writeString('')
    }
    writer.endSequence()

    ber.writeBuffer(writer.buffer, Ber.OctetString)
    return ber
  }

  _updatePlainObject (obj) {
    obj.controlValue = this.value
    return obj
  }
}
module.exports = PagedResultsControl
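The cookie field above drives the RFC 2696 paging handshake: the client echoes each returned cookie back to the server until it comes back empty. A dependency-free sketch of that loop (`searchPage`, `searchAll`, and `DATA` are illustrative stand-ins, not part of this module):

```javascript
// Stand-in for a paged LDAP search: returns one page plus an opaque cookie
// that is empty once the result set is exhausted (per RFC 2696).
const DATA = ['a', 'b', 'c', 'd', 'e']

function searchPage (size, cookie) {
  const offset = cookie.length === 0 ? 0 : cookie.readUInt32BE(0)
  const entries = DATA.slice(offset, offset + size)
  const next = offset + size
  let nextCookie = Buffer.alloc(0) // empty cookie signals the final page
  if (next < DATA.length) {
    nextCookie = Buffer.alloc(4)
    nextCookie.writeUInt32BE(next, 0)
  }
  return { entries, cookie: nextCookie }
}

// The client treats the cookie as opaque and simply echoes it back.
function searchAll (size) {
  const results = []
  let cookie = Buffer.alloc(0)
  do {
    const page = searchPage(size, cookie)
    results.push(...page.entries)
    cookie = page.cookie
  } while (cookie.length > 0)
  return results
}

console.log(searchAll(2)) // → [ 'a', 'b', 'c', 'd', 'e' ]
```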
139
node_modules/@ldapjs/controls/lib/controls/paged-results-control.test.js
generated
vendored
Normal file
@@ -0,0 +1,139 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const PSC = require('./paged-results-control')
const Control = require('../control')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    const control = new PSC()
    t.ok(control)
    t.type(control, PSC)
    t.type(control, Control)
    t.equal(control.type, PSC.OID)
    t.equal(control.value.size, 0)
    t.equal(Buffer.alloc(0).compare(control.value.cookie), 0)
  })

  t.test('new with args', async t => {
    const control = new PSC({
      type: '1.2.840.113556.1.4.319',
      criticality: true,
      value: {
        size: 1,
        cookie: 'foo'
      }
    })
    t.ok(control)
    t.equal(control.type, '1.2.840.113556.1.4.319')
    t.ok(control.criticality)
    t.equal(control.value.size, 1)
    t.equal(Buffer.from('foo').compare(control.value.cookie), 0)
  })

  t.test('with value buffer', async t => {
    const value = new BerWriter()
    value.startSequence()
    value.writeInt(1)
    value.writeBuffer(Buffer.from('foo'), 0x04)
    value.endSequence()

    const control = new PSC({ value: value.buffer })
    t.equal(control.value.size, 1)
    t.equal(Buffer.from('foo').compare(control.value.cookie), 0)
  })

  t.test('with value buffer (empty cookie)', async t => {
    const value = new BerWriter()
    value.startSequence()
    value.writeInt(1)
    value.endSequence()

    const control = new PSC({ value: value.buffer })
    t.equal(control.value.size, 1)
    t.equal(Buffer.alloc(0).compare(control.value.cookie), 0)
  })

  t.test('throws for bad value', async t => {
    t.throws(() => new PSC({ value: 42 }))
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('adds control value', async t => {
    const control = new PSC({
      value: {
        size: 1,
        cookie: 'foo'
      }
    })
    t.same(control.pojo, {
      type: PSC.OID,
      criticality: false,
      value: {
        size: 1,
        cookie: Buffer.from('foo')
      }
    })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(PSC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence()
    value.writeInt(1)
    value.writeString('foo')
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new PSC({
      value: {
        size: 1,
        cookie: 'foo'
      }
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts empty instance to BER (empty cookie)', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(PSC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence()
    value.writeInt(1)
    value.writeString('')
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new PSC({
      value: {
        size: 1
      }
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.end()
})
118
node_modules/@ldapjs/controls/lib/controls/password-policy-control.js
generated
vendored
Normal file
@@ -0,0 +1,118 @@
'use strict'

const { BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')

/**
 * @typedef {object} PasswordPolicyResponseControlValue
 * @property {number} error One of 0 (passwordExpired), 1 (accountLocked),
 * 2 (changeAfterReset), 3 (passwordModNotAllowed), 4 (mustSupplyOldPassword),
 * 5 (insufficientPasswordQuality), 6 (passwordTooShort), 7 (passwordTooYoung),
 * 8 (passwordInHistory)
 * @property {number} timeBeforeExpiration
 * @property {number} graceAuthNsRemaining
 */

/**
 * Implements both request and response controls:
 * https://datatracker.ietf.org/doc/html/draft-behera-ldap-password-policy-11#name-controls-used-for-password-
 *
 * @extends Control
 */
class PasswordPolicyControl extends Control {
  static OID = '1.3.6.1.4.1.42.2.27.8.5.1'

  /**
   * @typedef {ControlParams} PasswordPolicyResponseParams
   * @property {PasswordPolicyResponseControlValue | Buffer} [value]
   */

  /**
   * Creates a new password policy control.
   *
   * @param {PasswordPolicyResponseParams} [options]
   */
  constructor (options = {}) {
    options.type = PasswordPolicyControl.OID
    super(options)

    this._value = {}

    if (hasOwn(options, 'value') === false) {
      return
    }

    if (Buffer.isBuffer(options.value)) {
      this.#parse(options.value)
    } else if (isObject(options.value)) {
      if (hasOwn(options.value, 'timeBeforeExpiration') === true && hasOwn(options.value, 'graceAuthNsRemaining') === true) {
        throw new Error('options.value must contain either timeBeforeExpiration or graceAuthNsRemaining, not both')
      }
      this._value = options.value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }
  }

  get value () {
    return this._value
  }

  set value (obj) {
    this._value = Object.assign({}, this._value, obj)
  }

  /**
   * Given a BER buffer that represents a
   * {@link PasswordPolicyResponseControlValue}, read that buffer into the
   * current instance.
   */
  #parse (buffer) {
    const ber = new BerReader(buffer)
    if (ber.readSequence()) {
      this._value = {}
      if (ber.peek() === 0xa0) {
        ber.readSequence(0xa0)
        if (ber.peek() === 0x80) {
          this._value.timeBeforeExpiration = ber._readTag(0x80)
        } else if (ber.peek() === 0x81) {
          this._value.graceAuthNsRemaining = ber._readTag(0x81)
        }
      }
      if (ber.peek() === 0x81) {
        this._value.error = ber._readTag(0x81)
      }
    }
  }

  _toBer (ber) {
    if (!this._value || Object.keys(this._value).length === 0) { return }

    const writer = new BerWriter()
    writer.startSequence()
    if (hasOwn(this._value, 'timeBeforeExpiration')) {
      writer.startSequence(0xa0)
      writer.writeInt(this._value.timeBeforeExpiration, 0x80)
      writer.endSequence()
    } else if (hasOwn(this._value, 'graceAuthNsRemaining')) {
      writer.startSequence(0xa0)
      writer.writeInt(this._value.graceAuthNsRemaining, 0x81)
      writer.endSequence()
    }
    if (hasOwn(this._value, 'error')) {
      writer.writeInt(this._value.error, 0x81)
    }
    writer.endSequence()

    ber.writeBuffer(writer.buffer, 0x04)
    return ber
  }

  _updatePlainObject (obj) {
    obj.controlValue = this.value
    return obj
  }
}
module.exports = PasswordPolicyControl
113
node_modules/@ldapjs/controls/lib/controls/password-policy-control.test.js
generated
vendored
Normal file
@@ -0,0 +1,113 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const PPC = require('./password-policy-control')
const Control = require('../control')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    const control = new PPC()
    t.ok(control)
    t.type(control, PPC)
    t.type(control, Control)
    t.equal(control.type, PPC.OID)
    t.same(control.value, {})
  })

  t.test('new with args', async t => {
    const control = new PPC({
      type: '1.3.6.1.4.1.42.2.27.8.5.1',
      criticality: true,
      value: {
        error: 1,
        timeBeforeExpiration: 2
      }
    })
    t.ok(control)
    t.equal(control.type, '1.3.6.1.4.1.42.2.27.8.5.1')
    t.ok(control.criticality)
    t.same(control.value, {
      error: 1,
      timeBeforeExpiration: 2
    })
  })

  t.test('with value buffer', async t => {
    const value = new BerWriter()
    value.startSequence()
    value.writeInt(5, 0x81)
    value.endSequence()

    const control = new PPC({ value: value.buffer })
    t.same(control.value, {
      error: 5
    })
  })

  t.test('throws for bad value', async t => {
    t.throws(() => new PPC({ value: 42 }))
    t.throws(() => new PPC({ value: { timeBeforeExpiration: 1, graceAuthNsRemaining: 2 } }))
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('adds control value', async t => {
    const control = new PPC()
    t.same(control.pojo, {
      type: PPC.OID,
      criticality: false,
      value: {}
    })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(PPC.OID)
    target.writeBoolean(false) // Control.criticality
    target.endSequence()

    const control = new PPC()
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts full instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(PPC.OID)
    target.writeBoolean(true) // Control.criticality

    const value = new BerWriter()
    value.startSequence()
    value.startSequence(0xa0)
    value.writeInt(2, 0x81)
    value.endSequence()
    value.writeInt(1, 0x81)
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new PPC({
      criticality: true,
      value: {
        error: 1,
        graceAuthNsRemaining: 2
      }
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.end()
})
100
node_modules/@ldapjs/controls/lib/controls/persistent-search-control.js
generated
vendored
Normal file
@@ -0,0 +1,100 @@
'use strict'

const { BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')

/**
 * @typedef {object} PersistentSearchControlValue
 * @property {number} changeTypes A bitwise OR of 1 (add), 2 (delete),
 * 4 (modify), and 8 (modifyDN).
 * @property {boolean} changesOnly
 * @property {boolean} returnECs
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/draft-ietf-ldapext-psearch-03.txt
 *
 * @extends Control
 */
class PersistentSearchControl extends Control {
  static OID = '2.16.840.1.113730.3.4.3'

  /**
   * @typedef {ControlParams} PersistentSearchParams
   * @property {PersistentSearchControlValue | Buffer} [value]
   */

  /**
   * Creates a new persistent search control.
   *
   * @param {PersistentSearchParams} [options]
   */
  constructor (options = {}) {
    options.type = PersistentSearchControl.OID
    super(options)

    this._value = {
      changeTypes: 15,
      changesOnly: true,
      returnECs: true
    }

    if (hasOwn(options, 'value') === false) {
      return
    }

    if (Buffer.isBuffer(options.value)) {
      this.#parse(options.value)
    } else if (isObject(options.value)) {
      this._value = options.value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }
  }

  get value () {
    return this._value
  }

  set value (obj) {
    this._value = Object.assign({}, this._value, obj)
  }

  /**
   * Given a BER buffer that represents a {@link PersistentSearchControlValue},
   * read that buffer into the current instance.
   */
  #parse (buffer) {
    const ber = new BerReader(buffer)

    /* istanbul ignore else */
    if (ber.readSequence()) {
      this._value = {
        changeTypes: ber.readInt(),
        changesOnly: ber.readBoolean(),
        returnECs: ber.readBoolean()
      }
    }
  }

  _toBer (ber) {
    const writer = new BerWriter()
    writer.startSequence()
    writer.writeInt(this._value.changeTypes)
    writer.writeBoolean(this._value.changesOnly)
    writer.writeBoolean(this._value.returnECs)
    writer.endSequence()

    ber.writeBuffer(writer.buffer, 0x04)
    return ber
  }

  _updatePlainObject (obj) {
    obj.controlValue = this.value
    return obj
  }
}
module.exports = PersistentSearchControl
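The `changeTypes` field above is a bitmask (the default of 15 is all four operation bits OR'd together). A small dependency-free sketch of how a consumer might decode it (`CHANGE_TYPES` and `decodeChangeTypes` are illustrative names, not part of the module):

```javascript
// Bit values from the persistent-search draft: 1=add, 2=delete, 4=modify, 8=modifyDN.
const CHANGE_TYPES = { add: 1, delete: 2, modify: 4, modifyDN: 8 }

// Decode a changeTypes bitmask into the list of subscribed operation names.
function decodeChangeTypes (mask) {
  return Object.entries(CHANGE_TYPES)
    .filter(([, bit]) => (mask & bit) !== 0)
    .map(([name]) => name)
}

console.log(decodeChangeTypes(15)) // → [ 'add', 'delete', 'modify', 'modifyDN' ]
console.log(decodeChangeTypes(2))  // → [ 'delete' ]
```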
106
node_modules/@ldapjs/controls/lib/controls/persistent-search-control.test.js
generated
vendored
Normal file
@@ -0,0 +1,106 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const PSC = require('./persistent-search-control')
const Control = require('../control')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    const control = new PSC()
    t.ok(control)
    t.type(control, PSC)
    t.type(control, Control)
    t.equal(control.type, PSC.OID)
    t.same(control.value, {
      changeTypes: 15,
      changesOnly: true,
      returnECs: true
    })
  })

  t.test('new with args', async t => {
    const control = new PSC({
      type: '2.16.840.1.113730.3.4.3',
      criticality: true,
      value: {
        changeTypes: 1,
        changesOnly: false,
        returnECs: true
      }
    })
    t.ok(control)
    t.equal(control.type, '2.16.840.1.113730.3.4.3')
    t.ok(control.criticality)
    t.same(control.value, {
      changeTypes: 1,
      changesOnly: false,
      returnECs: true
    })
  })

  t.test('with value buffer', async t => {
    const value = new BerWriter()
    value.startSequence()
    value.writeInt(2)
    value.writeBoolean(true)
    value.writeBoolean(false)
    value.endSequence()

    const control = new PSC({ value: value.buffer })
    t.same(control.value, {
      changeTypes: 2,
      changesOnly: true,
      returnECs: false
    })
  })

  t.test('throws for bad value', async t => {
    t.throws(() => new PSC({ value: 42 }))
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('adds control value', async t => {
    const control = new PSC()
    t.same(control.pojo, {
      type: PSC.OID,
      criticality: false,
      value: {
        changeTypes: 15,
        changesOnly: true,
        returnECs: true
      }
    })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(PSC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence()
    value.writeInt(15)
    value.writeBoolean(true)
    value.writeBoolean(true)
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new PSC()
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.end()
})
132 node_modules/@ldapjs/controls/lib/controls/server-side-sorting-request-control.js generated vendored Normal file
@ -0,0 +1,132 @@
'use strict'

const { Ber, BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')

/**
 * @typedef {object} SortKeyItem
 * @property {string} attributeType
 * @property {string} orderingRule
 * @property {boolean} reverseOrder
 */

/**
 * @typedef {SortKeyItem[]} ServerSideSortingRequestControlValue
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/draft-ietf-ldapext-sorting#section-3.1
 *
 * @extends Control
 */
class ServerSideSortingRequestControl extends Control {
  static OID = '1.2.840.113556.1.4.473'

  /**
   * @typedef {ControlParams} ServerSideSortingRequestParams
   * @property {ServerSideSortingRequestControlValue | SortKeyItem | Buffer} [value]
   */

  /**
   * Creates a new server side sorting request control.
   *
   * @param {ServerSideSortingRequestParams} [options]
   */
  constructor (options = { value: [] }) {
    options.type = ServerSideSortingRequestControl.OID
    super(options)

    const inputValue = options.value ?? []
    if (Buffer.isBuffer(inputValue)) {
      this.#parse(inputValue)
    } else if (Array.isArray(inputValue)) {
      for (const obj of inputValue) {
        if (isObject(obj) === false) {
          throw new Error('Control value must be an object')
        }
        if (hasOwn(obj, 'attributeType') === false) {
          throw new Error('Missing required key: attributeType')
        }
      }
      this.value = inputValue
    } else if (isObject(inputValue)) {
      if (hasOwn(inputValue, 'attributeType') === false) {
        throw new Error('Missing required key: attributeType')
      }
      this.value = [inputValue]
    } else {
      throw new TypeError('options.value must be a Buffer, Array or Object')
    }
  }

  get value () {
    return this._value
  }

  set value (items) {
    if (Buffer.isBuffer(items) === true) return
    if (Array.isArray(items) === false) {
      this._value = [items]
      return
    }
    this._value = items
  }

  #parse (buffer) {
    const ber = new BerReader(buffer)
    let item
    /* istanbul ignore else */
    if (ber.readSequence(0x30)) {
      this.value = []

      while (ber.readSequence(0x30)) {
        item = {}
        item.attributeType = ber.readString(Ber.OctetString)
        /* istanbul ignore else */
        if (ber.peek() === 0x80) {
          item.orderingRule = ber.readString(0x80)
        }
        /* istanbul ignore else */
        if (ber.peek() === 0x81) {
          item.reverseOrder = (ber._readTag(0x81) !== 0)
        }
        this.value.push(item)
      }
    }
  }

  _pojo (obj) {
    obj.value = this.value
    return obj
  }

  _toBer (ber) {
    if (this.value.length === 0) { return }

    const writer = new BerWriter()
    writer.startSequence(0x30)
    for (let i = 0; i < this.value.length; i++) {
      const item = this.value[i]
      writer.startSequence(0x30)
      /* istanbul ignore else */
      if (hasOwn(item, 'attributeType')) {
        writer.writeString(item.attributeType, Ber.OctetString)
      }
      /* istanbul ignore else */
      if (hasOwn(item, 'orderingRule')) {
        writer.writeString(item.orderingRule, 0x80)
      }
      /* istanbul ignore else */
      if (hasOwn(item, 'reverseOrder')) {
        writer.writeBoolean(item.reverseOrder, 0x81)
      }
      writer.endSequence()
    }
    writer.endSequence()
    ber.writeBuffer(writer.buffer, 0x04)
  }
}
module.exports = ServerSideSortingRequestControl
144 node_modules/@ldapjs/controls/lib/controls/server-side-sorting-request-control.test.js generated vendored Normal file
@ -0,0 +1,144 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const SSSRC = require('./server-side-sorting-request-control')
const Control = require('../control')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    const control = new SSSRC()
    t.ok(control)
    t.type(control, SSSRC)
    t.type(control, Control)
    t.equal(control.type, SSSRC.OID)
    t.same(control.value, [])
  })

  t.test('new with args', async t => {
    const control = new SSSRC({
      type: '1.2.840.113556.1.4.473',
      criticality: true,
      value: [{ attributeType: 'foo' }]
    })
    t.ok(control)
    t.equal(control.type, '1.2.840.113556.1.4.473')
    t.ok(control.criticality)
    t.same(control.value, [{ attributeType: 'foo' }])
  })

  t.test('new with object', async t => {
    const control = new SSSRC({
      type: '1.2.840.113556.1.4.473',
      criticality: true,
      value: { attributeType: 'foo' }
    })
    t.ok(control)
    t.equal(control.type, '1.2.840.113556.1.4.473')
    t.ok(control.criticality)
    t.same(control.value, [{ attributeType: 'foo' }])
  })

  t.test('with value buffer', async t => {
    const value = new BerWriter()
    value.startSequence(0x30) // Open "array"
    value.startSequence(0x30) // Start "item"
    value.writeString('foo', 0x04)
    value.writeString('bar', 0x80)
    value.writeBoolean(false, 0x81)
    value.endSequence() // End item
    value.endSequence() // Close array

    const control = new SSSRC({ value: value.buffer })
    t.same(control.value, [{
      attributeType: 'foo',
      orderingRule: 'bar',
      reverseOrder: false
    }])
  })

  t.test('throws for bad value', async t => {
    t.throws(() => new SSSRC({ value: 42 }))
  })

  t.test('throws for bad object value', async t => {
    t.throws(() => new SSSRC({ value: { foo: 'bar' } }))
  })

  t.test('throws for bad array value', async t => {
    t.throws(() => new SSSRC({ value: [42] }))
    t.throws(() => new SSSRC({ value: [{ foo: 'bar' }] }))
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('adds control value', async t => {
    const control = new SSSRC()
    t.same(control.pojo, {
      type: SSSRC.OID,
      criticality: false,
      value: []
    })
  })

  t.test('_pojo', async t => {
    const control = new SSSRC()
    const obj = control._pojo({ value: 'change_me' })
    t.strictSame(obj, { value: [] })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(SSSRC.OID)
    target.writeBoolean(false) // Control.criticality
    target.endSequence()

    const control = new SSSRC()
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts full instance BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(SSSRC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence(0x30) // Open "array"
    value.startSequence(0x30) // Start "item"
    value.writeString('one', 0x04)
    value.writeString('one', 0x80)
    value.writeBoolean(false, 0x81)
    value.endSequence() // End item
    value.startSequence(0x30) // Start "item"
    value.writeString('two', 0x04)
    value.writeString('two', 0x80)
    value.writeBoolean(true, 0x81)
    value.endSequence() // End item
    value.endSequence() // Close array

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new SSSRC({
      value: [
        { attributeType: 'one', orderingRule: 'one', reverseOrder: false },
        { attributeType: 'two', orderingRule: 'two', reverseOrder: true }
      ]
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(Buffer.from(target.buffer), ber.buffer), 0)
  })

  t.end()
})
129 node_modules/@ldapjs/controls/lib/controls/server-side-sorting-response-control.js generated vendored Normal file
@ -0,0 +1,129 @@
'use strict'

const { BerReader, BerWriter } = require('@ldapjs/asn1')
const Control = require('../control')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const { resultCodes: RESULT_CODES } = require('@ldapjs/protocol')

const validCodeNames = [
  'SUCCESS',
  'OPERATIONS_ERROR',
  'TIME_LIMIT_EXCEEDED',
  'STRONGER_AUTH_REQUIRED',
  'ADMIN_LIMIT_EXCEEDED',
  'NO_SUCH_ATTRIBUTE',
  'INAPPROPRIATE_MATCHING',
  'INSUFFICIENT_ACCESS_RIGHTS',
  'BUSY',
  'UNWILLING_TO_PERFORM',
  'OTHER'
]

const filteredCodes = Object.entries(RESULT_CODES).filter(([k, v]) => validCodeNames.includes(k))
const VALID_CODES = new Map([
  ...filteredCodes,
  ...filteredCodes.map(([k, v]) => { return [v, k] })
])

/**
 * @typedef {object} ServerSideSortingResponseControlResult
 * @property {number} result
 * @property {string} failedAttribute
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/draft-ietf-ldapext-sorting#section-3.2
 *
 * @extends Control
 */
class ServerSideSortingResponseControl extends Control {
  static OID = '1.2.840.113556.1.4.474'

  /**
   * A map of possible response codes. Includes `CODE => VALUE` and
   * `VALUE => CODE`. For example, `RESPONSE_CODES.get(0)` returns
   * `LDAP_SUCCESS`, and `RESPONSE_CODES.get('LDAP_SUCCESS')` returns `0`.
   */
  static RESPONSE_CODES = Object.freeze(VALID_CODES)

  /**
   * @typedef {ControlParams} ServerSideSortingResponseParams
   * @property {ServerSideSortingResponseControlResult | Buffer} value
   */

  /**
   * Creates a new server side sorting response control.
   *
   * @param {ServerSideSortingResponseParams} [options]
   */
  constructor (options = {}) {
    options.type = ServerSideSortingResponseControl.OID
    options.criticality = false
    super(options)

    this.value = {}

    if (hasOwn(options, 'value') === false || !options.value) {
      return
    }

    const value = options.value
    if (Buffer.isBuffer(value)) {
      this.#parse(value)
    } else if (isObject(value)) {
      if (VALID_CODES.has(value.result) === false) {
        throw new Error('Invalid result code')
      }
      if (hasOwn(value, 'failedAttribute') && (typeof value.failedAttribute) !== 'string') {
        throw new Error('failedAttribute must be String')
      }

      this.value = value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }
  }

  get value () {
    return this._value
  }

  set value (obj) {
    this._value = Object.assign({}, this._value, obj)
  }

  #parse (buffer) {
    const ber = new BerReader(buffer)
    /* istanbul ignore else */
    if (ber.readSequence(0x30)) {
      this._value = {}
      this._value.result = ber.readEnumeration()
      /* istanbul ignore else */
      if (ber.peek() === 0x80) {
        this._value.failedAttribute = ber.readString(0x80)
      }
    }
  }

  _pojo (obj) {
    obj.value = this.value
    return obj
  }

  _toBer (ber) {
    if (!this._value || Object.keys(this._value).length === 0) { return }

    const writer = new BerWriter()
    writer.startSequence(0x30)
    writer.writeEnumeration(this.value.result)
    /* istanbul ignore else */
    if (this.value.result !== RESULT_CODES.SUCCESS && this.value.failedAttribute) {
      writer.writeString(this.value.failedAttribute, 0x80)
    }
    writer.endSequence()
    ber.writeBuffer(writer.buffer, 0x04)
  }
}
module.exports = ServerSideSortingResponseControl
125 node_modules/@ldapjs/controls/lib/controls/server-side-sorting-response-control.test.js generated vendored Normal file
@ -0,0 +1,125 @@
'use strict'

const tap = require('tap')
const { BerWriter } = require('@ldapjs/asn1')
const SSSRC = require('./server-side-sorting-response-control')
const Control = require('../control')

tap.test('constructor', t => {
  t.test('new no args', async t => {
    const control = new SSSRC()
    t.ok(control)
    t.type(control, SSSRC)
    t.type(control, Control)
    t.equal(control.type, SSSRC.OID)
    t.same(control.value, {})
  })

  t.test('new with args', async t => {
    const control = new SSSRC({
      type: '1.2.840.113556.1.4.474',
      criticality: true,
      value: {
        result: SSSRC.RESPONSE_CODES.get('OPERATIONS_ERROR'),
        failedAttribute: 'foo'
      }
    })
    t.ok(control)
    t.equal(control.type, '1.2.840.113556.1.4.474')
    t.equal(control.criticality, false)
    t.same(control.value, {
      result: 1,
      failedAttribute: 'foo'
    })
  })

  t.test('with value buffer', async t => {
    const value = new BerWriter()
    value.startSequence(0x30)
    value.writeEnumeration(1)
    value.writeString('foo', 0x80)
    value.endSequence()

    const control = new SSSRC({ value: value.buffer })
    t.same(control.value, {
      result: 1,
      failedAttribute: 'foo'
    })
  })

  t.test('throws for bad value', async t => {
    t.throws(() => new SSSRC({ value: 42 }))
    t.throws(() => new SSSRC({ value: {} }))
    t.throws(() => new SSSRC({
      value: {
        result: 1,
        failedAttribute: 42
      }
    }))
  })

  t.end()
})

tap.test('pojo', t => {
  t.test('adds control value', async t => {
    const control = new SSSRC()
    t.same(control.pojo, {
      type: SSSRC.OID,
      criticality: false,
      value: {}
    })
  })

  t.test('_pojo', async t => {
    const control = new SSSRC()
    t.strictSame(control._pojo({ value: 'change_me' }), {
      value: {}
    })
  })

  t.end()
})

tap.test('toBer', t => {
  t.test('converts empty instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(SSSRC.OID)
    target.writeBoolean(false) // Control.criticality
    target.endSequence()

    const control = new SSSRC()
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.test('converts full instance to BER', async t => {
    const target = new BerWriter()
    target.startSequence()
    target.writeString(SSSRC.OID)
    target.writeBoolean(false) // Control.criticality

    const value = new BerWriter()
    value.startSequence(0x30)
    value.writeEnumeration(1)
    value.writeString('foo', 0x80)
    value.endSequence()

    target.writeBuffer(value.buffer, 0x04)
    target.endSequence()

    const control = new SSSRC({
      value: {
        result: SSSRC.RESPONSE_CODES.get('OPERATIONS_ERROR'),
        failedAttribute: 'foo'
      }
    })
    const ber = control.toBer()

    t.equal(Buffer.compare(ber.buffer, target.buffer), 0)
  })

  t.end()
})
116 node_modules/@ldapjs/controls/lib/controls/virtual-list-view-request-control.js generated vendored Normal file
@ -0,0 +1,116 @@
'use strict'

const { BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')

/**
 * @typedef {object} VirtualListViewControlValue
 * @property {number} beforeCount
 * @property {number} afterCount
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/draft-ietf-ldapext-ldapv3-vlv-07#section-6.1
 *
 * @extends Control
 */
class VirtualListViewRequestControl extends Control {
  static OID = '2.16.840.1.113730.3.4.9'

  /**
   * @typedef {ControlParams} VirtualListViewRequestParams
   * @property {Buffer|VirtualListViewControlValue} [value]
   */

  /**
   * @param {VirtualListViewRequestParams} [options]
   */
  constructor (options = {}) {
    options.type = VirtualListViewRequestControl.OID
    super(options)

    if (hasOwn(options, 'value') === false) {
      // return
      throw Error('control is not enabled')
    }

    if (Buffer.isBuffer(options.value)) {
      this.#parse(options.value)
    } else if (isObject(options.value)) {
      if (Object.prototype.hasOwnProperty.call(options.value, 'beforeCount') === false) {
        throw new Error('Missing required key: beforeCount')
      }
      if (Object.prototype.hasOwnProperty.call(options.value, 'afterCount') === false) {
        throw new Error('Missing required key: afterCount')
      }
      this._value = options.value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }

    throw Error('control is not enabled')
  }

  get value () {
    return this._value
  }

  set value (items) {
    if (Buffer.isBuffer(items) === true) return
    if (Array.isArray(items) === false) {
      this._value = [items]
      return
    }
    this._value = items
  }

  #parse (buffer) {
    const ber = new BerReader(buffer)
    if (ber.readSequence()) {
      this._value = {}
      this._value.beforeCount = ber.readInt()
      this._value.afterCount = ber.readInt()
      if (ber.peek() === 0xa0) {
        if (ber.readSequence(0xa0)) {
          this._value.targetOffset = ber.readInt()
          this._value.contentCount = ber.readInt()
        }
      }
      if (ber.peek() === 0x81) {
        this._value.greaterThanOrEqual = ber.readString(0x81)
      }
      return true
    }
    return false
  }

  _pojo (obj) {
    obj.value = this.value
    return obj
  }

  _toBer (ber) {
    if (!this._value || this._value.length === 0) {
      return
    }
    const writer = new BerWriter()
    writer.startSequence(0x30)
    writer.writeInt(this._value.beforeCount)
    writer.writeInt(this._value.afterCount)
    if (this._value.targetOffset !== undefined) {
      writer.startSequence(0xa0)
      writer.writeInt(this._value.targetOffset)
      writer.writeInt(this._value.contentCount)
      writer.endSequence()
    } else if (this._value.greaterThanOrEqual !== undefined) {
      writer.writeString(this._value.greaterThanOrEqual, 0x81)
    }
    writer.endSequence()
    ber.writeBuffer(writer.buffer, 0x04)
  }
}
module.exports = VirtualListViewRequestControl
110 node_modules/@ldapjs/controls/lib/controls/virtual-list-view-request-control.test.js generated vendored Normal file
@ -0,0 +1,110 @@
'use strict'

const tap = require('tap')
tap.test('stubbed', async t => {
  t.pass()
})

/**
 * This test is disabled. The commented code below is directly copied from
 * the original test file in the core `node-ldapjs` repo. The actual test
 * suite should follow the patterns of the
 * server-side-sorting-request-control.test.js test suite.
 *
 * See https://github.com/ldapjs/node-ldapjs/pull/797#issuecomment-1094132289
 */

// 'use strict'

// const { test } = require('tap')
// const { BerReader, BerWriter } = require('@ldapjs/asn1')
// const { getControl, VirtualListViewRequestControl: VLVRControl } = require('../../lib')

// test('VLV request - new no args', function (t) {
//   t.ok(new VLVRControl())
//   t.end()
// })

// test('VLV request - new with args', function (t) {
//   const c = new VLVRControl({
//     criticality: true,
//     value: {
//       beforeCount: 0,
//       afterCount: 3,
//       targetOffset: 1,
//       contentCount: 0
//     }
//   })
//   t.ok(c)
//   t.equal(c.type, '2.16.840.1.113730.3.4.9')
//   t.ok(c.criticality)
//   t.equal(c.value.beforeCount, 0)
//   t.equal(c.value.afterCount, 3)
//   t.equal(c.value.targetOffset, 1)
//   t.equal(c.value.contentCount, 0)

//   t.end()
// })

// test('VLV request - toBer - with offset', function (t) {
//   const vlvc = new VLVRControl({
//     criticality: true,
//     value: {
//       beforeCount: 0,
//       afterCount: 3,
//       targetOffset: 1,
//       contentCount: 0
//     }
//   })

//   const ber = new BerWriter()
//   vlvc.toBer(ber)

//   const c = getControl(new BerReader(ber.buffer))
//   t.ok(c)
//   t.equal(c.type, '2.16.840.1.113730.3.4.9')
//   t.ok(c.criticality)
//   t.equal(c.value.beforeCount, 0)
//   t.equal(c.value.afterCount, 3)
//   t.equal(c.value.targetOffset, 1)
//   t.equal(c.value.contentCount, 0)

//   t.end()
// })

// test('VLV request - toBer - with assertion', function (t) {
//   const vlvc = new VLVRControl({
//     criticality: true,
//     value: {
//       beforeCount: 0,
//       afterCount: 3,
//       greaterThanOrEqual: '*foo*'
//     }
//   })

//   const ber = new BerWriter()
//   vlvc.toBer(ber)

//   const c = getControl(new BerReader(ber.buffer))
//   t.ok(c)
//   t.equal(c.type, '2.16.840.1.113730.3.4.9')
//   t.ok(c.criticality)
//   t.equal(c.value.beforeCount, 0)
//   t.equal(c.value.afterCount, 3)
//   t.equal(c.value.greaterThanOrEqual, '*foo*')

//   t.end()
// })

// test('VLV request - toBer - empty', function (t) {
//   const vlvc = new VLVRControl()
//   const ber = new BerWriter()
//   vlvc.toBer(ber)

//   const c = getControl(new BerReader(ber.buffer))
//   t.ok(c)
//   t.equal(c.type, '2.16.840.1.113730.3.4.9')
//   t.equal(c.criticality, false)
//   t.notOk(c.value.result)
//   t.end()
// })
150 node_modules/@ldapjs/controls/lib/controls/virtual-list-view-response-control.js generated vendored Normal file
@ -0,0 +1,150 @@
'use strict'

const { Ber, BerReader, BerWriter } = require('@ldapjs/asn1')
const isObject = require('../is-object')
const hasOwn = require('../has-own')
const Control = require('../control')
const { resultCodes: RESULT_CODES } = require('@ldapjs/protocol')

const validCodeNames = [
  'SUCCESS',
  'OPERATIONS_ERROR',
  'UNWILLING_TO_PERFORM',
  'INSUFFICIENT_ACCESS_RIGHTS',
  'BUSY',
  'TIME_LIMIT_EXCEEDED',
  'STRONGER_AUTH_REQUIRED',
  'ADMIN_LIMIT_EXCEEDED',
  'SORT_CONTROL_MISSING',
  'OFFSET_RANGE_ERROR',
  'CONTROL_ERROR',
  'OTHER'
]

const filteredCodes = Object.entries(RESULT_CODES).filter(([k, v]) => validCodeNames.includes(k))
const VALID_CODES = new Map([
  ...filteredCodes,
  ...filteredCodes.map(([k, v]) => { return [v, k] })
])

// TODO: complete this doc block based on the "implements" spec link
/**
 * @typedef {object} VirtualListViewResponseControlValue
 * @property {number} result A valid LDAP response code for the control.
 */

/**
 * Implements:
 * https://datatracker.ietf.org/doc/html/draft-ietf-ldapext-ldapv3-vlv-07#section-6.2
 *
 * @extends Control
 */
class VirtualListViewResponseControl extends Control {
  static OID = '2.16.840.1.113730.3.4.10'

  /**
   * A map of possible response codes. Includes `CODE => VALUE` and
   * `VALUE => CODE`. For example, `RESPONSE_CODES.get(0)` returns
   * `LDAP_SUCCESS`, and `RESPONSE_CODES.get('LDAP_SUCCESS')` returns `0`.
   */
  static RESPONSE_CODES = Object.freeze(VALID_CODES)

  /**
   * @typedef {ControlParams} VirtualListViewResponseParams
   * @property {Buffer|VirtualListViewResponseControlValue} [value]
   */

  /**
   * @param {VirtualListViewResponseParams} options
   */
  constructor (options = {}) {
    options.type = VirtualListViewResponseControl.OID
    options.criticality = false
    super(options)

    this.value = {}

    if (hasOwn(options, 'value') === false || !options.value) {
      // return
      throw Error('control not enabled')
    }

    const value = options.value
    if (Buffer.isBuffer(value)) {
      this.#parse(options.value)
    } else if (isObject(value)) {
      if (VALID_CODES.has(value.result) === false) {
        throw new Error('Invalid result code')
      }
      this.value = options.value
    } else {
      throw new TypeError('options.value must be a Buffer or Object')
    }

    throw Error('control not enabled')
  }

  get value () {
    return this._value
  }

  set value (obj) {
    this._value = Object.assign({}, this._value, obj)
  }

  #parse (buffer) {
    const ber = new BerReader(buffer)
    if (ber.readSequence()) {
      this._value = {}

      if (ber.peek(0x02)) {
        this._value.targetPosition = ber.readInt()
      }

      if (ber.peek(0x02)) {
        this._value.contentCount = ber.readInt()
      }

      this._value.result = ber.readEnumeration()
      this._value.cookie = ber.readString(Ber.OctetString, true)

      // readString returns '' instead of a zero-length buffer
      if (!this._value.cookie) {
        this._value.cookie = Buffer.alloc(0)
      }

      return true
    }

    return false
  }

  _pojo (obj) {
    obj.value = this.value
    return obj
  }

  _toBer (ber) {
    if (this.value.length === 0) { return }

    const writer = new BerWriter()
    writer.startSequence()
    if (this.value.targetPosition !== undefined) {
      writer.writeInt(this.value.targetPosition)
    }
    if (this.value.contentCount !== undefined) {
      writer.writeInt(this.value.contentCount)
    }

    writer.writeEnumeration(this.value.result)
    if (this.value.cookie && this.value.cookie.length > 0) {
      writer.writeBuffer(this.value.cookie, Ber.OctetString)
    } else {
      writer.writeString('') // writeBuffer rejects zero-length buffers
    }

    writer.endSequence()
    ber.writeBuffer(writer.buffer, 0x04)
  }
}
module.exports = VirtualListViewResponseControl
84 node_modules/@ldapjs/controls/lib/controls/virtual-list-view-response-control.test.js generated vendored Normal file
@ -0,0 +1,84 @@
'use strict'

const tap = require('tap')
tap.test('stubbed', async t => {
  t.pass()
})

/**
 * This test is disabled. The commented code below is directly copied from
 * the original test file in the core `node-ldapjs` repo. The actual test
 * suite should follow the patterns of the
 * server-side-sorting-response-control.test.js test suite.
 *
 * See https://github.com/ldapjs/node-ldapjs/pull/797#issuecomment-1094132289
 */

// 'use strict'

// const { test } = require('tap')
// const { BerReader, BerWriter } = require('@ldapjs/asn1')
// const ldap = require('../../lib')
// const { getControl, VirtualListViewResponseControl: VLVResponseControl } = require('../../lib')
// const OID = '2.16.840.1.113730.3.4.10'

// test('VLV response - new no args', function (t) {
//   const c = new VLVResponseControl()
//   t.ok(c)
//   t.equal(c.type, OID)
//   t.equal(c.criticality, false)
//   t.end()
// })

// test('VLV response - new with args', function (t) {
//   const c = new VLVResponseControl({
//     criticality: true,
//     value: {
//       result: ldap.LDAP_SUCCESS,
//       targetPosition: 0,
//       contentCount: 10
//     }
//   })
//   t.ok(c)
//   t.equal(c.type, OID)
//   t.equal(c.criticality, false)
//   t.equal(c.value.result, ldap.LDAP_SUCCESS)
//   t.equal(c.value.targetPosition, 0)
//   t.equal(c.value.contentCount, 10)
//   t.end()
// })

// test('VLV response - toBer', function (t) {
//   const vlpc = new VLVResponseControl({
//     value: {
//       targetPosition: 0,
//       contentCount: 10,
//       result: ldap.LDAP_SUCCESS
//     }
//   })

//   const ber = new BerWriter()
//   vlpc.toBer(ber)

//   const c = getControl(new BerReader(ber.buffer))
//   t.ok(c)
//   t.equal(c.type, OID)
//   t.equal(c.criticality, false)
//   t.equal(c.value.result, ldap.LDAP_SUCCESS)
//   t.equal(c.value.targetPosition, 0)
//   t.equal(c.value.contentCount, 10)
//   t.end()
// })

// test('VLV response - toBer - empty', function (t) {
//   const vlpc = new VLVResponseControl()
//   const ber = new BerWriter()
//   vlpc.toBer(ber)

//   const c = getControl(new BerReader(ber.buffer))
//   t.ok(c)
//   t.equal(c.type, OID)
//   t.equal(c.criticality, false)
//   t.notOk(c.value.result)
//   t.end()
// })
node_modules/@ldapjs/controls/lib/has-own.js (5 lines, generated, vendored, Normal file)
@@ -0,0 +1,5 @@
'use strict'

module.exports = function hasOwn (obj, prop) {
  return Object.prototype.hasOwnProperty.call(obj, prop)
}
node_modules/@ldapjs/controls/lib/is-object.js (5 lines, generated, vendored, Normal file)
@@ -0,0 +1,5 @@
'use strict'

module.exports = function isObject (input) {
  return Object.prototype.toString.call(input) === '[object Object]'
}
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/.github/workflows/main.yml (10 lines, generated, vendored, Normal file)
@@ -0,0 +1,10 @@
name: "CI"
on:
  pull_request:
  push:
    branches:
      - master

jobs:
  call-core-ci:
    uses: ldapjs/.github/.github/workflows/node-ci.yml@main
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/.taprc.yml (4 lines, generated, vendored, Normal file)
@@ -0,0 +1,4 @@
check-coverage: false

files:
  - 'lib/**/*.test.js'
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/LICENSE (22 lines, generated, vendored, Normal file)
@@ -0,0 +1,22 @@
The MIT License (MIT)

Copyright (c) 2011 Mark Cavage, All rights reserved.
Copyright (c) 2022 The LDAPJS Collaborators.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/README.md (39 lines, generated, vendored, Normal file)
@@ -0,0 +1,39 @@
# `@ldapjs/asn1`

`@ldapjs/asn1` is a library for encoding and decoding ASN.1 datatypes in pure
JS. Currently BER encoding is supported.

### Decoding

The following reads an ASN.1 sequence with a boolean.

    var Ber = require('@ldapjs/asn1').Ber;

    var reader = new Ber.Reader(Buffer.from([0x30, 0x03, 0x01, 0x01, 0xff]));

    reader.readSequence();
    console.log('Sequence len: ' + reader.length);
    if (reader.peek() === Ber.Boolean)
      console.log(reader.readBoolean());

### Encoding

The following generates the same payload as above.

    var Ber = require('@ldapjs/asn1').Ber;

    var writer = new Ber.Writer();

    writer.startSequence();
    writer.writeBoolean(true);
    writer.endSequence();

    console.log(writer.buffer);

## Installation

    npm install asn1

## Bugs

See <https://github.com/ldapjs/asn1/issues>.
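For readers without the library installed, the example payload in the vendored README above can also be decoded by hand, since BER is a plain tag-length-value (TLV) encoding. A minimal dependency-free sketch (plain Node `Buffer`, variable names are our own):

```javascript
// Walk the README's example payload by hand: SEQUENCE { BOOLEAN true }.
// Each BER element is tag, then length, then that many value bytes.
const payload = Buffer.from([0x30, 0x03, 0x01, 0x01, 0xff])

const seqTag = payload[0]           // 0x30 = SEQUENCE (constructed)
const seqLen = payload[1]           // 0x03 = three content bytes follow
const boolTag = payload[2]          // 0x01 = BOOLEAN
const boolLen = payload[3]          // 0x01 = one content byte
const boolVal = payload[4] !== 0x00 // BER: any nonzero byte is true (0xff is canonical)

console.log(seqTag === 0x30, seqLen === 3, boolTag === 0x01, boolLen === 1, boolVal)
```

This is exactly what `reader.readSequence()` followed by `reader.readBoolean()` does internally, minus bounds checking.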
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/errors.js (12 lines, generated, vendored, Normal file)
@@ -0,0 +1,12 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

module.exports = {

  newInvalidAsn1Error: function (msg) {
    const e = new Error()
    e.name = 'InvalidAsn1Error'
    e.message = msg || ''
    return e
  }

}
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/index.js (24 lines, generated, vendored, Normal file)
@@ -0,0 +1,24 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

const errors = require('./errors')
const types = require('./types')

const Reader = require('./reader')
const Writer = require('./writer')

// --- Exports

module.exports = {

  Reader: Reader,

  Writer: Writer

}

for (const t in types) {
  if (Object.prototype.hasOwnProperty.call(types, t)) { module.exports[t] = types[t] }
}
for (const e in errors) {
  if (Object.prototype.hasOwnProperty.call(errors, e)) { module.exports[e] = errors[e] }
}
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/reader.js (227 lines, generated, vendored, Normal file)
@@ -0,0 +1,227 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

const assert = require('assert')
const ASN1 = require('./types')
const errors = require('./errors')

// --- Globals

const newInvalidAsn1Error = errors.newInvalidAsn1Error

// --- API

function Reader (data) {
  if (!data || !Buffer.isBuffer(data)) { throw new TypeError('data must be a node Buffer') }

  this._buf = data
  this._size = data.length

  // These hold the "current" state
  this._len = 0
  this._offset = 0
}

Object.defineProperty(Reader.prototype, Symbol.toStringTag, { value: 'BerReader' })

Object.defineProperty(Reader.prototype, 'length', {
  enumerable: true,
  get: function () { return (this._len) }
})

Object.defineProperty(Reader.prototype, 'offset', {
  enumerable: true,
  get: function () { return (this._offset) }
})

Object.defineProperty(Reader.prototype, 'remain', {
  get: function () { return (this._size - this._offset) }
})

Object.defineProperty(Reader.prototype, 'buffer', {
  get: function () { return (this._buf.slice(this._offset)) }
})

/**
 * Reads a single byte and advances offset; you can pass in `true` to make this
 * a "peek" operation (i.e., get the byte, but don't advance the offset).
 *
 * @param {Boolean} peek true means don't move offset.
 * @return {Number} the next byte, null if not enough data.
 */
Reader.prototype.readByte = function (peek) {
  if (this._size - this._offset < 1) { return null }

  const b = this._buf[this._offset] & 0xff

  if (!peek) { this._offset += 1 }

  return b
}

Reader.prototype.peek = function () {
  return this.readByte(true)
}

/**
 * Reads a (potentially) variable length off the BER buffer. This call is
 * not really meant to be called directly, as callers have to manipulate
 * the internal buffer afterwards.
 *
 * As a result of this call, you can call `Reader.length`, until the
 * next thing called that does a readLength.
 *
 * @return {Number} the amount of offset to advance the buffer.
 * @throws {InvalidAsn1Error} on bad ASN.1
 */
Reader.prototype.readLength = function (offset) {
  if (offset === undefined) { offset = this._offset }

  if (offset >= this._size) { return null }

  let lenB = this._buf[offset++] & 0xff
  if (lenB === null) { return null }

  if ((lenB & 0x80) === 0x80) {
    lenB &= 0x7f

    if (lenB === 0) { throw newInvalidAsn1Error('Indefinite length not supported') }

    if (lenB > 4) { throw newInvalidAsn1Error('encoding too long') }

    if (this._size - offset < lenB) { return null }

    this._len = 0
    for (let i = 0; i < lenB; i++) { this._len = (this._len << 8) + (this._buf[offset++] & 0xff) }
  } else {
    // Wasn't a variable length
    this._len = lenB
  }

  return offset
}

/**
 * Parses the next sequence in this BER buffer.
 *
 * To get the length of the sequence, call `Reader.length`.
 *
 * @return {Number} the sequence's tag.
 */
Reader.prototype.readSequence = function (tag) {
  const seq = this.peek()
  if (seq === null) { return null }
  if (tag !== undefined && tag !== seq) {
    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
      ': got 0x' + seq.toString(16))
  }

  const o = this.readLength(this._offset + 1) // stored in `length`
  if (o === null) { return null }

  this._offset = o
  return seq
}

Reader.prototype.readInt = function () {
  return this._readTag(ASN1.Integer)
}

Reader.prototype.readBoolean = function (tag) {
  return (this._readTag(tag || ASN1.Boolean) !== 0)
}

Reader.prototype.readEnumeration = function () {
  return this._readTag(ASN1.Enumeration)
}

Reader.prototype.readString = function (tag, retbuf) {
  if (!tag) { tag = ASN1.OctetString }

  const b = this.peek()
  if (b === null) { return null }

  if (b !== tag) {
    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
      ': got 0x' + b.toString(16))
  }

  const o = this.readLength(this._offset + 1) // stored in `length`

  if (o === null) { return null }

  if (this.length > this._size - o) { return null }

  this._offset = o

  if (this.length === 0) { return retbuf ? Buffer.alloc(0) : '' }

  const str = this._buf.slice(this._offset, this._offset + this.length)
  this._offset += this.length

  return retbuf ? str : str.toString('utf8')
}

Reader.prototype.readOID = function (tag) {
  if (!tag) { tag = ASN1.OID }

  const b = this.readString(tag, true)
  if (b === null) { return null }

  const values = []
  let value = 0

  for (let i = 0; i < b.length; i++) {
    const byte = b[i] & 0xff

    value <<= 7
    value += byte & 0x7f
    if ((byte & 0x80) === 0) {
      values.push(value)
      value = 0
    }
  }

  value = values.shift()
  values.unshift(value % 40)
  values.unshift((value / 40) >> 0)

  return values.join('.')
}

Reader.prototype._readTag = function (tag) {
  assert.ok(tag !== undefined)

  const b = this.peek()

  if (b === null) { return null }

  if (b !== tag) {
    throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) +
      ': got 0x' + b.toString(16))
  }

  const o = this.readLength(this._offset + 1) // stored in `length`
  if (o === null) { return null }

  if (this.length > 4) { throw newInvalidAsn1Error('Integer too long: ' + this.length) }

  if (this.length > this._size - o) { return null }
  this._offset = o

  const fb = this._buf[this._offset]
  let value = 0

  let i
  for (i = 0; i < this.length; i++) {
    value <<= 8
    value |= (this._buf[this._offset++] & 0xff)
  }

  if ((fb & 0x80) === 0x80 && i !== 4) { value -= (1 << (i * 8)) }

  return value >> 0
}

// --- Exported API

module.exports = Reader
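The `readLength` method above implements BER definite lengths: a single byte below 0x80 is the length itself (short form), while a byte with the high bit set gives the count of big-endian length octets that follow (long form). A standalone sketch of the same rule, as a hypothetical helper rather than part of the vendored module:

```javascript
// Decode a BER definite length starting at `offset`.
// Returns { length, next }, where `next` is the offset of the first value byte.
// Mirrors Reader.prototype.readLength: short form (< 0x80) is the length
// itself; long form 0x80 | n means n subsequent big-endian length octets.
function decodeLength (buf, offset) {
  const first = buf[offset++]
  if ((first & 0x80) === 0) return { length: first, next: offset }

  const numOctets = first & 0x7f
  if (numOctets === 0) throw new Error('indefinite length not supported')
  if (numOctets > 4) throw new Error('encoding too long')

  let length = 0
  for (let i = 0; i < numOctets; i++) {
    length = (length << 8) + buf[offset++]
  }
  return { length, next: offset }
}

console.log(decodeLength(Buffer.from([0x26]), 0))             // short form: 38
console.log(decodeLength(Buffer.from([0x81, 0x94]), 0))       // long form, one octet: 148
console.log(decodeLength(Buffer.from([0x82, 0x01, 0x00]), 0)) // long form, two octets: 256
```

The `0x81 0x94` case is exactly the length prefix the "long string" test in `reader.test.js` below constructs by hand.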
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/reader.test.js (182 lines, generated, vendored, Normal file)
@@ -0,0 +1,182 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

const { test } = require('tap')
const BerReader = require('./reader')

test('load library', function (t) {
  t.ok(BerReader)
  try {
    const reader = new BerReader()
    t.equal(reader, null, 'reader')
    t.fail('Should have thrown')
  } catch (e) {
    t.ok(e instanceof TypeError, 'Should have been a type error')
  }
  t.end()
})

test('read byte', function (t) {
  const reader = new BerReader(Buffer.from([0xde]))
  t.ok(reader)
  t.equal(reader.readByte(), 0xde, 'wrong value')
  t.end()
})

test('read 1 byte int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x01, 0x03]))
  t.ok(reader)
  t.equal(reader.readInt(), 0x03, 'wrong value')
  t.equal(reader.length, 0x01, 'wrong length')
  t.end()
})

test('read 2 byte int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x02, 0x7e, 0xde]))
  t.ok(reader)
  t.equal(reader.readInt(), 0x7ede, 'wrong value')
  t.equal(reader.length, 0x02, 'wrong length')
  t.end()
})

test('read 3 byte int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x03, 0x7e, 0xde, 0x03]))
  t.ok(reader)
  t.equal(reader.readInt(), 0x7ede03, 'wrong value')
  t.equal(reader.length, 0x03, 'wrong length')
  t.end()
})

test('read 4 byte int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x04, 0x7e, 0xde, 0x03, 0x01]))
  t.ok(reader)
  t.equal(reader.readInt(), 0x7ede0301, 'wrong value')
  t.equal(reader.length, 0x04, 'wrong length')
  t.end()
})

test('read 1 byte negative int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x01, 0xdc]))
  t.ok(reader)
  t.equal(reader.readInt(), -36, 'wrong value')
  t.equal(reader.length, 0x01, 'wrong length')
  t.end()
})

test('read 2 byte negative int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x02, 0xc0, 0x4e]))
  t.ok(reader)
  t.equal(reader.readInt(), -16306, 'wrong value')
  t.equal(reader.length, 0x02, 'wrong length')
  t.end()
})

test('read 3 byte negative int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x03, 0xff, 0x00, 0x19]))
  t.ok(reader)
  t.equal(reader.readInt(), -65511, 'wrong value')
  t.equal(reader.length, 0x03, 'wrong length')
  t.end()
})

test('read 4 byte negative int', function (t) {
  const reader = new BerReader(Buffer.from([0x02, 0x04, 0x91, 0x7c, 0x22, 0x1f]))
  t.ok(reader)
  t.equal(reader.readInt(), -1854135777, 'wrong value')
  t.equal(reader.length, 0x04, 'wrong length')
  t.end()
})

test('read boolean true', function (t) {
  const reader = new BerReader(Buffer.from([0x01, 0x01, 0xff]))
  t.ok(reader)
  t.equal(reader.readBoolean(), true, 'wrong value')
  t.equal(reader.length, 0x01, 'wrong length')
  t.end()
})

test('read boolean false', function (t) {
  const reader = new BerReader(Buffer.from([0x01, 0x01, 0x00]))
  t.ok(reader)
  t.equal(reader.readBoolean(), false, 'wrong value')
  t.equal(reader.length, 0x01, 'wrong length')
  t.end()
})

test('read enumeration', function (t) {
  const reader = new BerReader(Buffer.from([0x0a, 0x01, 0x20]))
  t.ok(reader)
  t.equal(reader.readEnumeration(), 0x20, 'wrong value')
  t.equal(reader.length, 0x01, 'wrong length')
  t.end()
})

test('read string', function (t) {
  const dn = 'cn=foo,ou=unit,o=test'
  const buf = Buffer.alloc(dn.length + 2)
  buf[0] = 0x04
  buf[1] = Buffer.byteLength(dn)
  buf.write(dn, 2)
  const reader = new BerReader(buf)
  t.ok(reader)
  t.equal(reader.readString(), dn, 'wrong value')
  t.equal(reader.length, dn.length, 'wrong length')
  t.end()
})

test('read sequence', function (t) {
  const reader = new BerReader(Buffer.from([0x30, 0x03, 0x01, 0x01, 0xff]))
  t.ok(reader)
  t.equal(reader.readSequence(), 0x30, 'wrong value')
  t.equal(reader.length, 0x03, 'wrong length')
  t.equal(reader.readBoolean(), true, 'wrong value')
  t.equal(reader.length, 0x01, 'wrong length')
  t.end()
})

test('anonymous LDAPv3 bind', function (t) {
  const BIND = Buffer.alloc(14)
  BIND[0] = 0x30 // Sequence
  BIND[1] = 12 // len
  BIND[2] = 0x02 // ASN.1 Integer
  BIND[3] = 1 // len
  BIND[4] = 0x04 // msgid (make up 4)
  BIND[5] = 0x60 // Bind Request
  BIND[6] = 7 // len
  BIND[7] = 0x02 // ASN.1 Integer
  BIND[8] = 1 // len
  BIND[9] = 0x03 // v3
  BIND[10] = 0x04 // String (bind dn)
  BIND[11] = 0 // len
  BIND[12] = 0x80 // ContextSpecific (choice)
  BIND[13] = 0 // simple bind

  // Start testing ^^
  const ber = new BerReader(BIND)
  t.equal(ber.readSequence(), 48, 'Not an ASN.1 Sequence')
  t.equal(ber.length, 12, 'Message length should be 12')
  t.equal(ber.readInt(), 4, 'Message id should have been 4')
  t.equal(ber.readSequence(), 96, 'Bind Request should have been 96')
  t.equal(ber.length, 7, 'Bind length should have been 7')
  t.equal(ber.readInt(), 3, 'LDAP version should have been 3')
  t.equal(ber.readString(), '', 'Bind DN should have been empty')
  t.equal(ber.length, 0, 'string length should have been 0')
  t.equal(ber.readByte(), 0x80, 'Should have been ContextSpecific (choice)')
  t.equal(ber.readByte(), 0, 'Should have been simple bind')
  t.equal(null, ber.readByte(), 'Should be out of data')
  t.end()
})

test('long string', function (t) {
  const buf = Buffer.alloc(256)
  const s =
    '2;649;CN=Red Hat CS 71GA Demo,O=Red Hat CS 71GA Demo,C=US;' +
    'CN=RHCS Agent - admin01,UID=admin01,O=redhat,C=US [1] This is ' +
    'Teena Vradmin\'s description.'
  buf[0] = 0x04
  buf[1] = 0x81
  buf[2] = 0x94
  buf.write(s, 3)
  const ber = new BerReader(buf.slice(0, 3 + s.length))
  t.equal(ber.readString(), s)
  t.end()
})
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/types.js (35 lines, generated, vendored, Normal file)
@@ -0,0 +1,35 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

module.exports = {
  EOC: 0,
  Boolean: 1,
  Integer: 2,
  BitString: 3,
  OctetString: 4,
  Null: 5,
  OID: 6,
  ObjectDescriptor: 7,
  External: 8,
  Real: 9, // float
  Enumeration: 10,
  PDV: 11,
  Utf8String: 12,
  RelativeOID: 13,
  Sequence: 16,
  Set: 17,
  NumericString: 18,
  PrintableString: 19,
  T61String: 20,
  VideotexString: 21,
  IA5String: 22,
  UTCTime: 23,
  GeneralizedTime: 24,
  GraphicString: 25,
  VisibleString: 26,
  GeneralString: 28,
  UniversalString: 29,
  CharacterString: 30,
  BMPString: 31,
  Constructor: 32,
  Context: 128
}
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/writer.js (298 lines, generated, vendored, Normal file)
@@ -0,0 +1,298 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

const assert = require('assert')
const ASN1 = require('./types')
const errors = require('./errors')

// --- Globals

const newInvalidAsn1Error = errors.newInvalidAsn1Error

const DEFAULT_OPTS = {
  size: 1024,
  growthFactor: 8
}

// --- Helpers

function merge (from, to) {
  assert.ok(from)
  assert.equal(typeof (from), 'object')
  assert.ok(to)
  assert.equal(typeof (to), 'object')

  const keys = Object.getOwnPropertyNames(from)
  keys.forEach(function (key) {
    if (to[key]) { return }

    const value = Object.getOwnPropertyDescriptor(from, key)
    Object.defineProperty(to, key, value)
  })

  return to
}

// --- API

function Writer (options) {
  options = merge(DEFAULT_OPTS, options || {})

  this._buf = Buffer.alloc(options.size || 1024)
  this._size = this._buf.length
  this._offset = 0
  this._options = options

  // A list of offsets in the buffer where we need to insert
  // sequence tag/len pairs.
  this._seq = []
}

Object.defineProperty(Writer.prototype, Symbol.toStringTag, { value: 'BerWriter' })

Object.defineProperty(Writer.prototype, 'buffer', {
  get: function () {
    if (this._seq.length) { throw newInvalidAsn1Error(this._seq.length + ' unended sequence(s)') }

    return (this._buf.slice(0, this._offset))
  }
})

/**
 * Append a raw buffer to the current writer instance. No validation to
 * determine if the buffer represents a valid BER encoding is performed.
 *
 * @param {Buffer} buffer The buffer to append. If this is not a valid BER
 * sequence of data, it will invalidate the BER represented by the `BerWriter`.
 *
 * @throws If the input is not an instance of Buffer.
 */
Writer.prototype.appendBuffer = function appendBuffer (buffer) {
  if (Buffer.isBuffer(buffer) === false) {
    throw Error('buffer must be an instance of Buffer')
  }
  for (const b of buffer.values()) {
    this.writeByte(b)
  }
}

Writer.prototype.writeByte = function (b) {
  if (typeof (b) !== 'number') { throw new TypeError('argument must be a Number') }

  this._ensure(1)
  this._buf[this._offset++] = b
}

Writer.prototype.writeInt = function (i, tag) {
  if (typeof (i) !== 'number') { throw new TypeError('argument must be a Number') }
  if (typeof (tag) !== 'number') { tag = ASN1.Integer }

  let sz = 4

  while ((((i & 0xff800000) === 0) || ((i & 0xff800000) === 0xff800000 >> 0)) &&
    (sz > 1)) {
    sz--
    i <<= 8
  }

  if (sz > 4) { throw newInvalidAsn1Error('BER ints cannot be > 0xffffffff') }

  this._ensure(2 + sz)
  this._buf[this._offset++] = tag
  this._buf[this._offset++] = sz

  while (sz-- > 0) {
    this._buf[this._offset++] = ((i & 0xff000000) >>> 24)
    i <<= 8
  }
}

Writer.prototype.writeNull = function () {
  this.writeByte(ASN1.Null)
  this.writeByte(0x00)
}

Writer.prototype.writeEnumeration = function (i, tag) {
  if (typeof (i) !== 'number') { throw new TypeError('argument must be a Number') }
  if (typeof (tag) !== 'number') { tag = ASN1.Enumeration }

  return this.writeInt(i, tag)
}

Writer.prototype.writeBoolean = function (b, tag) {
  if (typeof (b) !== 'boolean') { throw new TypeError('argument must be a Boolean') }
  if (typeof (tag) !== 'number') { tag = ASN1.Boolean }

  this._ensure(3)
  this._buf[this._offset++] = tag
  this._buf[this._offset++] = 0x01
  this._buf[this._offset++] = b ? 0xff : 0x00
}

Writer.prototype.writeString = function (s, tag) {
  if (typeof (s) !== 'string') { throw new TypeError('argument must be a string (was: ' + typeof (s) + ')') }
  if (typeof (tag) !== 'number') { tag = ASN1.OctetString }

  const len = Buffer.byteLength(s)
  this.writeByte(tag)
  this.writeLength(len)
  if (len) {
    this._ensure(len)
    this._buf.write(s, this._offset)
    this._offset += len
  }
}

Writer.prototype.writeBuffer = function (buf, tag) {
  if (typeof (tag) !== 'number') { throw new TypeError('tag must be a number') }
  if (!Buffer.isBuffer(buf)) { throw new TypeError('argument must be a buffer') }

  this.writeByte(tag)
  this.writeLength(buf.length)
  this._ensure(buf.length)
  buf.copy(this._buf, this._offset, 0, buf.length)
  this._offset += buf.length
}

Writer.prototype.writeStringArray = function (strings) {
  if (Array.isArray(strings) === false) { throw new TypeError('argument must be an Array[String]') }

  const self = this
  strings.forEach(function (s) {
    self.writeString(s)
  })
}

// This is really to solve DER cases, but whatever for now
Writer.prototype.writeOID = function (s, tag) {
  if (typeof (s) !== 'string') { throw new TypeError('argument must be a string') }
  if (typeof (tag) !== 'number') { tag = ASN1.OID }

  if (!/^([0-9]+\.){3,}[0-9]+$/.test(s)) { throw new Error('argument is not a valid OID string') }

  function encodeOctet (bytes, octet) {
    if (octet < 128) {
      bytes.push(octet)
    } else if (octet < 16384) {
      bytes.push((octet >>> 7) | 0x80)
      bytes.push(octet & 0x7F)
    } else if (octet < 2097152) {
      bytes.push((octet >>> 14) | 0x80)
      bytes.push(((octet >>> 7) | 0x80) & 0xFF)
      bytes.push(octet & 0x7F)
    } else if (octet < 268435456) {
      bytes.push((octet >>> 21) | 0x80)
      bytes.push(((octet >>> 14) | 0x80) & 0xFF)
      bytes.push(((octet >>> 7) | 0x80) & 0xFF)
      bytes.push(octet & 0x7F)
    } else {
      bytes.push(((octet >>> 28) | 0x80) & 0xFF)
      bytes.push(((octet >>> 21) | 0x80) & 0xFF)
      bytes.push(((octet >>> 14) | 0x80) & 0xFF)
      bytes.push(((octet >>> 7) | 0x80) & 0xFF)
      bytes.push(octet & 0x7F)
    }
  }

  const tmp = s.split('.')
  const bytes = []
  bytes.push(parseInt(tmp[0], 10) * 40 + parseInt(tmp[1], 10))
  tmp.slice(2).forEach(function (b) {
    encodeOctet(bytes, parseInt(b, 10))
  })

  const self = this
  this._ensure(2 + bytes.length)
  this.writeByte(tag)
  this.writeLength(bytes.length)
  bytes.forEach(function (b) {
    self.writeByte(b)
  })
}

Writer.prototype.writeLength = function (len) {
  if (typeof (len) !== 'number') { throw new TypeError('argument must be a Number') }

  this._ensure(4)

  if (len <= 0x7f) {
    this._buf[this._offset++] = len
  } else if (len <= 0xff) {
    this._buf[this._offset++] = 0x81
    this._buf[this._offset++] = len
  } else if (len <= 0xffff) {
    this._buf[this._offset++] = 0x82
    this._buf[this._offset++] = len >> 8
    this._buf[this._offset++] = len
  } else if (len <= 0xffffff) {
    this._buf[this._offset++] = 0x83
    this._buf[this._offset++] = len >> 16
    this._buf[this._offset++] = len >> 8
    this._buf[this._offset++] = len
  } else {
    throw newInvalidAsn1Error('Length too long (> 4 bytes)')
  }
}

Writer.prototype.startSequence = function (tag) {
  if (typeof (tag) !== 'number') { tag = ASN1.Sequence | ASN1.Constructor }

  this.writeByte(tag)
  this._seq.push(this._offset)
  this._ensure(3)
  this._offset += 3
}

Writer.prototype.endSequence = function () {
  const seq = this._seq.pop()
  const start = seq + 3
  const len = this._offset - start

  if (len <= 0x7f) {
    this._shift(start, len, -2)
    this._buf[seq] = len
  } else if (len <= 0xff) {
    this._shift(start, len, -1)
    this._buf[seq] = 0x81
    this._buf[seq + 1] = len
  } else if (len <= 0xffff) {
    this._buf[seq] = 0x82
    this._buf[seq + 1] = len >> 8
    this._buf[seq + 2] = len
  } else if (len <= 0xffffff) {
    this._shift(start, len, 1)
    this._buf[seq] = 0x83
    this._buf[seq + 1] = len >> 16
    this._buf[seq + 2] = len >> 8
    this._buf[seq + 3] = len
  } else {
    throw newInvalidAsn1Error('Sequence too long')
  }
}

Writer.prototype._shift = function (start, len, shift) {
  assert.ok(start !== undefined)
  assert.ok(len !== undefined)
  assert.ok(shift)

  this._buf.copy(this._buf, start + shift, start, start + len)
  this._offset += shift
}

Writer.prototype._ensure = function (len) {
  assert.ok(len)

  if (this._size - this._offset < len) {
    let sz = this._size * this._options.growthFactor
    if (sz - this._offset < len) { sz += len }

    const buf = Buffer.alloc(sz)

    this._buf.copy(buf, 0, 0, this._offset)
    this._buf = buf
    this._size = sz
  }
}

// --- Exported API

module.exports = Writer
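`writeLength` in the file above is the encoder's half of the definite-length rule that `readLength` in `reader.js` decodes: lengths up to 0x7f are a single byte, larger lengths get a 0x81/0x82/0x83 prefix byte followed by the big-endian length octets. A standalone sketch of the same rule (hypothetical helper, names are our own; like the vendored writer it stops at three length octets):

```javascript
// Encode a BER definite length, mirroring Writer.prototype.writeLength:
// short form up to 0x7f, otherwise 0x80 | numOctets followed by the
// big-endian length octets.
function encodeLength (len) {
  if (len <= 0x7f) return Buffer.from([len])
  if (len <= 0xff) return Buffer.from([0x81, len])
  if (len <= 0xffff) return Buffer.from([0x82, len >> 8, len & 0xff])
  if (len <= 0xffffff) return Buffer.from([0x83, len >> 16, (len >> 8) & 0xff, len & 0xff])
  throw new Error('length too large for this sketch')
}

console.log(encodeLength(38))   // one byte, short form
console.log(encodeLength(148))  // 0x81 prefix plus one octet
console.log(encodeLength(300))  // 0x82 prefix plus two octets
```

Note that `endSequence` cannot know the length in advance, which is why `startSequence` reserves three bytes and `endSequence` back-patches the header and `_shift`s the content when fewer (or more) length octets turn out to be needed.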
344
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/writer.test.js
generated
vendored
Normal file
344
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/ber/writer.test.js
generated
vendored
Normal file
@ -0,0 +1,344 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

const { test } = require('tap')
const BerWriter = require('./writer')

test('write byte', function (t) {
  const writer = new BerWriter()

  writer.writeByte(0xC2)
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 1, 'Wrong length')
  t.equal(ber[0], 0xC2, 'value wrong')

  t.end()
})

test('write 1 byte int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(0x7f)
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 3, 'Wrong length for an int: ' + ber.length)
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong (2) -> ' + ber[0])
  t.equal(ber[1], 0x01, 'length wrong(1) -> ' + ber[1])
  t.equal(ber[2], 0x7f, 'value wrong(3) -> ' + ber[2])

  t.end()
})

test('write 2 byte int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(0x7ffe)
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 4, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x02, 'length wrong')
  t.equal(ber[2], 0x7f, 'value wrong (byte 1)')
  t.equal(ber[3], 0xfe, 'value wrong (byte 2)')

  t.end()
})

test('write 3 byte int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(0x7ffffe)
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 5, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x03, 'length wrong')
  t.equal(ber[2], 0x7f, 'value wrong (byte 1)')
  t.equal(ber[3], 0xff, 'value wrong (byte 2)')
  t.equal(ber[4], 0xfe, 'value wrong (byte 3)')

  t.end()
})

test('write 4 byte int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(0x7ffffffe)
  const ber = writer.buffer

  t.ok(ber)

  t.equal(ber.length, 6, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x04, 'length wrong')
  t.equal(ber[2], 0x7f, 'value wrong (byte 1)')
  t.equal(ber[3], 0xff, 'value wrong (byte 2)')
  t.equal(ber[4], 0xff, 'value wrong (byte 3)')
  t.equal(ber[5], 0xfe, 'value wrong (byte 4)')

  t.end()
})

test('write 1 byte negative int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(-128)
  const ber = writer.buffer

  t.ok(ber)

  t.equal(ber.length, 3, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x01, 'length wrong')
  t.equal(ber[2], 0x80, 'value wrong (byte 1)')

  t.end()
})

test('write 2 byte negative int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(-22400)
  const ber = writer.buffer

  t.ok(ber)

  t.equal(ber.length, 4, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x02, 'length wrong')
  t.equal(ber[2], 0xa8, 'value wrong (byte 1)')
  t.equal(ber[3], 0x80, 'value wrong (byte 2)')

  t.end()
})

test('write 3 byte negative int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(-481653)
  const ber = writer.buffer

  t.ok(ber)

  t.equal(ber.length, 5, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x03, 'length wrong')
  t.equal(ber[2], 0xf8, 'value wrong (byte 1)')
  t.equal(ber[3], 0xa6, 'value wrong (byte 2)')
  t.equal(ber[4], 0x8b, 'value wrong (byte 3)')

  t.end()
})

test('write 4 byte negative int', function (t) {
  const writer = new BerWriter()

  writer.writeInt(-1522904131)
  const ber = writer.buffer

  t.ok(ber)

  t.equal(ber.length, 6, 'Wrong length for an int')
  t.equal(ber[0], 0x02, 'ASN.1 tag wrong')
  t.equal(ber[1], 0x04, 'length wrong')
  t.equal(ber[2], 0xa5, 'value wrong (byte 1)')
  t.equal(ber[3], 0x3a, 'value wrong (byte 2)')
  t.equal(ber[4], 0x53, 'value wrong (byte 3)')
  t.equal(ber[5], 0xbd, 'value wrong (byte 4)')

  t.end()
})

test('write boolean', function (t) {
  const writer = new BerWriter()

  writer.writeBoolean(true)
  writer.writeBoolean(false)
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 6, 'Wrong length')
  t.equal(ber[0], 0x01, 'tag wrong')
  t.equal(ber[1], 0x01, 'length wrong')
  t.equal(ber[2], 0xff, 'value wrong')
  t.equal(ber[3], 0x01, 'tag wrong')
  t.equal(ber[4], 0x01, 'length wrong')
  t.equal(ber[5], 0x00, 'value wrong')

  t.end()
})

test('write string', function (t) {
  const writer = new BerWriter()
  writer.writeString('hello world')
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 13, 'wrong length')
  t.equal(ber[0], 0x04, 'wrong tag')
  t.equal(ber[1], 11, 'wrong length')
  t.equal(ber.slice(2).toString('utf8'), 'hello world', 'wrong value')

  t.end()
})

test('write buffer', function (t) {
  const writer = new BerWriter()
  // write some stuff to start with
  writer.writeString('hello world')
  let ber = writer.buffer
  const buf = Buffer.from([0x04, 0x0b, 0x30, 0x09, 0x02, 0x01, 0x0f, 0x01, 0x01,
    0xff, 0x01, 0x01, 0xff])
  writer.writeBuffer(buf.slice(2, buf.length), 0x04)
  ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 26, 'wrong length')
  t.equal(ber[0], 0x04, 'wrong tag')
  t.equal(ber[1], 11, 'wrong length')
  t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value')
  t.equal(ber[13], buf[0], 'wrong tag')
  t.equal(ber[14], buf[1], 'wrong length')
  for (let i = 13, j = 0; i < ber.length && j < buf.length; i++, j++) {
    t.equal(ber[i], buf[j], 'buffer contents not identical')
  }
  t.end()
})

test('write string array', function (t) {
  const writer = new BerWriter()
  writer.writeStringArray(['hello world', 'fubar!'])
  const ber = writer.buffer

  t.ok(ber)

  t.equal(ber.length, 21, 'wrong length')
  t.equal(ber[0], 0x04, 'wrong tag')
  t.equal(ber[1], 11, 'wrong length')
  t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value')

  t.equal(ber[13], 0x04, 'wrong tag')
  t.equal(ber[14], 6, 'wrong length')
  t.equal(ber.slice(15).toString('utf8'), 'fubar!', 'wrong value')

  t.end()
})

test('resize internal buffer', function (t) {
  const writer = new BerWriter({ size: 2 })
  writer.writeString('hello world')
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 13, 'wrong length')
  t.equal(ber[0], 0x04, 'wrong tag')
  t.equal(ber[1], 11, 'wrong length')
  t.equal(ber.slice(2).toString('utf8'), 'hello world', 'wrong value')

  t.end()
})

test('sequence', function (t) {
  const writer = new BerWriter({ size: 25 })
  writer.startSequence()
  writer.writeString('hello world')
  writer.endSequence()
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 15, 'wrong length')
  t.equal(ber[0], 0x30, 'wrong tag')
  t.equal(ber[1], 13, 'wrong length')
  t.equal(ber[2], 0x04, 'wrong tag')
  t.equal(ber[3], 11, 'wrong length')
  t.equal(ber.slice(4).toString('utf8'), 'hello world', 'wrong value')

  t.end()
})

test('nested sequence', function (t) {
  const writer = new BerWriter({ size: 25 })
  writer.startSequence()
  writer.writeString('hello world')
  writer.startSequence()
  writer.writeString('hello world')
  writer.endSequence()
  writer.endSequence()
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 30, 'wrong length')
  t.equal(ber[0], 0x30, 'wrong tag')
  t.equal(ber[1], 28, 'wrong length')
  t.equal(ber[2], 0x04, 'wrong tag')
  t.equal(ber[3], 11, 'wrong length')
  t.equal(ber.slice(4, 15).toString('utf8'), 'hello world', 'wrong value')
  t.equal(ber[15], 0x30, 'wrong tag')
  t.equal(ber[16], 13, 'wrong length')
  t.equal(ber[17], 0x04, 'wrong tag')
  t.equal(ber[18], 11, 'wrong length')
  t.equal(ber.slice(19, 30).toString('utf8'), 'hello world', 'wrong value')

  t.end()
})

test('LDAP bind message', function (t) {
  const dn = 'cn=foo,ou=unit,o=test'
  const writer = new BerWriter()
  writer.startSequence()
  writer.writeInt(3) // msgid = 3
  writer.startSequence(0x60) // ldap bind
  writer.writeInt(3) // ldap v3
  writer.writeString(dn)
  writer.writeByte(0x80)
  writer.writeByte(0x00)
  writer.endSequence()
  writer.endSequence()
  const ber = writer.buffer

  t.ok(ber)
  t.equal(ber.length, 35, 'wrong length (buffer)')
  t.equal(ber[0], 0x30, 'wrong tag')
  t.equal(ber[1], 33, 'wrong length')
  t.equal(ber[2], 0x02, 'wrong tag')
  t.equal(ber[3], 1, 'wrong length')
  t.equal(ber[4], 0x03, 'wrong value')
  t.equal(ber[5], 0x60, 'wrong tag')
  t.equal(ber[6], 28, 'wrong length')
  t.equal(ber[7], 0x02, 'wrong tag')
  t.equal(ber[8], 1, 'wrong length')
  t.equal(ber[9], 0x03, 'wrong value')
  t.equal(ber[10], 0x04, 'wrong tag')
  t.equal(ber[11], dn.length, 'wrong length')
  t.equal(ber.slice(12, 33).toString('utf8'), dn, 'wrong value')
  t.equal(ber[33], 0x80, 'wrong tag')
  t.equal(ber[34], 0x00, 'wrong len')

  t.end()
})

test('Write OID', function (t) {
  const oid = '1.2.840.113549.1.1.1'
  const writer = new BerWriter()
  writer.writeOID(oid)

  const expected = Buffer.from([0x06, 0x09, 0x2a, 0x86,
    0x48, 0x86, 0xf7, 0x0d,
    0x01, 0x01, 0x01])
  const ber = writer.buffer
  t.equal(ber.compare(expected), 0)

  t.end()
})

test('appendBuffer appends a buffer', async t => {
  const expected = Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f, 0x66, 0x6f, 0x6f])
  const writer = new BerWriter()
  writer.writeString('foo')
  writer.appendBuffer(Buffer.from('foo'))
  t.equal(Buffer.compare(writer.buffer, expected), 0)
})
18
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/lib/index.js
generated
vendored
Normal file
@@ -0,0 +1,18 @@
// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved.

// If you have no idea what ASN.1 or BER is, see this:
// https://web.archive.org/web/20220314051854/http://luca.ntop.org/Teaching/Appunti/asn1.html

const Ber = require('./ber/index')

// --- Exported API

module.exports = {

  Ber: Ber,

  BerReader: Ber.Reader,

  BerWriter: Ber.Writer

}
34
node_modules/@ldapjs/controls/node_modules/@ldapjs/asn1/package.json
generated
vendored
Normal file
@@ -0,0 +1,34 @@
{
  "originalAuthor": "Joyent (joyent.com)",
  "contributors": [
    "Mark Cavage <mcavage@gmail.com>",
    "David Gwynne <loki@animata.net>",
    "Yunong Xiao <yunong@joyent.com>",
    "Alex Wilson <alex.wilson@joyent.com>"
  ],
  "name": "@ldapjs/asn1",
  "description": "Contains parsers and serializers for ASN.1 (currently BER only)",
  "version": "1.2.0",
  "repository": {
    "type": "git",
    "url": "git://github.com/ldapjs/asn1.git"
  },
  "main": "lib/index.js",
  "devDependencies": {
    "@fastify/pre-commit": "^2.0.2",
    "standard": "^16.0.4",
    "tap": "^16.0.1"
  },
  "scripts": {
    "test": "tap --no-coverage-report -R terse",
    "test:cov": "tap -R terse",
    "test:cov:html": "tap -R terse --coverage-report=html",
    "test:watch": "tap -w --no-coverage-report -R terse",
    "lint": "standard"
  },
  "license": "MIT",
  "pre-commit": [
    "lint",
    "test"
  ]
}
45
node_modules/@ldapjs/controls/package.json
generated
vendored
Normal file
@@ -0,0 +1,45 @@
{
  "name": "@ldapjs/controls",
  "version": "2.1.0",
  "description": "LDAP control objects",
  "main": "index.js",
  "scripts": {
    "lint": "eslint .",
    "lint:ci": "eslint .",
    "test": "tap --no-coverage-report -R terse",
    "test:cov": "tap -R terse",
    "test:cov:html": "tap -R terse --coverage-report=html",
    "test:watch": "tap -w --no-coverage-report -R terse"
  },
  "repository": {
    "type": "git",
    "url": "git+ssh://git@github.com/ldapjs/controls.git"
  },
  "keywords": [
    "ldapjs"
  ],
  "author": "James Sumners",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/ldapjs/controls/issues"
  },
  "homepage": "https://github.com/ldapjs/controls#readme",
  "devDependencies": {
    "@fastify/pre-commit": "^2.0.2",
    "eslint": "^8.34.0",
    "eslint-config-standard": "^17.0.0",
    "eslint-plugin-import": "^2.27.5",
    "eslint-plugin-n": "^15.6.1",
    "eslint-plugin-node": "^11.1.0",
    "eslint-plugin-promise": "^6.1.1",
    "tap": "^16.3.4"
  },
  "dependencies": {
    "@ldapjs/asn1": "^1.2.0",
    "@ldapjs/protocol": "^1.2.1"
  },
  "pre-commit": [
    "lint",
    "test"
  ]
}
13
node_modules/@ldapjs/dn/.eslintrc
generated
vendored
Normal file
@@ -0,0 +1,13 @@
{
  "parserOptions": {
    "ecmaVersion": "latest"
  },

  "extends": [
    "standard"
  ],

  "rules": {
    "no-labels": ["error", {"allowLoop": true}]
  }
}
10
node_modules/@ldapjs/dn/.github/workflows/main.yml
generated
vendored
Normal file
@@ -0,0 +1,10 @@
name: "CI"
on:
  pull_request:
  push:
    branches:
      - master

jobs:
  call-core-ci:
    uses: ldapjs/.github/.github/workflows/node-ci.yml@main
5
node_modules/@ldapjs/dn/.taprc.yaml
generated
vendored
Normal file
@@ -0,0 +1,5 @@
reporter: terse
coverage-map: coverage-map.js

files:
  - 'lib/**/*.test.js'
21
node_modules/@ldapjs/dn/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
Copyright (c) 2014 Patrick Mooney. All rights reserved.
Copyright (c) 2014 Mark Cavage, Inc. All rights reserved.
Copyright (c) 2022 The LDAPJS Collaborators.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE
8
node_modules/@ldapjs/dn/README.md
generated
vendored
Normal file
@@ -0,0 +1,8 @@
# dn

Provides objects for representing and working with LDAP distinguished name
strings as defined by [RFC 4514](https://www.rfc-editor.org/rfc/rfc4514).

## License

MIT.
3
node_modules/@ldapjs/dn/coverage-map.js
generated
vendored
Normal file
@@ -0,0 +1,3 @@
'use strict'

module.exports = testFile => testFile.replace(/\.test\.js$/, '.js')
6
node_modules/@ldapjs/dn/index.js
generated
vendored
Normal file
@@ -0,0 +1,6 @@
'use strict'

module.exports = {
  DN: require('./lib/dn'),
  RDN: require('./lib/rdn')
}
11
node_modules/@ldapjs/dn/lib/deprecations.js
generated
vendored
Normal file
@@ -0,0 +1,11 @@
'use strict'

const warning = require('process-warning')()
const clazz = 'LdapjsDnWarning'

warning.create(clazz, 'LDAP_DN_DEP_001', 'attribute options is deprecated and are ignored')
warning.create(clazz, 'LDAP_DN_DEP_002', '.format() is deprecated. Use .toString() instead')
warning.create(clazz, 'LDAP_DN_DEP_003', '.set() is deprecated. Use .setAttribute() instead')
warning.create(clazz, 'LDAP_DN_DEP_004', '.setFormat() is deprecated. Options will be ignored')

module.exports = warning
336
node_modules/@ldapjs/dn/lib/dn.js
generated
vendored
Normal file
@@ -0,0 +1,336 @@
'use strict'

const warning = require('./deprecations')
const RDN = require('./rdn')
const parseString = require('./utils/parse-string')

/**
 * Implements distinguished name strings as described in
 * https://www.rfc-editor.org/rfc/rfc4514 as an object.
 * This is the primary implementation for parsing and generating DN strings.
 *
 * @example
 * const dn = new DN({rdns: [{cn: 'jdoe', givenName: 'John'}] })
 * dn.toString() // 'cn=jdoe+givenName=John'
 */
class DN {
  #rdns = []

  /**
   * @param {object} input
   * @param {RDN[]} [input.rdns=[]] A set of RDN objects that define the DN.
   * Remember that DNs are in reverse domain order. Thus, the target RDN must
   * be the first item and the top-level RDN the last item.
   *
   * @throws When the provided `rdns` array is invalid.
   */
  constructor ({ rdns = [] } = {}) {
    if (Array.isArray(rdns) === false) {
      throw Error('rdns must be an array')
    }

    const hasNonRdn = rdns.some(
      r => RDN.isRdn(r) === false
    )
    if (hasNonRdn === true) {
      throw Error('rdns must be an array of RDN objects')
    }

    Array.prototype.push.apply(
      this.#rdns,
      rdns.map(r => {
        if (Object.prototype.toString.call(r) === '[object LdapRdn]') {
          return r
        }
        return new RDN(r)
      })
    )
  }

  get [Symbol.toStringTag] () {
    return 'LdapDn'
  }

  /**
   * The number of RDNs that make up the DN.
   *
   * @returns {number}
   */
  get length () {
    return this.#rdns.length
  }

  /**
   * Determine if the current instance is the child of another DN instance or
   * DN string.
   *
   * @param {DN|string} dn
   *
   * @returns {boolean}
   */
  childOf (dn) {
    if (typeof dn === 'string') {
      const parsedDn = DN.fromString(dn)
      return parsedDn.parentOf(this)
    }
    return dn.parentOf(this)
  }

  /**
   * Get a new instance that is a replica of the current instance.
   *
   * @returns {DN}
   */
  clone () {
    return new DN({ rdns: this.#rdns })
  }

  /**
   * Determine if the instance is equal to another DN.
   *
   * @param {DN|string} dn
   *
   * @returns {boolean}
   */
  equals (dn) {
    if (typeof dn === 'string') {
      const parsedDn = DN.fromString(dn)
      return parsedDn.equals(this)
    }

    if (this.length !== dn.length) return false

    for (let i = 0; i < this.length; i += 1) {
      if (this.#rdns[i].equals(dn.rdnAt(i)) === false) {
        return false
      }
    }

    return true
  }

  /**
   * @deprecated Use .toString() instead.
   *
   * @returns {string}
   */
  format () {
    warning.emit('LDAP_DN_DEP_002')
    return this.toString()
  }

  /**
   * Determine if the instance has any RDNs defined.
   *
   * @returns {boolean}
   */
  isEmpty () {
    return this.#rdns.length === 0
  }

  /**
   * Get a DN representation of the parent of this instance.
   *
   * @returns {DN|undefined}
   */
  parent () {
    if (this.length === 0) return undefined
    const save = this.shift()
    const dn = new DN({ rdns: this.#rdns })
    this.unshift(save)
    return dn
  }

  /**
   * Determine if the instance is the parent of a given DN instance or DN
   * string.
   *
   * @param {DN|string} dn
   *
   * @returns {boolean}
   */
  parentOf (dn) {
    if (typeof dn === 'string') {
      const parsedDn = DN.fromString(dn)
      return this.parentOf(parsedDn)
    }

    if (this.length >= dn.length) {
      // If we have more RDNs in our set then we must be a descendent at least.
      return false
    }

    const numberOfElementsDifferent = dn.length - this.length
    for (let i = this.length - 1; i >= 0; i -= 1) {
      const myRdn = this.#rdns[i]
      const theirRdn = dn.rdnAt(i + numberOfElementsDifferent)
      if (myRdn.equals(theirRdn) === false) {
        return false
      }
    }

    return true
  }

  /**
   * Removes the last RDN from the list and returns it. This alters the
   * instance.
   *
   * @returns {RDN}
   */
  pop () {
    return this.#rdns.pop()
  }

  /**
   * Adds a new RDN to the end of the list (i.e. the "top most" RDN in the
   * directory path) and returns the new RDN count.
   *
   * @param {RDN} rdn
   *
   * @returns {number}
   *
   * @throws When the input is not a valid RDN.
   */
  push (rdn) {
    if (Object.prototype.toString.call(rdn) !== '[object LdapRdn]') {
      throw Error('rdn must be a RDN instance')
    }
    return this.#rdns.push(rdn)
  }

  /**
   * Return the RDN at the provided index in the list of RDNs associated with
   * this instance.
   *
   * @param {number} index
   *
   * @returns {RDN}
   */
  rdnAt (index) {
    return this.#rdns[index]
  }

  /**
   * Reverse the RDNs list such that the first element becomes the last, and
   * the last becomes the first. This is useful when the RDNs were added in the
   * opposite order of how they should have been.
   *
   * This is an in-place operation. The instance is changed as a result of
   * this operation.
   *
   * @returns {DN} The current instance (i.e. this method is chainable).
   */
  reverse () {
    this.#rdns.reverse()
    return this
  }

  /**
   * @deprecated Formatting options are not supported.
   */
  setFormat () {
    warning.emit('LDAP_DN_DEP_004')
  }

  /**
   * Remove the first RDN from the set of RDNs and return it.
   *
   * @returns {RDN}
   */
  shift () {
    return this.#rdns.shift()
  }

  /**
   * Render the DN instance as a spec compliant DN string.
   *
   * @returns {string}
   */
  toString () {
    let result = ''
    for (const rdn of this.#rdns) {
      const rdnString = rdn.toString()
      result += `,${rdnString}`
    }
    return result.substring(1)
  }

  /**
   * Adds an RDN to the beginning of the RDN list and returns the new length.
   *
   * @param {RDN} rdn
   *
   * @returns {number}
   *
   * @throws When the RDN is invalid.
   */
  unshift (rdn) {
    if (Object.prototype.toString.call(rdn) !== '[object LdapRdn]') {
      throw Error('rdn must be a RDN instance')
    }
    return this.#rdns.unshift(rdn)
  }

  /**
   * Determine if an object is an instance of {@link DN} or is at least
   * a DN-like object. It is safer to perform a `toString` check.
   *
   * @example Valid Instance
   * const dn = new DN()
   * DN.isDn(dn) // true
   *
   * @example DN-like Instance
   * let dn = { rdns: [{name: 'cn', value: 'foo'}] }
   * DN.isDn(dn) // true
   *
   * dn = { rdns: [{cn: 'foo', sn: 'bar'}, {dc: 'example'}, {dc: 'com'}]}
   * DN.isDn(dn) // true
   *
   * @example Preferred Check
   * let dn = new DN()
   * Object.prototype.toString.call(dn) === '[object LdapDn]' // true
   *
   * dn = { rdns: [{name: 'cn', value: 'foo'}] }
   * Object.prototype.toString.call(dn) === '[object LdapDn]' // false
   *
   * @param {object} dn
   * @returns {boolean}
   */
  static isDn (dn) {
    if (Object.prototype.toString.call(dn) === '[object LdapDn]') {
      return true
    }
    if (
      Object.prototype.toString.call(dn) !== '[object Object]' ||
      Array.isArray(dn.rdns) === false
    ) {
      return false
    }
    if (dn.rdns.some(dn => RDN.isRdn(dn) === false) === true) {
      return false
    }

    return true
  }

  /**
   * Parses a DN string and returns a new {@link DN} instance.
   *
   * @example
   * const dn = DN.fromString('cn=foo,dc=example,dc=com')
   * DN.isDn(dn) // true
   *
   * @param {string} dnString
   *
   * @returns {DN}
   *
   * @throws If the string is not parseable.
   */
  static fromString (dnString) {
    const rdns = parseString(dnString)
    return new DN({ rdns })
  }
}

module.exports = DN
527
node_modules/@ldapjs/dn/lib/dn.test.js
generated
vendored
Normal file
@@ -0,0 +1,527 @@
'use strict'

const tap = require('tap')
const warning = require('./deprecations')
const RDN = require('./rdn')
const DN = require('./dn')

// Silence the standard warning logs. We will test the messages explicitly.
process.removeAllListeners('warning')

tap.test('constructor', t => {
  t.test('throws for non-array', async t => {
    t.throws(
      () => new DN({ rdns: 42 }),
      Error('rdns must be an array')
    )
  })

  t.test('throws for non-rdn in array', async t => {
    const rdns = [
      new RDN(),
      { 'non-string-value': 42 },
      new RDN()
    ]
    t.throws(
      () => new DN({ rdns })
    )
  })

  t.test('handles mixed array', async t => {
    const rdns = [
      { cn: 'foo' },
      new RDN({ dc: 'example' }),
      new RDN({ dc: 'com' })
    ]
    const dn = new DN({ rdns })
    t.equal(dn.length, 3)
    t.equal(dn.toString(), 'cn=foo,dc=example,dc=com')
  })

  t.end()
})

tap.test('childOf', t => {
  t.test('false if we are shallower', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = new DN({
      rdns: [
        new RDN({ cn: 'foo' }),
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.childOf(target), false)
  })

  t.test('false if differing path', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = new DN({
      rdns: [
        new RDN({ dc: 'ldapjs' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.childOf(target), false)
  })

  t.test('true if we are a child', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = new DN({
      rdns: [
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.childOf(target), true)
  })

  t.test('handles string input', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = 'dc=example,dc=com'
    t.equal(dn.childOf(target), true)
  })

  t.end()
})

tap.test('clone', t => {
  t.test('returns a copy', async t => {
    const rdns = [new RDN({ cn: 'foo' })]
    const src = new DN({ rdns })
    const clone = src.clone()

    t.equal(src.length, clone.length)
    t.equal(src.toString(), clone.toString())
  })

  t.end()
})

tap.test('equals', t => {
  t.test('false for non-equal length', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = new DN({
      rdns: [
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.equals(target), false)
  })

  t.test('false for non-equal paths', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = new DN({
      rdns: [
        new RDN({ ou: 'computers' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.equals(target), false)
  })

  t.test('true for equal paths', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.equals(target), true)
  })

  t.test('handles string input', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const target = 'ou=people,dc=example,dc=com'
    t.equal(dn.equals(target), true)
  })

  t.end()
})

tap.test('format', t => {
  t.test('emits warning', t => {
    process.on('warning', handler)
    t.teardown(async () => {
      process.removeListener('warning', handler)
      warning.emitted.set('LDAP_DN_DEP_002', false)
    })

    const rdns = [{ cn: 'foo' }]
    const dnString = (new DN({ rdns })).format()
    t.equal(dnString, 'cn=foo')

    function handler (error) {
      t.equal(error.message, '.format() is deprecated. Use .toString() instead')
      t.end()
    }
  })

  t.end()
})

tap.test('isEmpty', t => {
  t.test('returns correct result', async t => {
    let dn = new DN()
    t.equal(dn.isEmpty(), true)

    dn = new DN({
      rdns: [new RDN({ cn: 'foo' })]
    })
    t.equal(dn.isEmpty(), false)
  })

  t.end()
})

tap.test('parent', t => {
  t.test('undefined for an empty DN', async t => {
    const dn = new DN()
    const parent = dn.parent()
    t.equal(parent, undefined)
  })

  t.test('returns correct DN', async t => {
    const dn = new DN({
      rdns: [
        new RDN({ cn: 'jdoe', givenName: 'John' }),
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const parent = dn.parent()
    t.equal(parent.toString(), 'ou=people,dc=example,dc=com')
  })

  t.end()
})

tap.test('parentOf', t => {
  t.test('false if we are deeper', async t => {
    const target = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const dn = new DN({
      rdns: [
        new RDN({ cn: 'foo' }),
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.parentOf(target), false)
  })

  t.test('false if differing path', async t => {
    const target = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
        new RDN({ dc: 'com' })
      ]
    })
    const dn = new DN({
      rdns: [
        new RDN({ dc: 'ldapjs' }),
        new RDN({ dc: 'com' })
      ]
    })
    t.equal(dn.parentOf(target), false)
  })

  t.test('true if we are a parent', async t => {
    const target = new DN({
      rdns: [
        new RDN({ ou: 'people' }),
        new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
t.equal(dn.parentOf(target), true)
|
||||
})
|
||||
|
||||
t.test('handles string input', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
const target = 'ou=people,dc=example,dc=com'
|
||||
t.equal(dn.parentOf(target), true)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('pop', t => {
|
||||
t.test('returns the last element and shortens the list', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ cn: 'foo' }),
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example,dc=com')
|
||||
|
||||
const rdn = dn.pop()
|
||||
t.equal(rdn.toString(), 'dc=com')
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('push', t => {
|
||||
t.test('throws for bad input', async t => {
|
||||
const dn = new DN()
|
||||
t.throws(
|
||||
() => dn.push({ cn: 'foo' }),
|
||||
Error('rdn must be a RDN instance')
|
||||
)
|
||||
})
|
||||
|
||||
t.test('adds to the front of the list', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ cn: 'foo' }),
|
||||
new RDN({ dc: 'example' })
|
||||
|
||||
]
|
||||
})
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example')
|
||||
|
||||
const newLength = dn.push(new RDN({ dc: 'com' }))
|
||||
t.equal(newLength, 3)
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example,dc=com')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('rdnAt', t => {
|
||||
t.test('returns correct RDN', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ cn: 'jdoe', givenName: 'John' }),
|
||||
new RDN({ ou: 'people' }),
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
const rdn = dn.rdnAt(1)
|
||||
t.equal(rdn.toString(), 'ou=people')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('reverse', t => {
|
||||
t.test('reverses the list', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ dc: 'com' }),
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ cn: 'foo' })
|
||||
]
|
||||
})
|
||||
t.equal(dn.toString(), 'dc=com,dc=example,cn=foo')
|
||||
|
||||
const result = dn.reverse()
|
||||
t.equal(dn, result)
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example,dc=com')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('setFormat', t => {
|
||||
t.test('emits warning', t => {
|
||||
process.on('warning', handler)
|
||||
t.teardown(async () => {
|
||||
process.removeListener('warning', handler)
|
||||
warning.emitted.set('LDAP_DN_DEP_004', false)
|
||||
})
|
||||
|
||||
const rdns = [{ cn: 'foo' }]
|
||||
new DN({ rdns }).setFormat()
|
||||
|
||||
function handler (error) {
|
||||
t.equal(error.message, '.setFormat() is deprecated. Options will be ignored')
|
||||
t.end()
|
||||
}
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('shift', t => {
|
||||
t.test('returns the first element and shortens the list', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ cn: 'foo' }),
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example,dc=com')
|
||||
|
||||
const rdn = dn.shift()
|
||||
t.equal(rdn.toString(), 'cn=foo')
|
||||
t.equal(dn.toString(), 'dc=example,dc=com')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('toString', t => {
|
||||
t.test('renders correctly', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ cn: 'jdoe', givenName: 'John' }),
|
||||
new RDN({ ou: 'people' }),
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
t.equal(dn.toString(), 'cn=jdoe+givenName=John,ou=people,dc=example,dc=com')
|
||||
})
|
||||
|
||||
t.test('empty string for empty DN', async t => {
|
||||
const dn = new DN()
|
||||
t.equal(dn.toString(), '')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('unshift', t => {
|
||||
t.test('throws for bad input', async t => {
|
||||
const dn = new DN()
|
||||
t.throws(
|
||||
() => dn.unshift({ cn: 'foo' }),
|
||||
Error('rdn must be a RDN instance')
|
||||
)
|
||||
})
|
||||
|
||||
t.test('adds to the front of the list', async t => {
|
||||
const dn = new DN({
|
||||
rdns: [
|
||||
new RDN({ dc: 'example' }),
|
||||
new RDN({ dc: 'com' })
|
||||
]
|
||||
})
|
||||
t.equal(dn.toString(), 'dc=example,dc=com')
|
||||
|
||||
const newLength = dn.unshift(new RDN({ cn: 'foo' }))
|
||||
t.equal(newLength, 3)
|
||||
t.equal(dn.toString(), 'cn=foo,dc=example,dc=com')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('#isDn', t => {
|
||||
t.test('true for instance', async t => {
|
||||
const dn = new DN()
|
||||
t.equal(DN.isDn(dn), true)
|
||||
})
|
||||
|
||||
t.test('false for non-object', async t => {
|
||||
t.equal(DN.isDn(42), false)
|
||||
})
|
||||
|
||||
t.test('false for non-array rdns', async t => {
|
||||
const input = { rdns: 42 }
|
||||
t.equal(DN.isDn(input), false)
|
||||
})
|
||||
|
||||
t.test('false for bad rdn', async t => {
|
||||
const input = { rdns: [{ bad: 'rdn', answer: 42 }] }
|
||||
t.equal(DN.isDn(input), false)
|
||||
})
|
||||
|
||||
t.test('true for dn-like', async t => {
|
||||
const input = { rdns: [{ name: 'cn', value: 'foo' }] }
|
||||
t.equal(DN.isDn(input), true)
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
|
||||
tap.test('#fromString', t => {
|
||||
t.test('parses a basic string into an instance', async t => {
|
||||
const input = 'cn=foo+sn=bar,dc=example,dc=com'
|
||||
const dn = DN.fromString(input)
|
||||
t.equal(DN.isDn(dn), true)
|
||||
t.equal(dn.length, 3)
|
||||
t.equal(dn.rdnAt(0).toString(), 'cn=foo+sn=bar')
|
||||
})
|
||||
|
||||
t.end()
|
||||
})
|
||||
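The `parent`/`parentOf` behavior exercised above treats a DN as an ordered list of RDNs, leftmost first, where dropping the leftmost element yields the parent. A standalone sketch of that relationship using plain string arrays (this is an illustrative model, not the real `DN` class):

```javascript
'use strict'

// Model a DN as an array of RDN strings, leftmost (most specific) first.
// The parent of an empty DN is undefined; otherwise drop the leftmost RDN.
const parent = rdns => rdns.length === 0 ? undefined : rdns.slice(1)

// dnA is a parent of dnB when dnB is deeper and ends with dnA's full list.
const parentOf = (dnA, dnB) =>
  dnB.length > dnA.length &&
  dnA.every((rdn, i) => rdn === dnB[dnB.length - dnA.length + i])

const dn = ['cn=jdoe', 'ou=people', 'dc=example', 'dc=com']
console.log(parent(dn).join(',')) // ou=people,dc=example,dc=com
console.log(parentOf(['dc=example', 'dc=com'], dn)) // true
console.log(parentOf(dn, ['dc=example', 'dc=com'])) // false
```

This mirrors the suffix comparison the tests rely on: `dc=example,dc=com` is a parent of `cn=jdoe,ou=people,dc=example,dc=com`, never the other way around.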
257 node_modules/@ldapjs/dn/lib/rdn.js generated vendored Normal file
@@ -0,0 +1,257 @@
'use strict'

const warning = require('./deprecations')
const escapeValue = require('./utils/escape-value')
const isDottedDecimal = require('./utils/is-dotted-decimal')

/**
 * Implements a relative distinguished name as described in
 * https://www.rfc-editor.org/rfc/rfc4514.
 *
 * @example
 * const rdn = new RDN({ cn: 'jdoe', givenName: 'John' })
 * rdn.toString() // 'cn=jdoe+givenName=John'
 */
class RDN {
  #attributes = new Map()

  /**
   * @param {object} rdn An object of key-values to use as RDN attribute
   * types and attribute values. Attribute values should be strings.
   */
  constructor (rdn = {}) {
    for (const [key, val] of Object.entries(rdn)) {
      this.setAttribute({ name: key, value: val })
    }
  }

  get [Symbol.toStringTag] () {
    return 'LdapRdn'
  }

  /**
   * The number of attributes associated with the RDN.
   *
   * @returns {number}
   */
  get size () {
    return this.#attributes.size
  }

  /**
   * Very naive equality check against another RDN instance. In short, if they
   * do not have the exact same key names with the exact same values, then
   * this check will return `false`.
   *
   * @param {RDN} rdn
   *
   * @returns {boolean}
   *
   * @todo Should implement support for the attribute types listed in https://www.rfc-editor.org/rfc/rfc4514#section-3
   */
  equals (rdn) {
    if (Object.prototype.toString.call(rdn) !== '[object LdapRdn]') {
      return false
    }
    if (this.size !== rdn.size) {
      return false
    }

    for (const key of this.keys()) {
      if (rdn.has(key) === false) return false
      if (this.getValue(key) !== rdn.getValue(key)) return false
    }

    return true
  }

  /**
   * The value associated with the given attribute name.
   *
   * @param {string} name An attribute name associated with the RDN.
   *
   * @returns {*}
   */
  getValue (name) {
    return this.#attributes.get(name)?.value
  }

  /**
   * Determine if the RDN has a specific attribute assigned.
   *
   * @param {string} name The name of the attribute.
   *
   * @returns {boolean}
   */
  has (name) {
    return this.#attributes.has(name)
  }

  /**
   * All attribute names associated with the RDN.
   *
   * @returns {IterableIterator<string>}
   */
  keys () {
    return this.#attributes.keys()
  }

  /**
   * Define an attribute type and value on the RDN.
   *
   * @param {string} name
   * @param {string | import('@ldapjs/asn1').BerReader} value
   * @param {object} options Deprecated. All options will be ignored.
   *
   * @throws If any parameter is invalid.
   */
  setAttribute ({ name, value, options = {} }) {
    if (typeof name !== 'string') {
      throw Error('name must be a string')
    }

    const valType = Object.prototype.toString.call(value)
    if (typeof value !== 'string' && valType !== '[object BerReader]') {
      throw Error('value must be a string or BerReader')
    }
    if (Object.prototype.toString.call(options) !== '[object Object]') {
      throw Error('options must be an object')
    }

    const startsWithAlpha = str => /^[a-zA-Z]/.test(str) === true
    if (startsWithAlpha(name) === false && isDottedDecimal(name) === false) {
      throw Error('attribute name must start with an ASCII alpha character or be a numeric OID')
    }

    const attr = { value, name }
    for (const [key, val] of Object.entries(options)) {
      warning.emit('LDAP_DN_DEP_001')
      if (key === 'value') continue
      attr[key] = val
    }

    this.#attributes.set(name, attr)
  }

  /**
   * Convert the RDN to a string representation. If an attribute value is
   * an instance of `BerReader`, the value will be encoded appropriately.
   *
   * @example Dotted Decimal Type
   * const rdn = new RDN({
   *   cn: '#foo',
   *   '1.3.6.1.4.1.1466.0': '#04024869'
   * })
   * rdn.toString()
   * // => 'cn=\23foo+1.3.6.1.4.1.1466.0=#04024869'
   *
   * @example Unescaped Value
   * const rdn = new RDN({
   *   cn: '#foo'
   * })
   * rdn.toString({ unescaped: true })
   * // => 'cn=#foo'
   *
   * @param {object} [options]
   * @param {boolean} [options.unescaped=false] Return the unescaped version
   * of the RDN string.
   *
   * @returns {string}
   */
  toString ({ unescaped = false } = {}) {
    let result = ''
    const isHexEncodedValue = val => /^#([0-9a-fA-F]{2})+$/.test(val) === true

    for (const entry of this.#attributes.values()) {
      result += entry.name + '='

      if (isHexEncodedValue(entry.value)) {
        result += entry.value
      } else if (Object.prototype.toString.call(entry.value) === '[object BerReader]') {
        let encoded = '#'
        for (const byte of entry.value.buffer) {
          encoded += Number(byte).toString(16).padStart(2, '0')
        }
        result += encoded
      } else {
        result += unescaped === false ? escapeValue(entry.value) : entry.value
      }

      result += '+'
    }

    return result.substring(0, result.length - 1)
  }

  /**
   * @returns {string}
   *
   * @deprecated Use {@link toString}.
   */
  format () {
    // If we decide to add back support for this, we should do it as
    // `.toStringWithFormatting(options)`.
    warning.emit('LDAP_DN_DEP_002')
    return this.toString()
  }

  /**
   * @param {string} name
   * @param {string} value
   * @param {object} options
   *
   * @deprecated Use {@link setAttribute}.
   */
  set (name, value, options) {
    warning.emit('LDAP_DN_DEP_003')
    this.setAttribute({ name, value, options })
  }

  /**
   * Determine if an object is an instance of {@link RDN} or is at least
   * an RDN-like object. It is safer to perform a `toString` check.
   *
   * @example Valid Instance
   * const rdn = new RDN()
   * RDN.isRdn(rdn) // true
   *
   * @example RDN-like Instance
   * const rdn = { name: 'cn', value: 'foo' }
   * RDN.isRdn(rdn) // true
   *
   * @example Preferred Check
   * let rdn = new RDN()
   * Object.prototype.toString.call(rdn) === '[object LdapRdn]' // true
   *
   * rdn = { name: 'cn', value: 'foo' }
   * Object.prototype.toString.call(rdn) === '[object LdapRdn]' // false
   *
   * @param {object} rdn
   * @returns {boolean}
   */
  static isRdn (rdn) {
    if (Object.prototype.toString.call(rdn) === '[object LdapRdn]') {
      return true
    }

    const isObject = Object.prototype.toString.call(rdn) === '[object Object]'
    if (isObject === false) {
      return false
    }

    if (typeof rdn.name === 'string' && typeof rdn.value === 'string') {
      return true
    }

    for (const value of Object.values(rdn)) {
      if (
        typeof value !== 'string' &&
        Object.prototype.toString.call(value) !== '[object BerReader]'
      ) return false
    }

    return true
  }
}

module.exports = RDN
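The `toString` method above encodes a `BerReader` value by prefixing `#` and emitting each byte of the reader's buffer as two lowercase hex digits. A standalone sketch of that encoding step (plain Node.js with no @ldapjs dependencies; `berToHexString` is a hypothetical helper name, not part of the module):

```javascript
'use strict'

// Hypothetical standalone version of the BER-value branch in RDN.toString():
// '#' followed by each buffer byte as two lowercase hex digits.
function berToHexString (buffer) {
  let encoded = '#'
  for (const byte of buffer) {
    encoded += Number(byte).toString(16).padStart(2, '0')
  }
  return encoded
}

// A BER OCTET STRING holding 'foo': tag 0x04, length 0x03, bytes 'f' 'o' 'o'.
console.log(berToHexString(Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f])))
// → #0403666f6f
```

This is exactly the value the `encodes BerReader instances` test in `rdn.test.js` asserts (`cn=#0403666f6f`), and the `isHexEncodedValue` regex in `toString` recognizes such `#`-prefixed strings so they pass through unescaped.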
214 node_modules/@ldapjs/dn/lib/rdn.test.js generated vendored Normal file
@@ -0,0 +1,214 @@
'use strict'

const tap = require('tap')
const warning = require('./deprecations')
const { BerReader } = require('@ldapjs/asn1')
const RDN = require('./rdn')

// Silence the standard warning logs. We will test the messages explicitly.
process.removeAllListeners('warning')

tap.test('equals', t => {
  t.test('false for non-rdn object', async t => {
    const rdn = new RDN()
    t.equal(rdn.equals({}), false)
  })

  t.test('false for size mis-match', async t => {
    const rdn1 = new RDN({ cn: 'foo' })
    const rdn2 = new RDN({ cn: 'foo', sn: 'bar' })
    t.equal(rdn1.equals(rdn2), false)
  })

  t.test('false for keys mis-match', async t => {
    const rdn1 = new RDN({ cn: 'foo' })
    const rdn2 = new RDN({ sn: 'bar' })
    t.equal(rdn1.equals(rdn2), false)
  })

  t.test('false for value mis-match', async t => {
    const rdn1 = new RDN({ cn: 'foo' })
    const rdn2 = new RDN({ cn: 'bar' })
    t.equal(rdn1.equals(rdn2), false)
  })

  t.test('true for match', async t => {
    const rdn1 = new RDN({ cn: 'foo' })
    const rdn2 = new RDN({ cn: 'foo' })
    t.equal(rdn1.equals(rdn2), true)
  })

  t.end()
})

tap.test('setAttribute', async t => {
  t.test('throws for bad name', async t => {
    const rdn = new RDN()
    t.throws(
      () => rdn.setAttribute({ name: 42 }),
      Error('name must be a string')
    )

    t.throws(
      () => rdn.setAttribute({ name: '3cn', value: 'foo' }),
      Error('attribute name must start with an ASCII alpha character or be a numeric OID')
    )
  })

  t.test('throws for bad value', async t => {
    const rdn = new RDN()
    t.throws(
      () => rdn.setAttribute({ name: 'cn', value: 42 }),
      Error('value must be a string or BerReader')
    )
  })

  t.test('throws for options', async t => {
    const rdn = new RDN()
    t.throws(
      () => rdn.setAttribute({ name: 'cn', value: 'foo', options: 42 }),
      Error('options must be an object')
    )
  })

  t.test('sets an attribute with value', async t => {
    const rdn = new RDN()
    rdn.setAttribute({ name: 'cn', value: 'foo' })
    t.equal(rdn.getValue('cn'), 'foo')
  })

  t.test('options generates warning', t => {
    process.on('warning', handler)
    t.teardown(async () => {
      process.removeListener('warning', handler)
      warning.emitted.set('LDAP_DN_DEP_001', false)
    })

    const rdn = new RDN()
    rdn.setAttribute({ name: 'cn', value: 'foo', options: { foo: 'bar' } })

    function handler (error) {
      t.equal(error.message, 'attribute options is deprecated and are ignored')
      t.end()
    }
  })

  t.end()
})

tap.test('toString', t => {
  t.test('basic single value', async t => {
    const rdn = new RDN({ cn: 'foo' })
    t.equal(rdn.toString(), 'cn=foo')
  })

  t.test('escaped single value', async t => {
    const rdn = new RDN({ cn: ' foo, bar\n' })
    t.equal(rdn.toString(), 'cn=\\20foo\\2c bar\\0a')
  })

  t.test('basic multi-value', async t => {
    const rdn = new RDN({ cn: 'foo', sn: 'bar' })
    t.equal(rdn.toString(), 'cn=foo+sn=bar')
  })

  t.test('escaped multi-value', async t => {
    const rdn = new RDN({ cn: '#foo', sn: 'bar' })
    t.equal(rdn.toString(), 'cn=\\23foo+sn=bar')
  })

  t.test('recognizes encoded string values', async t => {
    const rdn = new RDN({
      cn: '#foo',
      '1.3.6.1.4.1.1466.0': '#04024869'
    })
    t.equal(rdn.toString(), 'cn=\\23foo+1.3.6.1.4.1.1466.0=#04024869')
  })

  t.test('encodes BerReader instances', async t => {
    const rdn = new RDN({
      cn: new BerReader(Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f]))
    })
    t.equal(rdn.toString(), 'cn=#0403666f6f')
  })

  t.test('honors unescaped option', async t => {
    const rdn = new RDN({
      ou: '研发二组'
    })
    t.equal(rdn.toString({ unescaped: true }), 'ou=研发二组')
  })

  t.end()
})

tap.test('deprecations', t => {
  t.test('format', t => {
    process.on('warning', handler)
    t.teardown(async () => {
      process.removeListener('warning', handler)
      warning.emitted.set('LDAP_DN_DEP_002', false)
    })

    const rdn = new RDN({ cn: 'foo' })
    t.equal(rdn.format(), 'cn=foo')

    function handler (error) {
      t.equal(error.message, '.format() is deprecated. Use .toString() instead')
      t.end()
    }
  })

  t.test('set', t => {
    process.on('warning', handler)
    t.teardown(async () => {
      process.removeListener('warning', handler)
      warning.emitted.set('LDAP_DN_DEP_003', false)
    })

    const rdn = new RDN()
    rdn.set('cn', 'foo', { value: 'ignored' })

    function handler (error) {
      t.equal(error.message, '.set() is deprecated. Use .setAttribute() instead')
      t.end()
    }
  })

  t.end()
})

tap.test('#isRdn', t => {
  t.test('true for instance', async t => {
    const rdn = new RDN()
    t.equal(RDN.isRdn(rdn), true)
  })

  t.test('false for non-object', async t => {
    t.equal(RDN.isRdn(42), false)
  })

  t.test('false for bad object', async t => {
    const input = { bad: 'rdn', 'non-string-value': 42 }
    t.equal(RDN.isRdn(input), false)
  })

  t.test('true for rdn-like with name+value keys', async t => {
    const input = { name: 'cn', value: 'foo' }
    t.equal(RDN.isRdn(input), true)
  })

  t.test('true for pojo representation', async t => {
    const input = { cn: 'foo', sn: 'bar' }
    t.equal(RDN.isRdn(input), true)
  })

  t.test('true for pojo with BerReader', async t => {
    const input = {
      foo: new BerReader(Buffer.from([0x04, 0x03, 0x66, 0x6f, 0x6f]))
    }
    t.equal(RDN.isRdn(input), true)
  })

  t.end()
})
104 node_modules/@ldapjs/dn/lib/utils/escape-value.js generated vendored Normal file
@@ -0,0 +1,104 @@
'use strict'

/**
 * Converts an attribute value into an escaped string as described in
 * https://www.rfc-editor.org/rfc/rfc4514#section-2.4.
 *
 * This function supports up to 4 byte unicode characters.
 *
 * @param {string} value
 * @returns {string} The escaped string.
 */
module.exports = function escapeValue (value) {
  if (typeof value !== 'string') {
    throw Error('value must be a string')
  }

  const toEscape = Buffer.from(value, 'utf8')
  const escaped = []

  // We will handle the reverse solidus ('\') on its own.
  const embeddedReservedChars = [
    0x22, // '"'
    0x2b, // '+'
    0x2c, // ','
    0x3b, // ';'
    0x3c, // '<'
    0x3e // '>'
  ]
  for (let i = 0; i < toEscape.byteLength;) {
    const charHex = toEscape[i]

    // Handle leading space or #.
    if (i === 0 && (charHex === 0x20 || charHex === 0x23)) {
      escaped.push(toEscapedHexString(charHex))
      i += 1
      continue
    }
    // Handle trailing space.
    if (i === toEscape.byteLength - 1 && charHex === 0x20) {
      escaped.push(toEscapedHexString(charHex))
      i += 1
      continue
    }

    if (embeddedReservedChars.includes(charHex) === true) {
      escaped.push(toEscapedHexString(charHex))
      i += 1
      continue
    }

    if (charHex >= 0xc0 && charHex <= 0xdf) {
      // Represents the first byte in a 2-byte UTF-8 character.
      escaped.push(toEscapedHexString(charHex))
      escaped.push(toEscapedHexString(toEscape[i + 1]))
      i += 2
      continue
    }

    if (charHex >= 0xe0 && charHex <= 0xef) {
      // Represents the first byte in a 3-byte UTF-8 character.
      escaped.push(toEscapedHexString(charHex))
      escaped.push(toEscapedHexString(toEscape[i + 1]))
      escaped.push(toEscapedHexString(toEscape[i + 2]))
      i += 3
      continue
    }

    if (charHex >= 0xf0 && charHex <= 0xf7) {
      // Represents the first byte in a 4-byte UTF-8 character.
      escaped.push(toEscapedHexString(charHex))
      escaped.push(toEscapedHexString(toEscape[i + 1]))
      escaped.push(toEscapedHexString(toEscape[i + 2]))
      escaped.push(toEscapedHexString(toEscape[i + 3]))
      i += 4
      continue
    }

    if (charHex <= 31) {
      // Represents an ASCII control character.
      escaped.push(toEscapedHexString(charHex))
      i += 1
      continue
    }

    escaped.push(String.fromCharCode(charHex))
    i += 1
  }

  return escaped.join('')
}

/**
 * Given a byte, convert it to an escaped hex string.
 *
 * @example
 * toEscapedHexString(0x20) // '\20'
 *
 * @param {number} char
 * @returns {string}
 */
function toEscapedHexString (char) {
  return '\\' + char.toString(16).padStart(2, '0')
}
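The multi-byte branches above simply re-emit the raw UTF-8 bytes of a character in `\xx` form, so the expected values in the accompanying tests can be derived directly from `Buffer.from(value, 'utf8')`. A small standalone sketch of that derivation (`escapeAllBytes` is an illustrative helper, not part of the module):

```javascript
'use strict'

// Escape every UTF-8 byte of a string as '\xx', mirroring what the 2/3/4-byte
// branches of escapeValue above do for non-ASCII characters.
function escapeAllBytes (value) {
  return Array.from(Buffer.from(value, 'utf8'))
    .map(byte => '\\' + byte.toString(16).padStart(2, '0'))
    .join('')
}

console.log(escapeAllBytes('č')) // 'č' is 0xc4 0x8d in UTF-8 → \c4\8d
console.log(escapeAllBytes('₠')) // 3-byte character → \e2\82\a0
```

This is why the test vector for `Lučić` is `Lu\c4\8di\c4\87`: the ASCII letters pass through while each `č`/`ć` contributes its two UTF-8 bytes.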
62 node_modules/@ldapjs/dn/lib/utils/escape-value.test.js generated vendored Normal file
@@ -0,0 +1,62 @@
'use strict'

const tap = require('tap')
const escapeValue = require('./escape-value')

tap.test('throws for bad input', async t => {
  t.throws(
    () => escapeValue(42),
    Error('value must be a string')
  )
})

tap.test('reserved chars', t => {
  t.test('space', async t => {
    const input = ' has a leading and trailing space '
    const expected = '\\20has a leading and trailing space\\20'
    const result = escapeValue(input)
    t.equal(result, expected)
  })

  t.test('leading #', async t => {
    t.equal(escapeValue('#hashtag'), '\\23hashtag')
  })

  t.test('pompous name', async t => {
    t.equal(
      escapeValue('James "Jim" Smith, III'),
      'James \\22Jim\\22 Smith\\2c III'
    )
  })

  t.test('carriage return', async t => {
    t.equal(escapeValue('Before\rAfter'), 'Before\\0dAfter')
  })

  t.end()
})

tap.test('2-byte utf-8', t => {
  t.test('Lučić', async t => {
    const expected = 'Lu\\c4\\8di\\c4\\87'
    t.equal(escapeValue('Lučić'), expected)
  })

  t.end()
})

tap.test('3-byte utf-8', t => {
  t.test('₠', async t => {
    t.equal(escapeValue('₠'), '\\e2\\82\\a0')
  })

  t.end()
})

tap.test('4-byte utf-8', t => {
  t.test('😀', async t => {
    t.equal(escapeValue('😀'), '\\f0\\9f\\98\\80')
  })

  t.end()
})
19 node_modules/@ldapjs/dn/lib/utils/is-dotted-decimal.js generated vendored Normal file
@@ -0,0 +1,19 @@
'use strict'

const partIsNotNumeric = part => /^\d+$/.test(part) === false

/**
 * Determines if a passed in string is a dotted decimal string.
 *
 * @param {string} value
 *
 * @returns {boolean}
 */
module.exports = function isDottedDecimal (value) {
  if (typeof value !== 'string') return false

  const parts = value.split('.')
  const nonNumericParts = parts.filter(partIsNotNumeric)

  return nonNumericParts.length === 0
}
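The check above accepts a value only when every dot-separated part is purely numeric; the fully anchored `^\d+$` regex is what rejects mixed parts like `foo.123` and the single empty part produced by splitting `''`. A self-contained sketch of the same logic (inlined here so it runs without the module file):

```javascript
'use strict'

// Standalone copy of the dotted-decimal check above: every '.'-separated
// part must fully match the anchored numeric pattern.
const isDottedDecimal = value =>
  typeof value === 'string' &&
  value.split('.').every(part => /^\d+$/.test(part))

console.log(isDottedDecimal('1.3.6.1.4.1.1466.0')) // true  (a numeric OID)
console.log(isDottedDecimal('foo.123')) // false (non-numeric part)
console.log(isDottedDecimal('')) // false (split('') yields one empty part)
```

`RDN.setAttribute` relies on this to allow numeric OIDs such as `1.3.6.1.4.1.1466.0` as attribute names alongside ordinary alpha-leading names.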
24 node_modules/@ldapjs/dn/lib/utils/is-dotted-decimal.test.js generated vendored Normal file
@@ -0,0 +1,24 @@
'use strict'

const tap = require('tap')
const isDottedDecimal = require('./is-dotted-decimal')

tap.test('false for non-string', async t => {
  t.equal(isDottedDecimal(), false)
})

tap.test('false for empty string', async t => {
  t.equal(isDottedDecimal(''), false)
})

tap.test('false for alpha string', async t => {
  t.equal(isDottedDecimal('foo'), false)
})

tap.test('false for alpha-num string', async t => {
  t.equal(isDottedDecimal('foo.123'), false)
})

tap.test('true for valid string', async t => {
  t.equal(isDottedDecimal('1.2.3'), true)
})
58 node_modules/@ldapjs/dn/lib/utils/parse-string/find-name-end.js generated vendored Normal file
@@ -0,0 +1,58 @@
'use strict'

/**
 * Find the ending position of the attribute type name portion of an RDN.
 * This function does not verify if the name is a valid description string
 * or numeric OID. It merely reads a string from the given starting position
 * to the spec defined end of an attribute type string.
 *
 * @param {Buffer} searchBuffer A buffer representing the RDN.
 * @param {number} startPos The position in the `searchBuffer` to start
 * searching from.
 *
 * @returns {number} The position of the end of the RDN's attribute type name,
 * or `-1` if an invalid character has been encountered.
 */
module.exports = function findNameEnd ({ searchBuffer, startPos }) {
  let pos = startPos

  while (pos < searchBuffer.byteLength) {
    const char = searchBuffer[pos]
    if (char === 0x20 || char === 0x3d) {
      // Name ends with a space or an '=' character.
      break
    }
    if (isValidNameChar(char) === true) {
      pos += 1
      continue
    }
    return -1
  }

  return pos
}

/**
 * Determine if a character is a valid `attributeType` character as defined
 * in RFC 4514 §3.
 *
 * @param {number} c The character to verify. Should be the byte representation
 * of the character from a {@link Buffer} instance.
 *
 * @returns {boolean}
 */
function isValidNameChar (c) {
  if (c >= 0x41 && c <= 0x5a) { // A - Z
    return true
  }
  if (c >= 0x61 && c <= 0x7a) { // a - z
    return true
  }
  if (c >= 0x30 && c <= 0x39) { // 0 - 9
    return true
  }
  if (c === 0x2d || c === 0x2e) { // - or .
    return true
  }
  return false
}
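The scanner above walks a buffer byte by byte, stopping at a space or `=` and bailing with `-1` on anything outside the `attributeType` alphabet. A compact self-contained equivalent (inlined so it runs without the module file; the regex stands in for the byte-range checks in `isValidNameChar`):

```javascript
'use strict'

// Compact standalone equivalent of findNameEnd above: scan attribute-type
// bytes ([A-Za-z0-9.-]) until a space (0x20) or '=' (0x3d); return -1 for
// any other byte.
function findNameEnd ({ searchBuffer, startPos }) {
  let pos = startPos
  while (pos < searchBuffer.byteLength) {
    const c = searchBuffer[pos]
    if (c === 0x20 || c === 0x3d) break
    if (/[A-Za-z0-9.-]/.test(String.fromCharCode(c)) === false) return -1
    pos += 1
  }
  return pos
}

console.log(findNameEnd({ searchBuffer: Buffer.from('cn=jdoe'), startPos: 0 })) // 2
console.log(findNameEnd({ searchBuffer: Buffer.from('f*o=bar'), startPos: 0 })) // -1
```

Note that multi-byte UTF-8 characters (as in the `føø=bar` test) also produce `-1`, because their lead byte falls outside every accepted range.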
28 node_modules/@ldapjs/dn/lib/utils/parse-string/find-name-end.test.js generated vendored Normal file
@@ -0,0 +1,28 @@
'use strict'

const tap = require('tap')
const findNameEnd = require('./find-name-end')

tap.test('stops on a space', async t => {
  const input = Buffer.from('foo = bar')
  const pos = findNameEnd({ searchBuffer: input, startPos: 0 })
  t.equal(pos, 3)
})

tap.test('stops on an equals', async t => {
  const input = Buffer.from('foo=bar')
  const pos = findNameEnd({ searchBuffer: input, startPos: 0 })
  t.equal(pos, 3)
})

tap.test('returns -1 for bad character', async t => {
  const input = Buffer.from('føø=bar')
  const pos = findNameEnd({ searchBuffer: input, startPos: 0 })
  t.equal(pos, -1)
})

tap.test('recognizes all valid characters', async t => {
  const input = Buffer.from('Foo.0-bar=baz')
  const pos = findNameEnd({ searchBuffer: input, startPos: 0 })
  t.equal(pos, 9)
})
Some files were not shown because too many files have changed in this diff.