mirror of
https://github.com/nlohmann/json.git
synced 2025-06-08 05:34:05 +08:00
Compare commits
40 Commits
Commit SHA1s:

- c633693d3e
- cf16c5ab9f
- b19f058465
- 5f77df4e22
- 64ebc6d511
- 82f4f70669
- 68c25aec60
- ac0133ea89
- 281d1e929b
- 7421ac31a7
- 4b17f90f65
- 2d9a251266
- 3cca3ad210
- 828c891427
- 46e7cd3dc2
- 6f6be39332
- e02de2f971
- 410c96228c
- 4e518d43d7
- 230bfd15a2
- e9391dc5bc
- 697c7e557c
- 9110918cf8
- 0a8b48ac6a
- dff2b4756c
- eef76c200e
- 3b02afb9d9
- 6b9199382b
- 51a77f1dca
- 756ca22ec5
- 85df7ed593
- 2be2c83d5c
- c67d538274
- 88c92e605c
- 96c1b52f1c
- 4cca3b9cb2
- 93e957332b
- 7ddea2686f
- 2b876ee671
- 1705bfe914

@@ -1,4 +1,5 @@
 # TODO: The first three checks are only removed to get the CI going. They have to be addressed at some point.
+# TODO: portability-avoid-pragma-once: should be fixed eventually

 Checks: '*,
@@ -59,6 +60,7 @@ Checks: '*,
 -modernize-use-std-numbers,
 -modernize-use-trailing-return-type,
 -performance-enum-size,
+-portability-avoid-pragma-once,
 -readability-function-cognitive-complexity,
 -readability-function-size,
 -readability-identifier-length,

.github/CONTRIBUTING.md (vendored, 26 lines changed)

@@ -1,7 +1,7 @@
 # Contribution Guidelines

 Thank you for your interest in contributing to this project! What began as an exercise to explore the exciting features
-of C++11 has evolved into a [widely-used](https://json.nlohmann.me/home/customers/) JSON library. I truly appreciate all
+of C++11 has evolved into a [widely used](https://json.nlohmann.me/home/customers/) JSON library. I truly appreciate all
 the contributions from the community, whether it's proposing features, identifying bugs, or fixing mistakes! To ensure
 that our collaboration is efficient and effective, please follow these guidelines.

@@ -21,7 +21,7 @@ Clearly describe the issue:

 - If it is a bug, please describe how to **reproduce** it. If possible, attach a _complete example_ which demonstrates
   the error. Please also state what you **expected** to happen instead of the error.
-- If you propose a change or addition, try to give an **example** how the improved code could look like or how to use
+- If you propose a change or addition, try to give an **example** what the improved code could look like or how to use
   it.
 - If you found a compilation error, please tell us which **compiler** (version and operating system) you used and paste
   the (relevant part of) the error messages to the ticket.

@@ -66,21 +66,21 @@ certification that he or she has the right to submit the patch for inclusion int

 ### Describe your changes

-This library is primarily maintained as a spare-time project. As such, I can not make any guarantee how quickly changes
+This library is primarily maintained as a spare-time project. As such, I cannot make any guarantee how quickly changes
 are merged and released. Therefore, it is very important to make the review as smooth as possible by explaining not only
 _what_ you changed, but _why_. This rationale can be very valuable down the road when improvements or bugs are discussed
 years later.

-### Reference existing issues
+### Reference an existing issue

 [Link a pull request to an issue](https://docs.github.com/en/issues/tracking-your-work-with-issues/using-issues/linking-a-pull-request-to-an-issue)
-to clarify that a fix is forthcoming and which issue can be closed after merging. Only few cases (e.g., fixing typos)
-don’t require prior discussions.
+to clarify that a fix is forthcoming and which issue can be closed after merging. Only a few cases (e.g., fixing typos)
+do not require prior discussions.

 ### Write tests

 The library has an extensive test suite that currently covers [100 %](https://coveralls.io/github/nlohmann/json) of the
-library's code. These test are crucial to maintain API stability and give future contributors confidence that they do
+library's code. These tests are crucial to maintain API stability and give future contributors confidence that they do
 not accidentally break things. As Titus Winters aptly put it:

 > If you liked it, you should have put a test on it.

@@ -118,14 +118,14 @@ exception into account.

 #### Coverage

 If test coverage decreases, an automatic warning comment will be posted on the pull request. You can access a code
-coverage report as artifact to the “Ubuntu” workflow.
+coverage report as an artifact to the “Ubuntu” workflow.

 ### Update the documentation

 The [main documentation](https://json.nlohmann.me) of the library is generated from the files
 [`docs/mkdocs/docs`](https://github.com/nlohmann/json/blob/develop/docs/mkdocs/docs). This folder contains dedicated
 pages for [certain features](https://github.com/nlohmann/json/tree/develop/docs/mkdocs/docs/features), a list of
-[all exceptions](https://github.com/nlohmann/json/blob/develop/docs/mkdocs/docs/home/exceptions.md), and an
+[all exceptions](https://github.com/nlohmann/json/blob/develop/docs/mkdocs/docs/home/exceptions.md), and
 [extensive API documentation](https://github.com/nlohmann/json/tree/develop/docs/mkdocs/docs/api) with details on every
 public API function.

@@ -136,7 +136,7 @@ make install_venv -C docs/mkdocs
 make serve -C docs/mkdocs
 ```

-The documentation will then available at <http://127.0.0.1:8000/>. See the documentation of
+The documentation will then be available at <http://127.0.0.1:8000/>. See the documentation of
 [mkdocs](https://www.mkdocs.org) and [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) for more
 information.

@@ -184,8 +184,8 @@ API of the 3.x.y version is broken. This includes:

 Although these guidelines may seem restrictive, they are essential for maintaining the library’s utility.

 Breaking changes may be introduced when they are guarded with a feature macro such as
-[`JSON_USE_IMPLICIT_CONVERSIONS`](https://json.nlohmann.me/api/macros/json_use_implicit_conversions/) which allows to
-selectively change the behavior of the library. In next steps, the current behavior can then be deprecated. Using
+[`JSON_USE_IMPLICIT_CONVERSIONS`](https://json.nlohmann.me/api/macros/json_use_implicit_conversions/) which allows
+selectively changing the behavior of the library. In next steps, the current behavior can then be deprecated. Using
 feature macros then allows users to test their code against the library in the next major release.
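
As a rough illustration of how such a feature macro is used on the consumer side (a minimal sketch, not part of this changeset): defining `JSON_USE_IMPLICIT_CONVERSIONS` to `0` before including the header disables implicit value conversions, so code has to use explicit `get<T>()` calls and can thus be tested against the stricter future behavior.

```cpp
// Sketch only: opting out of implicit conversions via the documented
// feature macro. Not part of this changeset.
#define JSON_USE_IMPLICIT_CONVERSIONS 0
#include <nlohmann/json.hpp>
#include <string>

int main()
{
    nlohmann::json j = "hello";

    // With implicit conversions disabled, an explicit get<T>() is required;
    // `std::string s = j;` would no longer compile.
    auto s = j.get<std::string>();
    return s.size() == 5 ? 0 : 1;
}
```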

 ### Break C++11 language conformance

@@ -211,7 +211,7 @@ The following areas really need contribution and are always welcomed:

 - Extending the **continuous integration** toward more exotic compilers such as Android NDK, Intel's Compiler, or the
   bleeding-edge versions Clang.
 - Improving the efficiency of the **JSON parser**. The current parser is implemented as a naive recursive descent parser
-  with hand coded string handling. More sophisticated approaches like LALR parsers would be really appreciated. That
+  with hand-coded string handling. More sophisticated approaches like LALR parsers would be really appreciated. That
   said, parser generators like Bison or ANTLR do not play nice with single-header files -- I really would like to keep
   the parser inside the `json.hpp` header, and I am not aware of approaches similar to [`re2c`](http://re2c.org) for
   parsing.

.github/PULL_REQUEST_TEMPLATE.md (vendored, 2 lines changed)

@@ -1,4 +1,4 @@
-[Describe your pull request here. Please read the text below the line, and make sure you follow the checklist.]
+[Describe your pull request here. Please read the text below the line and make sure you follow the checklist.]

 - [ ] The changes are described in detail, both the what and why.
 - [ ] If applicable, an [existing issue](https://github.com/nlohmann/json/issues) is referenced.

.github/workflows/check_amalgamation.yml (vendored, 4 lines changed)

@@ -11,7 +11,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

@@ -34,7 +34,7 @@ jobs:

     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

.github/workflows/cifuzz.yml (vendored, 2 lines changed)

@@ -9,7 +9,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

.github/workflows/codeql-analysis.yml (vendored, 8 lines changed)

@@ -27,7 +27,7 @@ jobs:

     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

@@ -36,14 +36,14 @@ jobs:

       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@fc7e4a0fa01c3cca5fd6a1fddec5c0740c977aa2 # v3.28.14
+        uses: github/codeql-action/init@ff0a06e83cb2de871e5a09832bc6a81e7276941f # v3.28.18
         with:
           languages: c-cpp

       # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
       # If this step fails, then you should remove it and run the build manually (see below)
       - name: Autobuild
-        uses: github/codeql-action/autobuild@fc7e4a0fa01c3cca5fd6a1fddec5c0740c977aa2 # v3.28.14
+        uses: github/codeql-action/autobuild@ff0a06e83cb2de871e5a09832bc6a81e7276941f # v3.28.18

       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@fc7e4a0fa01c3cca5fd6a1fddec5c0740c977aa2 # v3.28.14
+        uses: github/codeql-action/analyze@ff0a06e83cb2de871e5a09832bc6a81e7276941f # v3.28.18

(file name not shown)

@@ -19,7 +19,7 @@ jobs:
       pull-requests: write
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

.github/workflows/dependency-review.yml (vendored, 4 lines changed)

@@ -17,11 +17,11 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

       - name: 'Checkout Repository'
         uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: 'Dependency Review'
-        uses: actions/dependency-review-action@ce3cf9537a52e8119d91fd484ab5b8a807627bf8 # v4.6.0
+        uses: actions/dependency-review-action@da24556b548a50705dd671f47852072ea4c105d9 # v4.7.1

.github/workflows/labeler.yml (vendored, 4 lines changed)

@@ -17,10 +17,10 @@ jobs:

     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

-      - uses: srvaroa/labeler@e216fb40e2e6d3b17d90fb1d950f98bee92f65ce # master
+      - uses: srvaroa/labeler@e7bef2249506ba9cbbd3ca5cee256abd9f930b04 # master
         env:
           GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"

.github/workflows/macos.yml (vendored, 2 lines changed)

@@ -91,7 +91,7 @@ jobs:
     runs-on: macos-15 # https://github.com/actions/runner-images/blob/main/images/macos/macos-15-Readme.md
     strategy:
       matrix:
-        xcode: ['16.0', '16.1', '16.2']
+        xcode: ['16.0', '16.1', '16.2', '16.3']
     env:
       DEVELOPER_DIR: /Applications/Xcode_${{ matrix.xcode }}.app/Contents/Developer

.github/workflows/publish_documentation.yml (vendored, 2 lines changed)

@@ -27,7 +27,7 @@ jobs:
     runs-on: ubuntu-22.04
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

.github/workflows/scorecards.yml (vendored, 6 lines changed)

@@ -36,7 +36,7 @@ jobs:

     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

@@ -46,7 +46,7 @@ jobs:
           persist-credentials: false

       - name: "Run analysis"
-        uses: ossf/scorecard-action@f49aabe0b5af0936a0987cfb85d86b75731b0186 # v2.4.1
+        uses: ossf/scorecard-action@05b42c624433fc40578a4040d5cf5e36ddca8cde # v2.4.2
         with:
           results_file: results.sarif
           results_format: sarif

@@ -76,6 +76,6 @@ jobs:

       # Upload the results to GitHub's code scanning dashboard.
       - name: "Upload to code-scanning"
-        uses: github/codeql-action/upload-sarif@fc7e4a0fa01c3cca5fd6a1fddec5c0740c977aa2 # v3.28.14
+        uses: github/codeql-action/upload-sarif@ff0a06e83cb2de871e5a09832bc6a81e7276941f # v3.28.18
         with:
           sarif_file: results.sarif

.github/workflows/stale.yml (vendored, 2 lines changed)

@@ -16,7 +16,7 @@ jobs:

     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

.github/workflows/ubuntu.yml (vendored, 57 lines changed)

@@ -23,7 +23,7 @@ jobs:
     steps:
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build

@@ -46,7 +46,7 @@ jobs:
         target: [ci_test_amalgamation, ci_test_single_header, ci_cppcheck, ci_cpplint, ci_reproducible_tests, ci_non_git_tests, ci_offline_testdata, ci_reuse_compliance, ci_test_valgrind]
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

@@ -54,7 +54,7 @@ jobs:
         run: sudo apt-get update ; sudo apt-get install -y valgrind
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build

@@ -71,7 +71,7 @@ jobs:
         run: apt-get update ; apt-get install -y git clang-tools iwyu unzip
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build

@@ -88,7 +88,7 @@ jobs:
         run: apt-get update ; apt-get install -y build-essential unzip wget git libssl-dev
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build

@@ -98,7 +98,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

@@ -143,12 +143,12 @@ jobs:
     strategy:
       matrix:
         # older GCC docker images (4, 5, 6) fail to check out code
-        compiler: ['7', '8', '9', '10', '11', '12', '13', '14', 'latest']
+        compiler: ['7', '8', '9', '10', '11', '12', '13', '14', '15', 'latest']
     container: gcc:${{ matrix.compiler }}
     steps:
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build

@@ -165,7 +165,7 @@ jobs:
         run: apt-get update ; apt-get install -y unzip git
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Set env FORCE_STDCPPFS_FLAG for clang 7 / 8 / 9 / 10
         run: echo "JSON_FORCED_GLOBAL_COMPILE_OPTIONS=-DJSON_HAS_FILESYSTEM=0;-DJSON_HAS_EXPERIMENTAL_FILESYSTEM=0" >> "$GITHUB_ENV"
         if: ${{ matrix.compiler == '7' || matrix.compiler == '8' || matrix.compiler == '9' || matrix.compiler == '10' }}

@@ -183,7 +183,7 @@ jobs:
     steps:
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build

@@ -201,7 +201,7 @@ jobs:
         run: apt-get update ; apt-get install -y git unzip
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Get latest CMake and ninja
-        uses: lukka/get-cmake@28983e0d3955dba2bb0a6810caae0c6cf268ec0c # v4.0.0
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
       - name: Run CMake
         run: cmake -S . -B build -DJSON_CI=On
       - name: Build with libc++

@@ -221,6 +221,21 @@ jobs:
       - name: Build
         run: cmake --build build --target ci_cuda_example

+  ci_module_cpp20:
+    strategy:
+      matrix:
+        container: ['gcc:latest', 'silkeh/clang:latest']
+    runs-on: ubuntu-latest
+    container: ${{ matrix.container }}
+    steps:
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+      - name: Get latest CMake and ninja
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
+      - name: Run CMake
+        run: cmake -S . -B build -DJSON_CI=On
+      - name: Build
+        run: cmake --build build --target ci_module_cpp20
+
   ci_icpc:
     runs-on: ubuntu-latest
     container: ghcr.io/nlohmann/json-ci:v2.2.0

@@ -233,6 +248,24 @@ jobs:
           . /opt/intel/oneapi/setvars.sh
           cmake --build build --target ci_icpc

+  ci_emscripten:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Harden Runner
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
+        with:
+          egress-policy: audit
+
+      - name: Install emscripten
+        uses: mymindstorm/setup-emsdk@v14
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+      - name: Get latest CMake and ninja
+        uses: lukka/get-cmake@ea004816823209b8d1211e47b216185caee12cc5 # v4.02
+      - name: Run CMake
+        run: cmake -S . -B build -DCMAKE_TOOLCHAIN_FILE=$EMSDK/upstream/emscripten/cmake/Modules/Platform/Emscripten.cmake -GNinja
+      - name: Build
+        run: cmake --build build
+
   ci_test_documentation:
     runs-on: ubuntu-latest
     strategy:

@@ -240,7 +273,7 @@ jobs:
         target: [ci_test_examples, ci_test_build_documentation]
     steps:
       - name: Harden Runner
-        uses: step-security/harden-runner@c6295a65d1254861815972266d5933fd6e532bdf # v2.11.1
+        uses: step-security/harden-runner@0634a2670c59f64b4a01f0f96f84700a4088b9f0 # v2.12.0
         with:
           egress-policy: audit

.github/workflows/windows.yml (vendored, 95 lines changed)

@@ -37,69 +37,48 @@ jobs:
       - name: Test
         run: cd build ; ctest -j 10 -C Debug --output-on-failure

-  msvc2019:
-    runs-on: windows-2019
+  msvc:
     strategy:
       matrix:
+        runs_on: [windows-2019, windows-2022]
         build_type: [Debug, Release]
         architecture: [Win32, x64]
+        std_version: [default, latest]
+
+    runs-on: ${{ matrix.runs_on }}

     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-      - name: Run CMake
-        run: cmake -S . -B build -G "Visual Studio 16 2019" -A ${{ matrix.architecture }} -DJSON_BuildTests=On -DCMAKE_CXX_FLAGS="/W4 /WX"
-        if: matrix.build_type == 'Release'
-      - name: Run CMake
-        run: cmake -S . -B build -G "Visual Studio 16 2019" -A ${{ matrix.architecture }} -DJSON_BuildTests=On -DJSON_FastTests=ON -DCMAKE_CXX_FLAGS="/W4 /WX"
-        if: matrix.build_type == 'Debug'
-      - name: Build
-        run: cmake --build build --config ${{ matrix.build_type }} --parallel 10
-      - name: Test
-        run: cd build ; ctest -j 10 -C ${{ matrix.build_type }} --output-on-failure
-
-  msvc2019_latest:
-    runs-on: windows-2019
-
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-      - name: Run CMake
-        run: cmake -S . -B build -G "Visual Studio 16 2019" -DJSON_BuildTests=On -DCMAKE_CXX_FLAGS="/permissive- /std:c++latest /utf-8 /W4 /WX"
-      - name: Build
-        run: cmake --build build --config Release --parallel 10
-      - name: Test
-        run: cd build ; ctest -j 10 -C Release --output-on-failure
-
-  msvc2022:
-    runs-on: windows-2022
-    strategy:
-      matrix:
-        build_type: [Debug, Release]
-        architecture: [Win32, x64]
-
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-      - name: Run CMake
-        run: cmake -S . -B build -G "Visual Studio 17 2022" -A ${{ matrix.architecture }} -DJSON_BuildTests=On -DCMAKE_CXX_FLAGS="/W4 /WX"
-        if: matrix.build_type == 'Release'
-      - name: Run CMake
-        run: cmake -S . -B build -G "Visual Studio 17 2022" -A ${{ matrix.architecture }} -DJSON_BuildTests=On -DJSON_FastTests=ON -DCMAKE_CXX_FLAGS="/W4 /WX"
-        if: matrix.build_type == 'Debug'
-      - name: Build
-        run: cmake --build build --config ${{ matrix.build_type }} --parallel 10
-      - name: Test
-        run: cd build ; ctest -j 10 -C ${{ matrix.build_type }} --output-on-failure
-
-  msvc2022_latest:
-    runs-on: windows-2022
-
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-      - name: Run CMake
-        run: cmake -S . -B build -G "Visual Studio 17 2022" -DJSON_BuildTests=On -DCMAKE_CXX_FLAGS="/permissive- /std:c++latest /utf-8 /W4 /WX"
-      - name: Build
-        run: cmake --build build --config Release --parallel 10
-      - name: Test
-        run: cd build ; ctest -j 10 -C Release --output-on-failure
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+      - name: Set generator
+        id: generator
+        run: |
+          if [ "${{ matrix.runs_on }}" = "windows-2019" ]; then
+            echo "generator=Visual Studio 16 2019" >> $GITHUB_ENV
+          else
+            echo "generator=Visual Studio 17 2022" >> $GITHUB_ENV
+          fi
+        shell: bash
+      - name: Set extra CXX_FLAGS for latest std_version
+        id: cxxflags
+        run: |
+          if [ "${{ matrix.std_version }}" = "latest" ]; then
+            echo "flags=/permissive- /std:c++latest /utf-8 /W4 /WX" >> $GITHUB_ENV
+          else
+            echo "flags=/W4 /WX" >> $GITHUB_ENV
+          fi
+        shell: bash
+      - name: Run CMake (Release)
+        run: cmake -S . -B build -G "$env:generator" -A ${{ matrix.architecture }} -DJSON_BuildTests=On -DCMAKE_CXX_FLAGS="$env:flags"
+        if: matrix.build_type == 'Release'
+        shell: pwsh
+      - name: Run CMake (Debug)
+        run: cmake -S . -B build -G "$env:generator" -A ${{ matrix.architecture }} -DJSON_BuildTests=On -DJSON_FastTests=ON -DCMAKE_CXX_FLAGS="$env:flags"
+        if: matrix.build_type == 'Debug'
+        shell: pwsh
+      - name: Build
+        run: cmake --build build --config ${{ matrix.build_type }} --parallel 10
+      - name: Test
+        run: cd build ; ctest -j 10 -C ${{ matrix.build_type }} --output-on-failure

   clang:
     runs-on: windows-2019

@@ -118,7 +97,7 @@ jobs:
       - name: Test
         run: cd build ; ctest -j 10 -C Debug --exclude-regex "test-unicode" --output-on-failure

-  clang-cl-11:
+  clang-cl-12:
     runs-on: windows-2019
     strategy:
       matrix:

FILES.md (6 lines changed)

@@ -92,7 +92,7 @@ Further documentation:

 ### `.github/dependabot.yml`

-The configuration of [dependabot](https://github.com/dependabot) which ensures the dependencies (GitHub actions and Python packages used in the CI) remain up-to-date.
+The configuration of [dependabot](https://github.com/dependabot) which ensures the dependencies (GitHub actions and Python packages used in the CI) remain up to date.

 Further documentation:

@@ -185,7 +185,7 @@ Further documentation:

 ### `.reuse/dep5`

-The file defines the licenses of certain third-party component in the repository. The root `Makefile` contains a target `reuse` that checks for compliance.
+The file defines the licenses of certain third-party components in the repository. The root `Makefile` contains a target `reuse` that checks for compliance.

 Further documentation:

@@ -212,7 +212,7 @@ Further information:

 ### `LICENSES`

-A folder that contains every license of all licenses files (library and third-party code).
+A folder that contains every license of all license files (library and third-party code).

 Further documentation:

README.md (81 lines changed)

@@ -19,6 +19,7 @@
 [](https://isitmaintained.com/project/nlohmann/json "Average time to resolve an issue")
 [](https://bestpractices.coreinfrastructure.org/projects/289)
 [](https://scorecard.dev/viewer/?uri=github.com/nlohmann/json)
 [](https://cloudback.it)
 [](https://github.com/sponsors/nlohmann)
 [](https://api.reuse.software/info/github.com/nlohmann/json)
 [](https://discord.gg/6mrGXKvX7y)

@@ -30,7 +31,7 @@
 - [Examples](#examples)
 - [Read JSON from a file](#read-json-from-a-file)
 - [Creating `json` objects from JSON literals](#creating-json-objects-from-json-literals)
-- [JSON as first-class data type](#json-as-first-class-data-type)
+- [JSON as a first-class data type](#json-as-a-first-class-data-type)
 - [Serialization / Deserialization](#serialization--deserialization)
 - [STL-like access](#stl-like-access)
 - [Conversion from STL containers](#conversion-from-stl-containers)

@@ -57,7 +58,7 @@

 There are myriads of [JSON](https://json.org) libraries out there, and each may even have its reason to exist. Our class had these design goals:

-- **Intuitive syntax**. In languages such as Python, JSON feels like a first class data type. We used all the operator magic of modern C++ to achieve the same feeling in your code. Check out the [examples below](#examples) and you'll know what I mean.
+- **Intuitive syntax**. In languages such as Python, JSON feels like a first-class data type. We used all the operator magic of modern C++ to achieve the same feeling in your code. Check out the [examples below](#examples) and you'll know what I mean.

 - **Trivial integration**. Our whole code consists of a single header file [`json.hpp`](https://github.com/nlohmann/json/blob/develop/single_include/nlohmann/json.hpp). That's it. No library, no subproject, no dependencies, no complex build system. The class is written in vanilla C++11. All in all, everything should require no adjustment of your compiler flags or project settings. The library is also included in all popular [package managers](https://json.nlohmann.me/integration/package_managers/).

@@ -107,7 +108,7 @@ Thanks everyone!

 :bug: If you found a **bug**, please check the [**FAQ**](https://json.nlohmann.me/home/faq/) if it is a known issue or the result of a design decision. Please also have a look at the [**issue list**](https://github.com/nlohmann/json/issues) before you [**create a new issue**](https://github.com/nlohmann/json/issues/new/choose). Please provide as much information as possible to help us understand and reproduce your issue.

-There is also a [**docset**](https://github.com/Kapeli/Dash-User-Contributions/tree/master/docsets/JSON_for_Modern_C%2B%2B) for the documentation browsers [Dash](https://kapeli.com/dash), [Velocity](https://velocity.silverlakesoftware.com), and [Zeal](https://zealdocs.org) that contains the full [documentation](https://json.nlohmann.me) as offline resource.
+There is also a [**docset**](https://github.com/Kapeli/Dash-User-Contributions/tree/master/docsets/JSON_for_Modern_C%2B%2B) for the documentation browsers [Dash](https://kapeli.com/dash), [Velocity](https://velocity.silverlakesoftware.com), and [Zeal](https://zealdocs.org) that contains the full [documentation](https://json.nlohmann.me) as an offline resource.

 ## Quick reference

@@ -136,7 +137,7 @@ There is also a [**docset**](https://github.com/Kapeli/Dash-User-Contributions/t

 Here are some examples to give you an idea how to use the class.

-Beside the examples below, you may want to:
+Besides the examples below, you may want to:

 → Check the [documentation](https://json.nlohmann.me/)\
 → Browse the [standalone example files](https://github.com/nlohmann/json/tree/develop/docs/mkdocs/docs/examples)\

@@ -195,7 +196,7 @@ json ex3 = {
 };
 ```

-### JSON as first-class data type
+### JSON as a first-class data type

 Here are some examples to give you an idea how to use the class.

@@ -224,13 +225,13 @@ With this library, you could write:
 // create an empty structure (null)
 json j;

-// add a number that is stored as double (note the implicit conversion of j to an object)
+// add a number stored as double (note the implicit conversion of j to an object)
 j["pi"] = 3.141;

-// add a Boolean that is stored as bool
+// add a Boolean stored as bool
 j["happy"] = true;

-// add a string that is stored as std::string
+// add a string stored as std::string
 j["name"] = "Niels";

 // add another null object by passing nullptr

@@ -239,7 +240,7 @@ j["nothing"] = nullptr;
 // add an object inside the object
 j["answer"]["everything"] = 42;

-// add an array that is stored as std::vector (using an initializer list)
+// add an array stored as std::vector (using an initializer list)
 j["list"] = { 1, 0, 2 };

 // add another object (using an initializer list of pairs)

@@ -349,7 +350,7 @@ std::cout << j_string << " == " << serialized_string << std::endl;

 Note the library only supports UTF-8. When you store strings with different encodings in the library, calling [`dump()`](https://json.nlohmann.me/api/basic_json/dump/) may throw an exception unless `json::error_handler_t::replace` or `json::error_handler_t::ignore` are used as error handlers.
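
For illustration, a minimal sketch of the error handlers mentioned in the paragraph above (not part of this changeset; it only restates the documented `dump()` parameters):

```cpp
#include <nlohmann/json.hpp>
#include <iostream>

int main()
{
    using json = nlohmann::json;

    // "\xF0\x80" is not valid UTF-8, so dump() with the default (strict)
    // error handler would throw a type_error exception.
    json j = "invalid utf-8: \xF0\x80";

    // Replace invalid code points with U+FFFD instead of throwing.
    std::cout << j.dump(2, ' ', false, json::error_handler_t::replace) << '\n';

    // Or silently ignore them.
    std::cout << j.dump(2, ' ', false, json::error_handler_t::ignore) << '\n';
}
```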

-#### To/from streams (e.g. files, string streams)
+#### To/from streams (e.g., files, string streams)

 You can also use streams to serialize and deserialize:

@@ -382,7 +383,7 @@ Please note that setting the exception bit for `failbit` is inappropriate for th

 #### Read from iterator range

-You can also parse JSON from an iterator range; that is, from any container accessible by iterators whose `value_type` is an integral type of 1, 2 or 4 bytes, which will be interpreted as UTF-8, UTF-16 and UTF-32 respectively. For instance, a `std::vector<std::uint8_t>`, or a `std::list<std::uint16_t>`:
+You can also parse JSON from an iterator range; that is, from any container accessible by iterators whose `value_type` is an integral type of 1, 2, or 4 bytes, which will be interpreted as UTF-8, UTF-16, and UTF-32 respectively. For instance, a `std::vector<std::uint8_t>`, or a `std::list<std::uint16_t>`:

 ```cpp
 std::vector<std::uint8_t> v = {'t', 'r', 'u', 'e'};

@@ -486,7 +487,7 @@ To implement your own SAX handler, proceed as follows:
 2. Create an object of your SAX interface class, e.g. `my_sax`.
 3. Call `bool json::sax_parse(input, &my_sax)`; where the first parameter can be any input like a string or an input stream and the second parameter is a pointer to your SAX interface.

-Note the `sax_parse` function only returns a `bool` indicating the result of the last executed SAX event. It does not return a `json` value - it is up to you to decide what to do with the SAX events. Furthermore, no exceptions are thrown in case of a parse error - it is up to you what to do with the exception object passed to your `parse_error` implementation. Internally, the SAX interface is used for the DOM parser (class `json_sax_dom_parser`) as well as the acceptor (`json_sax_acceptor`), see file [`json_sax.hpp`](https://github.com/nlohmann/json/blob/develop/include/nlohmann/detail/input/json_sax.hpp).
+Note the `sax_parse` function only returns a `bool` indicating the result of the last executed SAX event. It does not return a `json` value - it is up to you to decide what to do with the SAX events. Furthermore, no exceptions are thrown in case of a parse error -- it is up to you what to do with the exception object passed to your `parse_error` implementation. Internally, the SAX interface is used for the DOM parser (class `json_sax_dom_parser`) as well as the acceptor (`json_sax_acceptor`), see file [`json_sax.hpp`](https://github.com/nlohmann/json/blob/develop/include/nlohmann/detail/input/json_sax.hpp).

 ### STL-like access

@@ -618,7 +619,7 @@ json j_umset(c_umset); // both entries for "one" are used
 // maybe ["one", "two", "one", "four"]
 ```

-Likewise, any associative key-value containers (`std::map`, `std::multimap`, `std::unordered_map`, `std::unordered_multimap`) whose keys can construct an `std::string` and whose values can be used to construct JSON values (see examples above) can be used to create a JSON object. Note that in case of multimaps only one key is used in the JSON object and the value depends on the internal order of the STL container.
+Likewise, any associative key-value containers (`std::map`, `std::multimap`, `std::unordered_map`, `std::unordered_multimap`) whose keys can construct an `std::string` and whose values can be used to construct JSON values (see examples above) can be used to create a JSON object. Note that in case of multimaps, only one key is used in the JSON object and the value depends on the internal order of the STL container.

 ```cpp
 std::map<std::string, int> c_map { {"one", 1}, {"two", 2}, {"three", 3} };

@@ -640,7 +641,7 @@ json j_ummap(c_ummap); // only one entry for key "three" is used

 ### JSON Pointer and JSON Patch

-The library supports **JSON Pointer** ([RFC 6901](https://tools.ietf.org/html/rfc6901)) as alternative means to address structured values. On top of this, **JSON Patch** ([RFC 6902](https://tools.ietf.org/html/rfc6902)) allows describing differences between two JSON values - effectively allowing patch and diff operations known from Unix.
+The library supports **JSON Pointer** ([RFC 6901](https://tools.ietf.org/html/rfc6901)) as an alternative means to address structured values. On top of this, **JSON Patch** ([RFC 6902](https://tools.ietf.org/html/rfc6902)) allows describing differences between two JSON values -- effectively allowing patch and diff operations known from Unix.

 ```cpp
 // a JSON value

@@ -873,7 +874,7 @@ namespace ns {

 This requires a bit more advanced technique. But first, let's see how this conversion mechanism works:

-The library uses **JSON Serializers** to convert types to json.
+The library uses **JSON Serializers** to convert types to JSON.
 The default serializer for `nlohmann::json` is `nlohmann::adl_serializer` (ADL means [Argument-Dependent Lookup](https://en.cppreference.com/w/cpp/language/adl)).

 It is implemented like this (simplified):

@@ -923,7 +924,7 @@ namespace nlohmann {

 #### How can I use `get()` for non-default constructible/non-copyable types?

-There is a way, if your type is [MoveConstructible](https://en.cppreference.com/w/cpp/named_req/MoveConstructible). You will need to specialize the `adl_serializer` as well, but with a special `from_json` overload:
+There is a way if your type is [MoveConstructible](https://en.cppreference.com/w/cpp/named_req/MoveConstructible). You will need to specialize the `adl_serializer` as well, but with a special `from_json` overload:

 ```cpp
 struct move_only_type {

@@ -1012,7 +1013,7 @@ struct bad_serializer

 ### Specializing enum conversion

-By default, enum values are serialized to JSON as integers. In some cases this could result in undesired behavior. If an enum is modified or re-ordered after data has been serialized to JSON, the later de-serialized JSON data may be undefined or a different enum value than was originally intended.
+By default, enum values are serialized to JSON as integers. In some cases, this could result in undesired behavior. If an enum is modified or re-ordered after data has been serialized to JSON, the later deserialized JSON data may be undefined or a different enum value than was originally intended.

 It is possible to more precisely specify how a given enum is mapped to and from JSON as shown below:

@@ -1168,7 +1169,7 @@ Please note:

 The code compiles successfully with [Android NDK](https://developer.android.com/ndk/index.html?hl=ml), Revision 9 - 11 (and possibly later) and [CrystaX's Android NDK](https://www.crystax.net/en/android/ndk) version 10.

 - For GCC running on MinGW or Android SDK, the error `'to_string' is not a member of 'std'` (or similarly, for `strtod` or `strtof`) may occur. Note this is not an issue with the code, but rather with the compiler itself. On Android, see above to build with a newer environment. For MinGW, please refer to [this site](https://tehsausage.com/mingw-to-string) and [this discussion](https://github.com/nlohmann/json/issues/136) for information on how to fix this bug. For Android NDK using `APP_STL := gnustl_static`, please refer to [this discussion](https://github.com/nlohmann/json/issues/219).

 - Unsupported versions of GCC and Clang are rejected by `#error` directives. This can be switched off by defining `JSON_SKIP_UNSUPPORTED_COMPILER_CHECK`. Note that you can expect no support in this case.

@@ -1187,7 +1188,7 @@ using json = nlohmann::json;

 to the files you want to process JSON and set the necessary switches to enable C++11 (e.g., `-std=c++11` for GCC and Clang).

-You can further use file [`include/nlohmann/json_fwd.hpp`](https://github.com/nlohmann/json/blob/develop/include/nlohmann/json_fwd.hpp) for forward-declarations. The installation of json_fwd.hpp (as part of cmake's install step), can be achieved by setting `-DJSON_MultipleHeaders=ON`.
+You can further use file [`include/nlohmann/json_fwd.hpp`](https://github.com/nlohmann/json/blob/develop/include/nlohmann/json_fwd.hpp) for forward-declarations. The installation of `json_fwd.hpp` (as part of cmake's install step) can be achieved by setting `-DJSON_MultipleHeaders=ON`.
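
As a small illustration of the forward-declaration header mentioned above (a sketch only; the file and function names are invented for the example and are not part of the repository):

```cpp
// config.hpp -- a header that only needs the *name* nlohmann::json can
// include the lightweight forward-declaration header instead of json.hpp.
#include <nlohmann/json_fwd.hpp>

nlohmann::json load_config();                 // declaration only
void apply_config(const nlohmann::json& cfg); // declaration only

// config.cpp -- only the translation unit that actually uses the type
// has to include the full header:
// #include <nlohmann/json.hpp>
```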
### CMake
|
||||
|
||||
@ -1249,7 +1250,7 @@ FetchContent_MakeAvailable(json)
|
||||
target_link_libraries(foo PRIVATE nlohmann_json::nlohmann_json)
|
||||
```
|
||||
|
||||
**Note**: It is recommended to use the URL approach described above which is supported as of version 3.10.0. See
|
||||
**Note**: It is recommended to use the URL approach described above, which is supported as of version 3.10.0. See
|
||||
<https://json.nlohmann.me/integration/cmake/#fetchcontent> for more information.
|
||||
|
||||
#### Supporting Both
|
||||
@ -1371,13 +1372,13 @@ I deeply appreciate the help of the following people.
|
||||
9. [Florian Weber](https://github.com/Florianjw) fixed a bug in and improved the performance of the comparison operators.
|
||||
10. [Eric Cornelius](https://github.com/EricMCornelius) pointed out a bug in the handling with NaN and infinity values. He also improved the performance of the string escaping.
|
||||
11. [易思龙](https://github.com/likebeta) implemented a conversion from anonymous enums.
|
||||
12. [kepkin](https://github.com/kepkin) patiently pushed forward the support for Microsoft Visual studio.
|
||||
12. [kepkin](https://github.com/kepkin) patiently pushed forward the support for Microsoft Visual Studio.
|
||||
13. [gregmarr](https://github.com/gregmarr) simplified the implementation of reverse iterators and helped with numerous hints and improvements. In particular, he pushed forward the implementation of user-defined types.
|
||||
14. [Caio Luppi](https://github.com/caiovlp) fixed a bug in the Unicode handling.
|
||||
15. [dariomt](https://github.com/dariomt) fixed some typos in the examples.
|
||||
16. [Daniel Frey](https://github.com/d-frey) cleaned up some pointers and implemented exception-safe memory allocation.
|
||||
17. [Colin Hirsch](https://github.com/ColinH) took care of a small namespace issue.
|
||||
18. [Huu Nguyen](https://github.com/whoshuu) correct a variable name in the documentation.
|
||||
18. [Huu Nguyen](https://github.com/whoshuu) corrected a variable name in the documentation.
|
||||
19. [Silverweed](https://github.com/silverweed) overloaded `parse()` to accept an rvalue reference.
|
||||
20. [dariomt](https://github.com/dariomt) fixed a subtlety in MSVC type support and implemented the `get_ref()` function to get a reference to stored values.
|
||||
21. [ZahlGraf](https://github.com/ZahlGraf) added a workaround that allows compilation using Android NDK.
|
||||
@ -1412,7 +1413,7 @@ I deeply appreciate the help of the following people.
|
||||
50. [Jared Grubb](https://github.com/jaredgrubb) silenced a nasty documentation warning.
|
||||
51. [Yixin Zhang](https://github.com/qwename) fixed an integer overflow check.
|
||||
52. [Bosswestfalen](https://github.com/Bosswestfalen) merged two iterator classes into a smaller one.
|
||||
53. [Daniel599](https://github.com/Daniel599) helped to get Travis execute the tests with Clang's sanitizers.
|
||||
53. [Daniel599](https://github.com/Daniel599) helped to get Travis to execute the tests with Clang's sanitizers.
|
||||
54. [Jonathan Lee](https://github.com/vjon) fixed an example in the README file.
|
||||
55. [gnzlbg](https://github.com/gnzlbg) supported the implementation of user-defined types.
|
||||
56. [Alexej Harm](https://github.com/qis) helped to get the user-defined types working with Visual Studio.
|
||||
@ -1433,7 +1434,7 @@ I deeply appreciate the help of the following people.
|
||||
71. [Vincent Thiery](https://github.com/vthiery) maintains a package for the Conan package manager.
|
||||
72. [Steffen](https://github.com/koemeet) fixed a potential issue with MSVC and `std::min`.
|
||||
73. [Mike Tzou](https://github.com/Chocobo1) fixed some typos.
|
||||
74. [amrcode](https://github.com/amrcode) noted a misleading documentation about comparison of floats.
|
||||
74. [amrcode](https://github.com/amrcode) noted misleading documentation about comparison of floats.
|
||||
75. [Oleg Endo](https://github.com/olegendo) reduced the memory consumption by replacing `<iostream>` with `<iosfwd>`.
|
||||
76. [dan-42](https://github.com/dan-42) cleaned up the CMake files to simplify including/reusing of the library.
|
||||
77. [Nikita Ofitserov](https://github.com/himikof) allowed for moving values from initializer lists.
|
||||
@ -1460,13 +1461,13 @@ I deeply appreciate the help of the following people.
|
||||
98. [Vadim Evard](https://github.com/Pipeliner) fixed a Markdown issue in the README.
|
||||
99. [zerodefect](https://github.com/zerodefect) fixed a compiler warning.
|
||||
100. [Kert](https://github.com/kaidokert) allowed to template the string type in the serialization and added the possibility to override the exceptional behavior.
|
||||
101. [mark-99](https://github.com/mark-99) helped fixing an ICC error.
|
||||
101. [mark-99](https://github.com/mark-99) helped fix an ICC error.
|
||||
102. [Patrik Huber](https://github.com/patrikhuber) fixed links in the README file.
|
||||
103. [johnfb](https://github.com/johnfb) found a bug in the implementation of CBOR's indefinite length strings.
|
||||
104. [Paul Fultz II](https://github.com/pfultz2) added a note on the cget package manager.
|
||||
105. [Wilson Lin](https://github.com/wla80) made the integration section of the README more concise.
|
||||
106. [RalfBielig](https://github.com/ralfbielig) detected and fixed a memory leak in the parser callback.
|
||||
107. [agrianius](https://github.com/agrianius) allowed to dump JSON to an alternative string type.
|
||||
107. [agrianius](https://github.com/agrianius) allowed dumping JSON to an alternative string type.
|
||||
108. [Kevin Tonon](https://github.com/ktonon) overworked the C++11 compiler checks in CMake.
|
||||
109. [Axel Huebl](https://github.com/ax3l) simplified a CMake check and added support for the [Spack package manager](https://spack.io).
|
||||
110. [Carlos O'Ryan](https://github.com/coryan) fixed a typo.
|
||||
@ -1515,12 +1516,12 @@ I deeply appreciate the help of the following people.
|
||||
153. [Ivor Wanders](https://github.com/iwanders) helped to reduce the CMake requirement to version 3.1.
|
||||
154. [njlr](https://github.com/njlr) updated the Buckaroo instructions.
|
||||
155. [Lion](https://github.com/lieff) fixed a compilation issue with GCC 7 on CentOS.
|
||||
156. [Isaac Nickaein](https://github.com/nickaein) improved the integer serialization performance and implemented the `contains()` function.
|
||||
156. [Isaac Nickaein](https://github.com/nickaein) improved the integer serialization performance and implemented the `contains()` function.
|
||||
157. [past-due](https://github.com/past-due) suppressed an unfixable warning.
|
||||
158. [Elvis Oric](https://github.com/elvisoric) improved Meson support.
|
||||
159. [Matěj Plch](https://github.com/Afforix) fixed an example in the README.
|
||||
160. [Mark Beckwith](https://github.com/wythe) fixed a typo.
|
||||
161. [scinart](https://github.com/scinart) fixed bug in the serializer.
|
||||
161. [scinart](https://github.com/scinart) fixed a bug in the serializer.
|
||||
162. [Patrick Boettcher](https://github.com/pboettch) implemented `push_back()` and `pop_back()` for JSON Pointers.
|
||||
163. [Bruno Oliveira](https://github.com/nicoddemus) added support for Conda.
|
||||
164. [Michele Caini](https://github.com/skypjack) fixed links in the README.
|
||||
@ -1562,7 +1563,7 @@ I deeply appreciate the help of the following people.
|
||||
200. [Alexander “weej” Jones](https://github.com/alex-weej) fixed an example in the README.
|
||||
201. [Antoine Cœur](https://github.com/Coeur) fixed some typos in the documentation.
|
||||
202. [jothepro](https://github.com/jothepro) updated links to the Hunter package.
|
||||
203. [Dave Lee](https://github.com/kastiglione) fixed link in the README.
|
||||
203. [Dave Lee](https://github.com/kastiglione) fixed a link in the README.
|
||||
204. [Joël Lamotte](https://github.com/Klaim) added instruction for using Build2's package manager.
|
||||
205. [Paul Jurczak](https://github.com/pauljurczak) fixed an example in the README.
|
||||
206. [Sonu Lohani](https://github.com/sonulohani) fixed a warning.
|
||||
@ -1605,7 +1606,7 @@ I deeply appreciate the help of the following people.
|
||||
243. [raduteo](https://github.com/raduteo) fixed a warning.
|
||||
244. [David Pfahler](https://github.com/theShmoo) added the possibility to compile the library without I/O support.
|
||||
245. [Morten Fyhn Amundsen](https://github.com/mortenfyhn) fixed a typo.
|
||||
246. [jpl-mac](https://github.com/jpl-mac) allowed to treat the library as a system header in CMake.
|
||||
246. [jpl-mac](https://github.com/jpl-mac) allowed treating the library as a system header in CMake.
|
||||
247. [Jason Dsouza](https://github.com/jasmcaus) fixed the indentation of the CMake file.
|
||||
248. [offa](https://github.com/offa) added a link to Conan Center to the documentation.
|
||||
249. [TotalCaesar659](https://github.com/TotalCaesar659) updated the links in the documentation to use HTTPS.
|
||||
@ -1647,7 +1648,7 @@ I deeply appreciate the help of the following people.
|
||||
285. [Wolf Vollprecht](https://github.com/wolfv) added the `patch_inplace` function.
|
||||
286. [Jake Zimmerman](https://github.com/jez) highlighted common usage patterns in the README file.
|
||||
287. [NN](https://github.com/NN---) added the Visual Studio output directory to `.gitignore`.
|
||||
288. [Romain Reignier](https://github.com/romainreignier) improved the performance the vector output adapter.
|
||||
288. [Romain Reignier](https://github.com/romainreignier) improved the performance of the vector output adapter.
|
||||
289. [Mike](https://github.com/Mike-Leo-Smith) fixed the `std::iterator_traits`.
|
||||
290. [Richard Hozák](https://github.com/zxey) added macro `JSON_NO_ENUM` to disable default enum conversions.
|
||||
291. [vakokako](https://github.com/vakokako) fixed tests when compiling with C++20.
|
||||
@ -1694,9 +1695,9 @@ I deeply appreciate the help of the following people.
|
||||
332. [taro](https://github.com/tarolling) fixed a typo in the `CODEOWNERS` file.
|
||||
333. [Ikko Eltociear Ashimine](https://github.com/eltociear) fixed a typo.
|
||||
334. [Felix Yan](https://github.com/felixonmars) fixed a typo in the README.
|
||||
335. [HO-COOH](https://github.com/HO-COOH) fixed a parentheses in the documentation.
|
||||
335. [HO-COOH](https://github.com/HO-COOH) fixed a parenthesis in the documentation.
|
||||
336. [Ivor Wanders](https://github.com/iwanders) fixed the examples to catch exception by `const&`.
|
||||
337. [miny1233](https://github.com/miny1233) fixed a parentheses in the documentation.
|
||||
337. [miny1233](https://github.com/miny1233) fixed a parenthesis in the documentation.
|
||||
338. [tomalakgeretkal](https://github.com/tomalakgeretkal) fixed a compilation error.
|
||||
339. [alferov](https://github.com/ALF-ONE) fixed a compilation error.
|
||||
340. [Craig Scott](https://github.com/craigscott-crascit) fixed a deprecation warning in CMake.
|
||||
@ -1779,7 +1780,7 @@ The library itself consists of a single header file licensed under the MIT licen
|
||||
|
||||
The library supports **Unicode input** as follows:
|
||||
|
||||
- Only **UTF-8** encoded input is supported which is the default encoding for JSON according to [RFC 8259](https://tools.ietf.org/html/rfc8259.html#section-8.1).
|
||||
- Only **UTF-8** encoded input is supported, which is the default encoding for JSON according to [RFC 8259](https://tools.ietf.org/html/rfc8259.html#section-8.1).
|
||||
- `std::u16string` and `std::u32string` can be parsed, assuming UTF-16 and UTF-32 encoding, respectively. These encodings are not supported when reading from files or other input containers.
|
||||
- Other encodings such as Latin-1 or ISO 8859-1 are **not** supported and will yield parse or serialization errors.
|
||||
- [Unicode noncharacters](https://www.unicode.org/faq/private_use.html#nonchar1) will not be replaced by the library.
|
||||
@ -1801,7 +1802,17 @@ This library does not support comments by default. It does so for three reasons:
|
||||
|
||||
3. It is dangerous for interoperability if some libraries would add comment support while others don't. Please check [The Harmful Consequences of the Robustness Principle](https://tools.ietf.org/html/draft-iab-protocol-maintenance-01) on this.
|
||||
|
||||
However, you can pass set parameter `ignore_comments` to true in the `parse` function to ignore `//` or `/* */` comments. Comments will then be treated as whitespace.
|
||||
However, you can set set parameter `ignore_comments` to true in the `parse` function to ignore `//` or `/* */` comments. Comments will then be treated as whitespace.
|
||||
|
||||
### Trailing commas
|
||||
|
||||
The JSON specification does not allow trailing commas in arrays and objects, and hence this library is treating them as parsing errors by default.
|
||||
|
||||
Like comments, you can set parameter `ignore_trailing_commas` to true in the `parse` function to ignore trailing commas in arrays and objects. Note that a single comma as the only content of the array or object (`[,]` or `{,}`) is not allowed, and multiple trailing commas (`[1,,]`) are not allowed either.
|
||||
|
||||
This library does not add trailing commas when serializing JSON data.
|
||||
|
||||
For more information, see [JSON With Commas and Comments (JWCC)](https://nigeltao.github.io/blog/2021/json-with-commas-comments.html).
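
As a rough sketch (assuming version 3.12.1 or later, which introduced the parameter), a call enabling this behavior can look like:

```cpp
#include <nlohmann/json.hpp>

int main()
{
    // the trailing comma in the array would normally be a parse error
    auto j = nlohmann::json::parse(R"([1, 2, 3,])",
                                   /* callback */ nullptr,
                                   /* allow_exceptions */ true,
                                   /* ignore_comments */ false,
                                   /* ignore_trailing_commas */ true);
    return j.size() == 3 ? 0 : 1;
}
```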
|
||||
|
||||
### Order of object keys
|
||||
|
||||
|
@ -659,6 +659,17 @@ add_custom_target(ci_cuda_example
|
||||
COMMAND ${CMAKE_COMMAND} --build ${PROJECT_BINARY_DIR}/build_cuda_example
|
||||
)
|
||||
|
||||
###############################################################################
|
||||
# C++ 20 modules
|
||||
###############################################################################
|
||||
|
||||
add_custom_target(ci_module_cpp20
|
||||
COMMAND ${CMAKE_COMMAND}
|
||||
-DCMAKE_BUILD_TYPE=Debug -GNinja
|
||||
-S${PROJECT_SOURCE_DIR}/tests/module_cpp20 -B${PROJECT_BINARY_DIR}/ci_module_cpp20
|
||||
COMMAND ${CMAKE_COMMAND} --build ${PROJECT_BINARY_DIR}/ci_module_cpp20
|
||||
)
|
||||
|
||||
###############################################################################
|
||||
# Intel C++ Compiler
|
||||
###############################################################################
|
||||
|
31
cmake/detect_libcpp_version.cpp
Normal file
@ -0,0 +1,31 @@
|
||||
/*
|
||||
* Detect used C++ Standard Library
|
||||
*
|
||||
* This file is compiled and run via try_run in download_test_data.cmake.
|
||||
*/
|
||||
|
||||
#include <cstdio>
|
||||
|
||||
// see https://en.cppreference.com/w/cpp/header/ciso646
|
||||
#if __cplusplus >= 202002L
|
||||
#include <version>
|
||||
#else
|
||||
#include <ciso646>
|
||||
#endif
|
||||
|
||||
int main()
|
||||
{
|
||||
#if defined(_LIBCPP_VERSION)
|
||||
std::printf("LLVM C++ Standard Library (libc++), _LIBCPP_VERSION=%d", _LIBCPP_VERSION);
|
||||
#elif defined(__GLIBCXX__)
|
||||
std::printf("GNU C++ Standard Library (libstdc++), __GLIBCXX__=%d", __GLIBCXX__);
|
||||
#elif defined(_MSVC_STL_VERSION)
|
||||
std::printf("Microsoft C++ Standard Library (MSVC STL), _MSVC_STL_VERSION=%d", _MSVC_STL_VERSION);
|
||||
#elif defined(_LIBCUDACXX_VERSION)
|
||||
std::printf("NVIDIA C++ Standard Library (libcudacxx), _LIBCUDACXX_VERSION=%d", _LIBCUDACXX_VERSION);
|
||||
#elif defined(EASTL_VERSION)
|
||||
std::printf("Electronic Arts Standard Template Library (EASTL), EASTL_VERSION=%d", EASTL_VERSION);
|
||||
#else
|
||||
std::printf("unknown");
|
||||
#endif
|
||||
}
|
@ -54,3 +54,18 @@ else()
|
||||
endif()
|
||||
string(REGEX REPLACE "[ ]*\n" "; " CXX_VERSION_RESULT "${CXX_VERSION_RESULT}")
|
||||
message(STATUS "Compiler: ${CXX_VERSION_RESULT}")
|
||||
|
||||
# determine used C++ standard library (for debug and support purposes)
|
||||
if(NOT DEFINED LIBCPP_VERSION_OUTPUT_CACHED)
|
||||
try_run(RUN_RESULT_VAR COMPILE_RESULT_VAR
|
||||
"${CMAKE_BINARY_DIR}" SOURCES "${CMAKE_SOURCE_DIR}/cmake/detect_libcpp_version.cpp"
|
||||
RUN_OUTPUT_VARIABLE LIBCPP_VERSION_OUTPUT COMPILE_OUTPUT_VARIABLE LIBCPP_VERSION_COMPILE_OUTPUT
|
||||
)
|
||||
if(NOT LIBCPP_VERSION_OUTPUT)
|
||||
set(LIBCPP_VERSION_OUTPUT "Unknown")
|
||||
message(AUTHOR_WARNING "Failed to compile cmake/detect_libcpp_version to detect the used C++ standard library. This does not affect the library or the test cases. Please still create an issue at https://github.com/nlohmann/json to investigate this.\n${LIBCPP_VERSION_COMPILE_OUTPUT}")
|
||||
endif()
|
||||
set(LIBCPP_VERSION_OUTPUT_CACHED "${LIBCPP_VERSION_OUTPUT}" CACHE STRING "Detected C++ standard library version")
|
||||
endif()
|
||||
|
||||
message(STATUS "C++ standard library: ${LIBCPP_VERSION_OUTPUT_CACHED}")
|
||||
|
@ -1,13 +1,13 @@
|
||||
# Warning flags determined for GCC 14.2.0 with https://github.com/nlohmann/gcc_flags:
|
||||
# Warning flags determined for GCC 15.1.0 with https://github.com/nlohmann/gcc_flags:
|
||||
# Ignored GCC warnings:
|
||||
# -Wno-abi-tag We do not care about ABI tags.
|
||||
# -Wno-aggregate-return The library uses aggregate returns.
|
||||
# -Wno-long-long The library uses the long long type to interface with system functions.
|
||||
# -Wno-namespaces The library uses namespaces.
|
||||
# -Wno-nrvo Doctest triggers this warning.
|
||||
# -Wno-padded We do not care about padding warnings.
|
||||
# -Wno-system-headers We do not care about warnings in system headers.
|
||||
# -Wno-templates The library uses templates.
|
||||
# -Wno-abi-tag We do not care about ABI tags.
|
||||
# -Wno-aggregate-return The library uses aggregate returns.
|
||||
# -Wno-long-long The library uses the long long type to interface with system functions.
|
||||
# -Wno-namespaces The library uses namespaces.
|
||||
# -Wno-nrvo Doctest triggers this warning.
|
||||
# -Wno-padded We do not care about padding warnings.
|
||||
# -Wno-system-headers We do not care about warnings in system headers.
|
||||
# -Wno-templates The library uses templates.
|
||||
|
||||
set(GCC_CXXFLAGS
|
||||
-pedantic
|
||||
@ -65,6 +65,7 @@ set(GCC_CXXFLAGS
|
||||
-Wanalyzer-tainted-offset
|
||||
-Wanalyzer-tainted-size
|
||||
-Wanalyzer-too-complex
|
||||
-Wanalyzer-undefined-behavior-ptrdiff
|
||||
-Wanalyzer-undefined-behavior-strtok
|
||||
-Wanalyzer-unsafe-call-within-signal-handler
|
||||
-Wanalyzer-use-after-free
|
||||
@ -123,6 +124,7 @@ set(GCC_CXXFLAGS
|
||||
-Wcoverage-invalid-line-number
|
||||
-Wcoverage-mismatch
|
||||
-Wcoverage-too-many-conditions
|
||||
-Wcoverage-too-many-paths
|
||||
-Wcpp
|
||||
-Wctad-maybe-unsupported
|
||||
-Wctor-dtor-privacy
|
||||
@ -130,6 +132,7 @@ set(GCC_CXXFLAGS
|
||||
-Wdangling-pointer=2
|
||||
-Wdangling-reference
|
||||
-Wdate-time
|
||||
-Wdefaulted-function-deleted
|
||||
-Wdelete-incomplete
|
||||
-Wdelete-non-virtual-dtor
|
||||
-Wdeprecated
|
||||
@ -138,6 +141,8 @@ set(GCC_CXXFLAGS
|
||||
-Wdeprecated-declarations
|
||||
-Wdeprecated-enum-enum-conversion
|
||||
-Wdeprecated-enum-float-conversion
|
||||
-Wdeprecated-literal-operator
|
||||
-Wdeprecated-variadic-comma-omission
|
||||
-Wdisabled-optimization
|
||||
-Wdiv-by-zero
|
||||
-Wdouble-promotion
|
||||
@ -157,20 +162,21 @@ set(GCC_CXXFLAGS
|
||||
-Wfloat-conversion
|
||||
-Wfloat-equal
|
||||
-Wformat -Wformat-contains-nul
|
||||
-Wformat -Wformat-diag
|
||||
-Wformat -Wformat-extra-args
|
||||
-Wformat -Wformat-nonliteral
|
||||
-Wformat -Wformat-overflow=2
|
||||
-Wformat -Wformat-security
|
||||
-Wformat -Wformat-signedness
|
||||
-Wformat -Wformat-truncation=2
|
||||
-Wformat -Wformat-y2k
|
||||
-Wformat -Wformat-zero-length
|
||||
-Wformat-diag
|
||||
-Wformat-overflow=2
|
||||
-Wformat-signedness
|
||||
-Wformat-truncation=2
|
||||
-Wformat=2
|
||||
-Wframe-address
|
||||
-Wfree-nonheap-object
|
||||
-Wglobal-module
|
||||
-Whardened
|
||||
-Wheader-guard
|
||||
-Whsa
|
||||
-Wif-not-aligned
|
||||
-Wignored-attributes
|
||||
@ -197,6 +203,7 @@ set(GCC_CXXFLAGS
|
||||
-Wno-long-long
|
||||
-Wlto-type-mismatch
|
||||
-Wmain
|
||||
-Wmaybe-musttail-local-addr
|
||||
-Wmaybe-uninitialized
|
||||
-Wmemset-elt-size
|
||||
-Wmemset-transposed-args
|
||||
@ -215,6 +222,7 @@ set(GCC_CXXFLAGS
|
||||
-Wmultichar
|
||||
-Wmultiple-inheritance
|
||||
-Wmultistatement-macros
|
||||
-Wmusttail-local-addr
|
||||
-Wno-namespaces
|
||||
-Wnarrowing
|
||||
-Wnoexcept
|
||||
@ -245,6 +253,7 @@ set(GCC_CXXFLAGS
|
||||
-Wpmf-conversions
|
||||
-Wpointer-arith
|
||||
-Wpointer-compare
|
||||
-Wpragma-once-outside-header
|
||||
-Wpragmas
|
||||
-Wprio-ctor-dtor
|
||||
-Wpsabi
|
||||
@ -276,10 +285,12 @@ set(GCC_CXXFLAGS
|
||||
-Wsizeof-pointer-div
|
||||
-Wsizeof-pointer-memaccess
|
||||
-Wstack-protector
|
||||
-Wstrict-aliasing
|
||||
-Wstrict-aliasing=3
|
||||
-Wstrict-null-sentinel
|
||||
-Wstrict-overflow
|
||||
-Wstring-compare
|
||||
-Wstringop-overflow
|
||||
-Wstringop-overflow=4
|
||||
-Wstringop-overread
|
||||
-Wstringop-truncation
|
||||
@ -304,8 +315,12 @@ set(GCC_CXXFLAGS
|
||||
-Wsynth
|
||||
-Wno-system-headers
|
||||
-Wtautological-compare
|
||||
-Wtemplate-body
|
||||
-Wtemplate-id-cdtor
|
||||
-Wtemplate-names-tu-local
|
||||
-Wno-templates
|
||||
-Wterminate
|
||||
-Wtrailing-whitespace
|
||||
-Wtrampolines
|
||||
-Wtrigraphs
|
||||
-Wtrivial-auto-var-init
|
||||
|
@ -1 +1 @@
|
||||
cpplint==2.0.1
|
||||
cpplint==2.0.2
|
||||
|
@ -8,7 +8,7 @@ struct adl_serializer;
|
||||
Serializer that uses ADL ([Argument-Dependent Lookup](https://en.cppreference.com/w/cpp/language/adl)) to choose
|
||||
`to_json`/`from_json` functions from the types' namespaces.
|
||||
|
||||
It is implemented similar to
|
||||
It is implemented similarly to
|
||||
|
||||
```cpp
|
||||
template<typename ValueType>
|
||||
|
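// A minimal sketch of the usual ADL opt-in that adl_serializer relies on:
// to_json/from_json defined in the type's own namespace, which ADL then finds.
// The namespace `ns` and struct `person` below are assumptions for illustration only.
#include <string>
#include <nlohmann/json.hpp>

namespace ns
{
    struct person
    {
        std::string name;
        int age;
    };

    void to_json(nlohmann::json& j, const person& p)
    {
        j = nlohmann::json{{"name", p.name}, {"age", p.age}};
    }

    void from_json(const nlohmann::json& j, person& p)
    {
        j.at("name").get_to(p.name);
        j.at("age").get_to(p.age);
    }
}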
@ -4,12 +4,14 @@
|
||||
// (1)
|
||||
template<typename InputType>
|
||||
static bool accept(InputType&& i,
|
||||
const bool ignore_comments = false);
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false);
|
||||
|
||||
// (2)
|
||||
template<typename IteratorType>
|
||||
static bool accept(IteratorType first, IteratorType last,
|
||||
const bool ignore_comments = false);
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false);
|
||||
```
|
||||
|
||||
Checks whether the input is valid JSON.
|
||||
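
As a hedged sketch (assuming version 3.12.1 or later for the `ignore_trailing_commas` parameter), a typical call looks like:

```cpp
#include <nlohmann/json.hpp>
#include <iostream>
#include <string>

int main()
{
    const std::string s = R"([1, 2, 3,])";  // note the trailing comma

    std::cout << std::boolalpha
              << nlohmann::json::accept(s) << '\n'               // false: trailing comma rejected
              << nlohmann::json::accept(s, false, true) << '\n'; // true: trailing comma ignored
}
```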
@ -50,6 +52,10 @@ Unlike the [`parse()`](parse.md) function, this function neither throws an excep
|
||||
: whether comments should be ignored and treated like whitespace (`#!cpp true`) or yield a parse error
|
||||
(`#!cpp false`); (optional, `#!cpp false` by default)
|
||||
|
||||
`ignore_trailing_commas` (in)
|
||||
: whether trailing commas in arrays or objects should be ignored and treated like whitespace (`#!cpp true`) or yield a parse error
|
||||
(`#!cpp false`); (optional, `#!cpp false` by default)
|
||||
|
||||
`first` (in)
|
||||
: iterator to the start of the character range
|
||||
|
||||
@ -102,6 +108,7 @@ A UTF-8 byte order mark is silently ignored.
|
||||
- Added in version 3.0.0.
|
||||
- Ignoring comments via `ignore_comments` added in version 3.9.0.
|
||||
- Changed [runtime assertion](../../features/assertions.md) in case of `FILE*` null pointers to exception in version 3.12.0.
|
||||
- Added `ignore_trailing_commas` in version 3.12.1.
|
||||
|
||||
!!! warning "Deprecation"
|
||||
|
||||
|
@ -74,7 +74,7 @@ basic_json(basic_json&& other) noexcept;
|
||||
- **boolean**: `boolean_t` / `bool` can be used.
|
||||
- **binary**: `binary_t` / `std::vector<uint8_t>` may be used; unfortunately because string literals cannot be
|
||||
distinguished from binary character arrays by the C++ type system, all types compatible with `const char*` will be
|
||||
directed to the string constructor instead. This is both for backwards compatibility, and due to the fact that a
|
||||
directed to the string constructor instead. This is both for backwards compatibility and due to the fact that a
|
||||
binary type is not a standard JSON type.
|
||||
|
||||
See the examples below.
|
||||
|
@ -4,7 +4,7 @@
|
||||
bool empty() const noexcept;
|
||||
```
|
||||
|
||||
Checks if a JSON value has no elements (i.e. whether its [`size()`](size.md) is `0`).
|
||||
Checks if a JSON value has no elements (i.e., whether its [`size()`](size.md) is `0`).
|
||||
|
||||
## Return value
|
||||
|
||||
|
@ -29,11 +29,11 @@ void insert(const_iterator first, const_iterator last);
|
||||
For all cases where an element is added to an **array**, a reallocation can happen, in which case all iterators
|
||||
(including the [`end()`](end.md) iterator) and all references to the elements are invalidated. Otherwise, only the
|
||||
[`end()`](end.md) iterator is invalidated. Also, any iterator or reference after the insertion point will point to the
|
||||
same index which is now a different value.
|
||||
same index, which is now a different value.
|
||||
|
||||
For [`ordered_json`](../ordered_json.md), also adding an element to an **object** can yield a reallocation which again
|
||||
invalidates all iterators and all references. Also, any iterator or reference after the insertion point will point to
|
||||
the same index which is now a different value.
|
||||
the same index, which is now a different value.
|
||||
|
||||
## Parameters
|
||||
|
||||
|
@ -16,7 +16,7 @@ Examples of such functionality might be metadata, additional member functions (e
|
||||
|
||||
#### Default type
|
||||
|
||||
The default value for `CustomBaseClass` is `void`. In this case an
|
||||
The default value for `CustomBaseClass` is `void`. In this case, an
|
||||
[empty base class](https://en.cppreference.com/w/cpp/language/ebo) is used and no additional functionality is injected.
|
||||
|
||||
#### Limitations
|
||||
|
@ -6,14 +6,16 @@ template<typename InputType>
|
||||
static basic_json parse(InputType&& i,
|
||||
const parser_callback_t cb = nullptr,
|
||||
const bool allow_exceptions = true,
|
||||
const bool ignore_comments = false);
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false);
|
||||
|
||||
// (2)
|
||||
template<typename IteratorType>
|
||||
static basic_json parse(IteratorType first, IteratorType last,
|
||||
const parser_callback_t cb = nullptr,
|
||||
const bool allow_exceptions = true,
|
||||
const bool ignore_comments = false);
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false);
|
||||
```
|
||||
|
||||
1. Deserialize from a compatible input.
|
||||
@ -56,6 +58,10 @@ static basic_json parse(IteratorType first, IteratorType last,
|
||||
: whether comments should be ignored and treated like whitespace (`#!cpp true`) or yield a parse error
|
||||
(`#!cpp false`); (optional, `#!cpp false` by default)
|
||||
|
||||
`ignore_trailing_commas` (in)
|
||||
: whether trailing commas in arrays or objects should be ignored and treated like whitespace (`#!cpp true`) or yield a parse error
|
||||
(`#!cpp false`); (optional, `#!cpp false` by default)
|
||||
|
||||
`first` (in)
|
||||
: iterator to the start of a character range
|
||||
|
||||
@ -189,6 +195,34 @@ A UTF-8 byte order mark is silently ignored.
|
||||
--8<-- "examples/parse__allow_exceptions.output"
|
||||
```
|
||||
|
||||
??? example "Effect of `ignore_comments` parameter"
|
||||
|
||||
The example below demonstrates the effect of the `ignore_comments` parameter in the `parse()` function.
|
||||
|
||||
```cpp
|
||||
--8<-- "examples/comments.cpp"
|
||||
```
|
||||
|
||||
Output:
|
||||
|
||||
```
|
||||
--8<-- "examples/comments.output"
|
||||
```
|
||||
|
||||
??? example "Effect of `ignore_trailing_commas` parameter"
|
||||
|
||||
The example below demonstrates the effect of the `ignore_trailing_commas` parameter in the `parse()` function.
|
||||
|
||||
```cpp
|
||||
--8<-- "examples/trailing_commas.cpp"
|
||||
```
|
||||
|
||||
Output:
|
||||
|
||||
```
|
||||
--8<-- "examples/trailing_commas.output"
|
||||
```
|
||||
|
||||
## See also
|
||||
|
||||
- [accept](accept.md) - check if the input is valid JSON
|
||||
@ -200,6 +234,7 @@ A UTF-8 byte order mark is silently ignored.
|
||||
- Overload for contiguous containers (1) added in version 2.0.3.
|
||||
- Ignoring comments via `ignore_comments` added in version 3.9.0.
|
||||
- Changed [runtime assertion](../../features/assertions.md) in case of `FILE*` null pointers to exception in version 3.12.0.
|
||||
- Added `ignore_trailing_commas` in version 3.12.1.
|
||||
|
||||
!!! warning "Deprecation"
|
||||
|
||||
|
@ -7,7 +7,8 @@ static bool sax_parse(InputType&& i,
|
||||
SAX* sax,
|
||||
input_format_t format = input_format_t::json,
|
||||
const bool strict = true,
|
||||
const bool ignore_comments = false);
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false);
|
||||
|
||||
// (2)
|
||||
template<class IteratorType, class SAX>
|
||||
@ -15,7 +16,8 @@ static bool sax_parse(IteratorType first, IteratorType last,
|
||||
SAX* sax,
|
||||
input_format_t format = input_format_t::json,
|
||||
const bool strict = true,
|
||||
const bool ignore_comments = false);
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false);
|
||||
```
|
||||
|
||||
Read from input and generate SAX events
|
||||
@ -65,6 +67,10 @@ The SAX event listener must follow the interface of [`json_sax`](../json_sax/index
|
||||
: whether comments should be ignored and treated like whitespace (`#!cpp true`) or yield a parse error
|
||||
(`#!cpp false`); (optional, `#!cpp false` by default)
|
||||
|
||||
`ignore_trailing_commas` (in)
|
||||
: whether trailing commas in arrays or objects should be ignored and treated like whitespace (`#!cpp true`) or yield a parse error
|
||||
(`#!cpp false`); (optional, `#!cpp false` by default)
|
||||
|
||||
`first` (in)
|
||||
: iterator to the start of a character range
|
||||
|
||||
@ -107,6 +113,7 @@ A UTF-8 byte order mark is silently ignored.
|
||||
|
||||
- Added in version 3.2.0.
|
||||
- Ignoring comments via `ignore_comments` added in version 3.9.0.
|
||||
- Added `ignore_trailing_commas` in version 3.12.1.
|
||||
|
||||
!!! warning "Deprecation"
|
||||
|
||||
|
@ -79,7 +79,7 @@ template<typename BasicJsonType>
|
||||
void from_json(const BasicJsonType&, type&);
|
||||
```
|
||||
|
||||
Macros 3 and 6 add one function to the namespace which takes care of the serialization only:
|
||||
Macros 3 and 6 add one function to the namespace, which takes care of the serialization only:
|
||||
|
||||
```cpp
|
||||
template<typename BasicJsonType>
|
||||
|
@ -46,7 +46,7 @@ template<typename BasicJsonType>
|
||||
friend void from_json(const BasicJsonType&, type&); // except (3)
|
||||
```
|
||||
|
||||
See examples below for the concrete generated code.
|
||||
See the examples below for the concrete generated code.
|
||||
|
||||
## Notes
|
||||
|
||||
|
@ -46,7 +46,7 @@ template<typename BasicJsonType>
|
||||
void from_json(const BasicJsonType&, type&); // except (3)
|
||||
```
|
||||
|
||||
See examples below for the concrete generated code.
|
||||
See the examples below for the concrete generated code.
|
||||
|
||||
## Notes
|
||||
|
||||
|
@ -4,8 +4,8 @@
|
||||
#define NLOHMANN_JSON_SERIALIZE_ENUM(type, conversion...)
|
||||
```
|
||||
|
||||
By default, enum values are serialized to JSON as integers. In some cases this could result in undesired behavior. If an
|
||||
enum is modified or re-ordered after data has been serialized to JSON, the later deserialized JSON data may be
|
||||
By default, enum values are serialized to JSON as integers. In some cases, this could result in undesired behavior. If
|
||||
an enum is modified or re-ordered after data has been serialized to JSON, the later deserialized JSON data may be
|
||||
undefined or a different enum value than was originally intended.
|
||||
|
||||
The `NLOHMANN_JSON_SERIALIZE_ENUM` macro allows defining a custom serialization for every enumerator.
|
||||
|
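
A minimal sketch of the intended usage (the enum `TaskState` is an assumption for illustration):

```cpp
#include <nlohmann/json.hpp>

enum class TaskState { stopped, running, invalid = -1 };

// map each enumerator to a JSON value; the first entry is the fallback
// returned when an unknown value is deserialized
NLOHMANN_JSON_SERIALIZE_ENUM(TaskState, {
    {TaskState::invalid, nullptr},
    {TaskState::stopped, "stopped"},
    {TaskState::running, "running"},
})

int main()
{
    nlohmann::json j = TaskState::running;   // serializes to "running"
    auto s = j.get<TaskState>();             // deserializes back to TaskState::running
    return s == TaskState::running ? 0 : 1;
}
```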
@ -11,7 +11,7 @@ This type preserves the insertion order of object keys.
|
||||
The type is based on [`ordered_map`](ordered_map.md) which in turn uses a `std::vector` to store object elements.
|
||||
Therefore, adding object elements can yield a reallocation in which case all iterators (including the
|
||||
[`end()`](basic_json/end.md) iterator) and all references to the elements are invalidated. Also, any iterator or
|
||||
reference after the insertion point will point to the same index which is now a different value.
|
||||
reference after the insertion point will point to the same index, which is now a different value.
|
||||
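
As a brief sketch of the behavior this implies:

```cpp
#include <nlohmann/json.hpp>
#include <iostream>

int main()
{
    nlohmann::ordered_json j;
    j["zebra"] = 1;
    j["apple"] = 2;

    // keys keep their insertion order instead of being sorted
    std::cout << j.dump() << '\n';  // {"zebra":1,"apple":2}
}
```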
|
||||
## Examples
|
||||
|
||||
|
@ -28,6 +28,7 @@ violations will result in a failed build.
|
||||
| AppleClang 16.0.0.16000026; Xcode 16 | arm64 | macOS 15.2 (Sequoia) | GitHub |
|
||||
| AppleClang 16.0.0.16000026; Xcode 16.1 | arm64 | macOS 15.2 (Sequoia) | GitHub |
|
||||
| AppleClang 16.0.0.16000026; Xcode 16.2 | arm64 | macOS 15.2 (Sequoia) | GitHub |
|
||||
| AppleClang 17.0.0.17000013; Xcode 16.3 | arm64 | macOS 15.5 (Sequoia) | GitHub |
|
||||
| Clang 3.5.2 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| Clang 3.6.2 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| Clang 3.7.1 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
@ -57,6 +58,7 @@ violations will result in a failed build.
|
||||
| Clang 19.1.7 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| Clang 20.1.1 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| Clang 21.0.0 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| Emscripten 4.0.6 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| GNU 4.8.5 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| GNU 4.9.3 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| GNU 5.5.0 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
@ -75,6 +77,8 @@ violations will result in a failed build.
|
||||
| GNU 13.3.0 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| GNU 14.2.0 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| GNU 14.2.0 | arm64 | Linux 6.1.100 | Cirrus CI |
|
||||
| GNU 15.1.0 | x86_64 | Ubuntu 22.04.1 LTS | GitHub |
|
||||
| icpc (ICC) 2021.5.0 20211109 | x86_64 | Ubuntu 20.04.3 LTS | GitHub |
|
||||
| MSVC 19.0.24241.7 | x86 | Windows 8.1 | AppVeyor |
|
||||
| MSVC 19.16.27035.0 | x86 | Windows-10 (Build 14393) | AppVeyor |
|
||||
| MSVC 19.29.30157.0 | x86 | Windows 10 (Build 17763) | GitHub |
|
||||
|
31
docs/mkdocs/docs/examples/comments.cpp
Normal file
@ -0,0 +1,31 @@
|
||||
|
||||
#include <iostream>
|
||||
#include <nlohmann/json.hpp>
|
||||
|
||||
using json = nlohmann::json;
|
||||
|
||||
int main()
|
||||
{
|
||||
std::string s = R"(
|
||||
{
|
||||
// update in 2006: removed Pluto
|
||||
"planets": ["Mercury", "Venus", "Earth", "Mars",
|
||||
"Jupiter", "Uranus", "Neptune" /*, "Pluto" */]
|
||||
}
|
||||
)";
|
||||
|
||||
try
|
||||
{
|
||||
json j = json::parse(s);
|
||||
}
|
||||
catch (json::exception& e)
|
||||
{
|
||||
std::cout << e.what() << std::endl;
|
||||
}
|
||||
|
||||
json j = json::parse(s,
|
||||
/* callback */ nullptr,
|
||||
/* allow exceptions */ true,
|
||||
/* ignore_comments */ true);
|
||||
std::cout << j.dump(2) << '\n';
|
||||
}
|
12
docs/mkdocs/docs/examples/comments.output
Normal file
@ -0,0 +1,12 @@
|
||||
[json.exception.parse_error.101] parse error at line 3, column 9: syntax error while parsing object key - invalid literal; last read: '<U+000A> {<U+000A> /'; expected string literal
|
||||
{
|
||||
"planets": [
|
||||
"Mercury",
|
||||
"Venus",
|
||||
"Earth",
|
||||
"Mars",
|
||||
"Jupiter",
|
||||
"Uranus",
|
||||
"Neptune"
|
||||
]
|
||||
}
|
37
docs/mkdocs/docs/examples/trailing_commas.cpp
Normal file
@ -0,0 +1,37 @@
|
||||
#include <iostream>
|
||||
#include <nlohmann/json.hpp>
|
||||
|
||||
using json = nlohmann::json;
|
||||
|
||||
int main()
|
||||
{
|
||||
std::string s = R"(
|
||||
{
|
||||
"planets": [
|
||||
"Mercury",
|
||||
"Venus",
|
||||
"Earth",
|
||||
"Mars",
|
||||
"Jupiter",
|
||||
"Uranus",
|
||||
"Neptune",
|
||||
]
|
||||
}
|
||||
)";
|
||||
|
||||
try
|
||||
{
|
||||
json j = json::parse(s);
|
||||
}
|
||||
catch (json::exception& e)
|
||||
{
|
||||
std::cout << e.what() << std::endl;
|
||||
}
|
||||
|
||||
json j = json::parse(s,
|
||||
/* callback */ nullptr,
|
||||
/* allow exceptions */ true,
|
||||
/* ignore_comments */ false,
|
||||
/* ignore_trailing_commas */ true);
|
||||
std::cout << j.dump(2) << '\n';
|
||||
}
|
12
docs/mkdocs/docs/examples/trailing_commas.output
Normal file
@ -0,0 +1,12 @@
|
||||
[json.exception.parse_error.101] parse error at line 11, column 9: syntax error while parsing value - unexpected ']'; expected '[', '{', or a literal
|
||||
{
|
||||
"planets": [
|
||||
"Mercury",
|
||||
"Venus",
|
||||
"Earth",
|
||||
"Mars",
|
||||
"Jupiter",
|
||||
"Uranus",
|
||||
"Neptune"
|
||||
]
|
||||
}
|
@ -1,7 +1,7 @@
|
||||
# CBOR
|
||||
|
||||
The Concise Binary Object Representation (CBOR) is a data format whose design goals include the possibility of extremely
|
||||
small code size, fairly small message size, and extensibility without the need for version negotiation.
|
||||
The Concise Binary Object Representation (CBOR) is a data format whose design goals include the possibility of
|
||||
extremely small code sizes, fairly small message size, and extensibility without the need for version negotiation.
|
||||
|
||||
!!! abstract "References"
|
||||
|
||||
|
@ -1,7 +1,7 @@
|
||||
# Binary Values
|
||||
|
||||
The library implements several [binary formats](binary_formats/index.md) that encode JSON in an efficient way. Most of
|
||||
these formats support binary values; that is, values that have semantics define outside the library and only define a
|
||||
these formats support binary values; that is, values that have semantics defined outside the library and only define a
|
||||
sequence of bytes to be stored.
|
||||
|
||||
JSON itself does not have a binary value. As such, binary values are an extension that this library implements to store
|
||||
@ -189,7 +189,7 @@ as an array of uint8 values. The library implements this translation.
|
||||
|
||||
### BSON
|
||||
|
||||
[BSON](binary_formats/bson.md) supports binary values and subtypes. If a subtype is given, it is used and added as
|
||||
[BSON](binary_formats/bson.md) supports binary values and subtypes. If a subtype is given, it is used and added as an
|
||||
unsigned 8-bit integer. If no subtype is given, the generic binary subtype 0x00 is used.
|
||||
|
||||
??? example
|
||||
@ -274,7 +274,7 @@ byte array.
|
||||
|
||||
[MessagePack](binary_formats/messagepack.md) supports binary values and subtypes. If a subtype is given, the ext family
|
||||
is used. The library will choose the smallest representation among fixext1, fixext2, fixext4, fixext8, ext8, ext16, and
|
||||
ext32. The subtype is then added as signed 8-bit integer.
|
||||
ext32. The subtype is then added as a signed 8-bit integer.
|
||||
|
||||
If no subtype is given, the bin family (bin8, bin16, bin32) is used.
|
||||
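
A rough sketch of creating a binary value with a subtype and serializing it to MessagePack:

```cpp
#include <nlohmann/json.hpp>
#include <cstdint>
#include <vector>

int main()
{
    nlohmann::json j;
    // binary value with subtype 42; MessagePack encodes it with the ext family
    j["data"] = nlohmann::json::binary({0xCA, 0xFE}, 42);

    // without a subtype, the bin family (bin8, bin16, bin32) would be used instead
    std::vector<std::uint8_t> msgpack = nlohmann::json::to_msgpack(j);
    return msgpack.empty() ? 1 : 0;
}
```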
|
||||
|
@ -11,7 +11,9 @@ This library does not support comments *by default*. It does so for three reason
|
||||
|
||||
3. It is dangerous for interoperability if some libraries add comment support while others do not. Please check [The Harmful Consequences of the Robustness Principle](https://tools.ietf.org/html/draft-iab-protocol-maintenance-01) on this.
|
||||
|
||||
However, you can pass set parameter `ignore_comments` to `#!c true` in the parse function to ignore `//` or `/* */` comments. Comments will then be treated as whitespace.
|
||||
However, you can set parameter `ignore_comments` to `#!cpp true` in the [`parse`](../api/basic_json/parse.md) function to ignore `//` or `/* */` comments. Comments will then be treated as whitespace.
|
||||
|
||||
For more information, see [JSON With Commas and Comments (JWCC)](https://nigeltao.github.io/blog/2021/json-with-commas-comments.html).
|
||||
|
||||
!!! example
|
||||
|
||||
@ -28,56 +30,11 @@ However, you can pass set parameter `ignore_comments` to `#!c true` in the parse
|
||||
When calling `parse` without additional arguments, a parse error exception is thrown. If `ignore_comments` is set to `#!cpp true`, the comments are ignored during parsing:
|
||||
|
||||
```cpp
|
||||
#include <iostream>
|
||||
#include "json.hpp"
|
||||
|
||||
using json = nlohmann::json;
|
||||
|
||||
int main()
|
||||
{
|
||||
std::string s = R"(
|
||||
{
|
||||
// update in 2006: removed Pluto
|
||||
"planets": ["Mercury", "Venus", "Earth", "Mars",
|
||||
"Jupiter", "Uranus", "Neptune" /*, "Pluto" */]
|
||||
}
|
||||
)";
|
||||
|
||||
try
|
||||
{
|
||||
json j = json::parse(s);
|
||||
}
|
||||
catch (json::exception &e)
|
||||
{
|
||||
std::cout << e.what() << std::endl;
|
||||
}
|
||||
|
||||
json j = json::parse(s,
|
||||
/* callback */ nullptr,
|
||||
/* allow exceptions */ true,
|
||||
/* ignore_comments */ true);
|
||||
std::cout << j.dump(2) << '\n';
|
||||
}
|
||||
--8<-- "examples/comments.cpp"
|
||||
```
|
||||
|
||||
Output:
|
||||
|
||||
```
|
||||
[json.exception.parse_error.101] parse error at line 3, column 9:
|
||||
syntax error while parsing object key - invalid literal;
|
||||
last read: '<U+000A> {<U+000A> /'; expected string literal
|
||||
```
|
||||
|
||||
```json
|
||||
{
|
||||
"planets": [
|
||||
"Mercury",
|
||||
"Venus",
|
||||
"Earth",
|
||||
"Mars",
|
||||
"Jupiter",
|
||||
"Uranus",
|
||||
"Neptune"
|
||||
]
|
||||
}
|
||||
--8<-- "examples/comments.output"
|
||||
```
|
||||
|
@ -1,8 +1,8 @@
|
||||
# Specializing enum conversion
|
||||
|
||||
By default, enum values are serialized to JSON as integers. In some cases this could result in undesired behavior. If an
|
||||
enum is modified or re-ordered after data has been serialized to JSON, the later deserialized JSON data may be
|
||||
undefined or a different enum value than was originally intended.
|
||||
By default, enum values are serialized to JSON as integers. In some cases, this could result in undesired behavior. If
|
||||
the integer values of any enum values are changed after data using those enum values has been serialized to JSON, then
|
||||
deserializing that JSON would result in a different enum value being restored, or the value not being found at all.
|
||||
|
||||
It is possible to more precisely specify how a given enum is mapped to and from JSON as shown below:
|
||||
|
||||
|
@ -37,7 +37,7 @@ When iterating over objects, values are ordered with respect to the `object_comp
|
||||
|
||||
The reason for the order is the lexicographic ordering of the object keys "one", "three", "two".
|
||||
|
||||
### Access object key during iteration
|
||||
### Access object keys during iteration
|
||||
|
||||
The JSON iterators have two member functions, `key()` and `value()` to access the object key and stored value, respectively. When calling `key()` on a non-object iterator, an [invalid_iterator.207](../home/exceptions.md#jsonexceptioninvalid_iterator207) exception is thrown.
|
||||
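
For instance, a minimal sketch of iterating an object and printing both parts:

```cpp
#include <nlohmann/json.hpp>
#include <iostream>

int main()
{
    const nlohmann::json j = {{"one", 1}, {"two", 2}};

    for (auto it = j.begin(); it != j.end(); ++it)
    {
        std::cout << it.key() << " : " << it.value() << '\n';
    }
}
```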
|
||||
|
@ -42,7 +42,7 @@ Note `/` does not identify the root (i.e., the whole document), but an object en
|
||||
JSON Pointers can be created from a string:
|
||||
|
||||
```cpp
|
||||
json::json_pointer p = "/nested/one";
|
||||
json::json_pointer p("/nested/one");
|
||||
```
|
||||
|
||||
Furthermore, a user-defined string literal can be used to achieve the same result:
|
||||
|
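
A sketch of what that can look like (the `_json_pointer` literal lives in namespace `nlohmann::literals`):

```cpp
#include <nlohmann/json.hpp>

using json = nlohmann::json;
using namespace nlohmann::literals;  // provides _json and _json_pointer

int main()
{
    json j = {{"nested", {{"one", 1}}}};
    auto p = "/nested/one"_json_pointer;
    return j[p] == 1 ? 0 : 1;
}
```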
@ -64,7 +64,7 @@ configurations – to be used in cases where the linker would otherwise output u
|
||||
|
||||
To do so, define [`NLOHMANN_JSON_NAMESPACE_NO_VERSION`](../api/macros/nlohmann_json_namespace_no_version.md) to `1`.
|
||||
|
||||
This applies to version 3.11.2 and above only, versions 3.11.0 and 3.11.1 can apply the technique described in the next
|
||||
This applies to version 3.11.2 and above only; versions 3.11.0 and 3.11.1 can apply the technique described in the next
|
||||
section to emulate the effect of the `NLOHMANN_JSON_NAMESPACE_NO_VERSION` macro.
|
||||
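
A minimal sketch (assuming the macro is defined before the header is first included):

```cpp
// disable the version component of the inline namespace (3.11.2 and above)
#define NLOHMANN_JSON_NAMESPACE_NO_VERSION 1
#include <nlohmann/json.hpp>
```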
|
||||
!!! danger "Use at your own risk"
|
||||
|
39
docs/mkdocs/docs/features/trailing_commas.md
Normal file
@ -0,0 +1,39 @@
|
||||
# Trailing Commas
|
||||
|
||||
Like [comments](comments.md), this library does not support trailing commas in arrays and objects *by default*.
|
||||
|
||||
You can set parameter `ignore_trailing_commas` to `#!cpp true` in the [`parse`](../api/basic_json/parse.md) function to allow trailing commas in arrays and objects. Note that a single comma as the only content of the array or object (`[,]` or `{,}`) is not allowed, and multiple trailing commas (`[1,,]`) are not allowed either.
|
||||
|
||||
This library does not add trailing commas when serializing JSON data.
|
||||
|
||||
For more information, see [JSON With Commas and Comments (JWCC)](https://nigeltao.github.io/blog/2021/json-with-commas-comments.html).
|
||||
|
||||
!!! example
|
||||
|
||||
Consider the following JSON with trailing commas.
|
||||
|
||||
```json
|
||||
{
|
||||
"planets": [
|
||||
"Mercury",
|
||||
"Venus",
|
||||
"Earth",
|
||||
"Mars",
|
||||
"Jupiter",
|
||||
"Uranus",
|
||||
"Neptune",
|
||||
]
|
||||
}
|
||||
```
|
||||
|
||||
When calling `parse` without additional arguments, a parse error exception is thrown. If `ignore_trailing_commas` is set to `#!cpp true`, the trailing commas are ignored during parsing:
|
||||
|
||||
```cpp
|
||||
--8<-- "examples/trailing_commas.cpp"
|
||||
```
|
||||
|
||||
Output:
|
||||
|
||||
```
|
||||
--8<-- "examples/trailing_commas.output"
|
||||
```
|
@ -13,7 +13,7 @@ the result of an internet search. If you know further customers of the library,
|
||||
|
||||
- [**Alexa Auto SDK**](https://github.com/alexa/alexa-auto-sdk), a software development kit enabling the integration of Alexa into automotive systems
|
||||
- [**Apollo**](https://github.com/ApolloAuto/apollo), a framework for building autonomous driving systems
|
||||
- [**Automotive Grade Linux (AGL)**](https://download.automotivelinux.org/AGL/release/jellyfish/latest/qemux86-64/deploy/licenses/nlohmann-json/): a collaborative open-source platform for automotive software development
|
||||
- [**Automotive Grade Linux (AGL)**](https://download.automotivelinux.org/AGL/release/jellyfish/latest/qemux86-64/deploy/licenses/nlohmann-json/), a collaborative open-source platform for automotive software development
|
||||
- [**Genesis Motor** (infotainment)](http://webmanual.genesis.com/ccIC/AVNT/JW/KOR/English/reference010.html), a luxury automotive brand
|
||||
- [**Hyundai** (infotainment)](https://www.hyundai.com/wsvc/ww/download.file.do?id=/content/hyundai/ww/data/opensource/data/GN7-2022/licenseCode/info), a global automotive brand
|
||||
- [**Kia** (infotainment)](http://webmanual.kia.com/PREM_GEN6/AVNT/RJPE/KOR/Korean/reference010.html), a global automotive brand
|
||||
@ -23,30 +23,36 @@ the result of an internet search. If you know further customers of the library,
|
||||
|
||||
## Gaming and Entertainment
|
||||
|
||||
- [**Assassin's Creed: Mirage**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): a stealth-action game set in the Middle East, focusing on the journey of a young assassin with classic parkour and stealth mechanics
|
||||
- [**Chasm: The Rift**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): a first-person shooter blending horror and adventure, where players navigate dark realms and battle monsters
|
||||
- [**College Football 25**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): a college football simulation game featuring gameplay that mimics real-life college teams and competitions
|
||||
- [**Concepts**](https://concepts.app/en/licenses): a digital sketching app designed for creative professionals, offering flexible drawing tools for illustration, design, and brainstorming
|
||||
- [**Depthkit**](https://www.depthkit.tv/third-party-licenses): a tool for creating and capturing volumetric video, enabling immersive 3D experiences and interactive content
|
||||
- [**immersivetech**](https://immersitech.io/open-source-third-party-software/): a technology company focused on immersive experiences, providing tools and solutions for virtual and augmented reality applications
|
||||
- [**Assassin's Creed: Mirage**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), a stealth-action game set in the Middle East, focusing on the journey of a young assassin with classic parkour and stealth mechanics
|
||||
- [**Chasm: The Rift**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), a first-person shooter blending horror and adventure, where players navigate dark realms and battle monsters
|
||||
- [**College Football 25**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), a college football simulation game featuring gameplay that mimics real-life college teams and competitions
|
||||
- [**Concepts**](https://concepts.app/en/licenses), a digital sketching app designed for creative professionals, offering flexible drawing tools for illustration, design, and brainstorming
|
||||
- [**Depthkit**](https://www.depthkit.tv/third-party-licenses), a tool for creating and capturing volumetric video, enabling immersive 3D experiences and interactive content
|
||||
- [**IMG.LY**](https://img.ly/acknowledgements), a platform offering creative tools and SDKs for integrating advanced image and video editing in applications
|
||||
- [**LOOT**](https://loot.readthedocs.io/_/downloads/en/0.13.0/pdf/), a tool for optimizing the load order of game plugins, commonly used in The Elder Scrolls and Fallout series
|
||||
- [**Madden NFL 25**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): a sports simulation game capturing the excitement of American football with realistic gameplay and team management features
|
||||
- [**Madden NFL 25**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), a sports simulation game capturing the excitement of American football with realistic gameplay and team management features
|
||||
- [**Marne**](https://marne.io/licenses), an unofficial private server platform for hosting custom Battlefield 1 game experiences
|
||||
- [**Minecraft**](https://www.minecraft.net/zh-hant/attribution), a popular sandbox video game
|
||||
- [**NHL 22**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): a hockey simulation game offering realistic gameplay, team management, and various modes to enhance the hockey experience
|
||||
- [**Pixelpart**](https://pixelpart.net/documentation/book/third-party.html): a 2D animation and video compositing software that allows users to create animated graphics and visual effects with a focus on simplicity and ease of use
|
||||
- [**Red Dead Redemption II**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): an open-world action-adventure game following an outlaw's story in the late 1800s, emphasizing deep storytelling and immersive gameplay
|
||||
- [**NHL 22**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), a hockey simulation game offering realistic gameplay, team management, and various modes to enhance the hockey experience
|
||||
- [**Pixelpart**](https://pixelpart.net/documentation/book/third-party.html), a 2D animation and video compositing software that allows users to create animated graphics and visual effects with a focus on simplicity and ease of use
|
||||
- [**Razer Cortex**](https://mysupport.razer.com/app/answers/detail/a_id/14146/~/open-source-software-for-razer-software), a gaming performance optimizer and system booster designed to enhance the gaming experience
|
||||
- [**Red Dead Redemption II**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), an open-world action-adventure game following an outlaw's story in the late 1800s, emphasizing deep storytelling and immersive gameplay
|
||||
- [**Snapchat**](https://www.snap.com/terms/license-android), a multimedia messaging and augmented reality app for communication and entertainment
|
||||
- [**Tactics Ogre: Reborn**](https://www.square-enix-games.com/en_US/documents/tactics-ogre-reborn-pc-installer-software-and-associated-plug-ins-disclosure), a tactical role-playing game featuring strategic battles and deep storytelling elements
|
||||
- [**Throne and Liberty**](https://www.amazon.com/gp/help/customer/display.html?nodeId=T7fLNw5oAevCMtJFPj&pop-up=1), an MMORPG that offers an expansive fantasy world with dynamic gameplay and immersive storytelling
|
||||
- [**Unity Vivox**](https://docs.unity3d.com/Packages/com.unity.services.vivox@15.1/license/Third%20Party%20Notices.html), a communication service that enables voice and text chat functionality in multiplayer games developed with Unity
|
||||
- [**Zool: Redimensioned**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/): a modern reimagining of the classic platformer featuring fast-paced gameplay and vibrant environments
|
||||
- [**Zool: Redimensioned**](https://www.mobygames.com/person/1195889/niels-lohmann/credits/), a modern reimagining of the classic platformer featuring fast-paced gameplay and vibrant environments
|
||||
- [**immersivetech**](https://immersitech.io/open-source-third-party-software/), a technology company focused on immersive experiences, providing tools and solutions for virtual and augmented reality applications
|
||||
|
||||
## Consumer Electronics
|
||||
|
||||
- [**Audinate**](https://www.audinate.com/legal/software-licensing/dante-av-h-open-source-licenses/): a provider of networked audio solutions specializing in Dante technology, which facilitates high-quality digital audio transport over IP networks
|
||||
- [**Audinate**](https://www.audinate.com/legal/software-licensing/dante-av-h-open-source-licenses/), a provider of networked audio solutions specializing in Dante technology, which facilitates high-quality digital audio transport over IP networks
|
||||
- [**Canon CanoScan LIDE**](https://carolburo.com/wp-content/uploads/2024/06/LiDE400_OnlineManual_Win_FR_V02.pdf), a series of flatbed scanners offering high-resolution image scanning for home and office use
|
||||
- [**Canon PIXMA Printers**](https://www.mediaexpert.pl/products/files/73/7338196/Instrukcja-obslugi-CANON-Pixma-TS7450i.pdf), a line of all-in-one inkjet printers known for high-quality printing and wireless connectivity
|
||||
- [**Cisco Webex Desk Camera**](https://www.cisco.com/c/dam/en_us/about/doing_business/open_source/docs/CiscoWebexDeskCamera-23-1622100417.pdf), a video camera designed for professional-quality video conferencing and remote collaboration
|
||||
- [**Philips Hue Personal Wireless Lighting**](http://2ak5ape.257.cz/): a smart lighting system for customizable and wireless home illumination
|
||||
- [**Philips Hue Personal Wireless Lighting**](http://2ak5ape.257.cz/), a smart lighting system for customizable and wireless home illumination
|
||||
- [**Ray-Ban Meta Smart glasses**](https://www.meta.com/de/en/legal/smart-glasses/third-party-notices-android/03/), a pair of smart glasses designed for capturing photos and videos with integrated connectivity and social features
|
||||
- [**Razer Synapse**](https://mysupport.razer.com/app/answers/detail/a_id/14146/~/open-source-software-for-razer-software), a unified configuration software enabling hardware customization for Razer devices
|
||||
- [**Siemens SINEMA Remote Connect**](https://cache.industry.siemens.com/dl/files/790/109793790/att_1054961/v2/OSS_SINEMA-RC_86.pdf), a remote connectivity solution for monitoring and managing industrial networks and devices securely
|
||||
- [**Sony PlayStation 4**](https://doc.dl.playstation.net/doc/ps4-oss/index.html), a gaming console developed by Sony that offers a wide range of games and multimedia entertainment features
|
||||
- [**Sony Virtual Webcam Driver for Remote Camera**](https://helpguide.sony.net/rc/vwd/v1/zh-cn/print.pdf), a software driver that enables the use of Sony cameras as virtual webcams for video conferencing and streaming
|
||||
@ -56,7 +62,7 @@ the result of an internet search. If you know further customers of the library,
|
||||
- [**Apple iOS and macOS**](https://www.apple.com/macos), a family of operating systems developed by Apple, including iOS for mobile devices and macOS for desktop computers
|
||||
- [**Google Fuchsia**](https://fuchsia.googlesource.com/third_party/json/), an open-source operating system developed by Google, designed to be secure, updatable, and adaptable across various devices
|
||||
- [**SerenityOS**](https://github.com/SerenityOS/serenity), an open-source operating system that aims to provide a simple and beautiful user experience with a focus on simplicity and elegance
|
||||
- [**Yocto**](http://ftp.emacinc.com/openembedded-sw/kirkstone-icop-5.15-kirkstone-6.0/archive-2024-10/pn8m-090t-ppc/licenses/nlohmann-json/): a Linux-based build system for creating custom operating systems and software distributions, tailored for embedded devices and IoT applications
|
||||
- [**Yocto**](http://ftp.emacinc.com/openembedded-sw/kirkstone-icop-5.15-kirkstone-6.0/archive-2024-10/pn8m-090t-ppc/licenses/nlohmann-json/), a Linux-based build system for creating custom operating systems and software distributions, tailored for embedded devices and IoT applications
|
||||
|
||||
## Development Tools and IDEs
|
||||
|
||||
@ -66,67 +72,72 @@ the result of an internet search. If you know further customers of the library,
|
||||
- [**CoderPad**](https://coderpad.io), a collaborative coding platform that enables real-time code interviews and assessments for developers; the library is included in every CoderPad instance and can be accessed with a simple `#include "json.hpp"`
|
||||
- [**Compiler Explorer**](https://godbolt.org), a web-based tool that allows users to write, compile, and visualize the assembly output of code in various programming languages; the library is readily available and accessible with the directive `#include <nlohmann/json.hpp>`.
|
||||
- [**GitHub CodeQL**](https://github.com/github/codeql), a code analysis tool used for identifying security vulnerabilities and bugs in software through semantic queries
|
||||
- [**Hex-Rays**](https://docs.hex-rays.com/user-guide/user-interface/licenses): a reverse engineering toolset for analyzing and decompiling binaries, primarily used for security research and vulnerability analysis
|
||||
- [**Hex-Rays**](https://docs.hex-rays.com/user-guide/user-interface/licenses), a reverse engineering toolset for analyzing and decompiling binaries, primarily used for security research and vulnerability analysis
|
||||
- [**ImHex**](https://github.com/WerWolv/ImHex), a hex editor designed for reverse engineering, providing advanced features for data analysis and manipulation
|
||||
- [**Intel GPA Framework**](https://intel.github.io/gpasdk-doc/src/licenses.html), a suite of cross-platform tools for capturing, analyzing, and optimizing graphics applications across different APIs
|
||||
- [**Meta Yoga**](https://github.com/facebook/yoga), a layout engine that facilitates flexible and efficient user interface design across multiple platforms
|
||||
- [**Intopix**](https://www.intopix.com/software-licensing), a provider of advanced image processing and compression solutions used in software development and AV workflows
|
||||
- [**MKVToolNix**](https://mkvtoolnix.download/doc/README.md), a set of tools for creating, editing, and inspecting MKV (Matroska) multimedia container files
|
||||
- [**Meta Yoga**](https://github.com/facebook/yoga), a layout engine that facilitates flexible and efficient user interface design across multiple platforms
|
||||
- [**NVIDIA Nsight Compute**](https://docs.nvidia.com/nsight-compute/2022.2/pdf/CopyrightAndLicenses.pdf), a performance analysis tool for CUDA applications that provides detailed insights into GPU performance metrics
|
||||
- [**Notepad++**](https://github.com/notepad-plus-plus/notepad-plus-plus), a free source code editor that supports various programming languages
|
||||
- [**OpenRGB**](https://gitlab.com/CalcProgrammer1/OpenRGB), an open source RGB lighting control that doesn't depend on manufacturer software
|
||||
- [**OpenTelemetry C++**](https://github.com/open-telemetry/opentelemetry-cpp): a library for collecting and exporting observability data in C++, enabling developers to implement distributed tracing and metrics in their application
|
||||
- [**OpenTelemetry C++**](https://github.com/open-telemetry/opentelemetry-cpp), a library for collecting and exporting observability data in C++, enabling developers to implement distributed tracing and metrics in their application
|
||||
- [**Qt Creator**](https://doc.qt.io/qtcreator/qtcreator-attribution-json-nlohmann.html), an IDE for developing applications using the Qt application framework
|
||||
- [**Scanbot SDK**](https://docs.scanbot.io/barcode-scanner-sdk/web/third-party-libraries/): a software development kit (SDK) that provides tools for integrating advanced document scanning and barcode scanning capabilities into applications
|
||||
- [**Scanbot SDK**](https://docs.scanbot.io/barcode-scanner-sdk/web/third-party-libraries/), a software development kit (SDK) that provides tools for integrating advanced document scanning and barcode scanning capabilities into applications
|
||||
|
||||
## Machine Learning and AI
|
||||
|
||||
- [**Apple Core ML Tools**](https://github.com/apple/coremltools), a set of tools for converting and configuring machine learning models for deployment in Apple's Core ML framework
|
||||
- [**Avular Mobile Robotics**](https://www.avular.com/licenses/nlohmann-json-3.9.1.txt): a platform for developing and deploying mobile robotics solutions
|
||||
- [**Avular Mobile Robotics**](https://www.avular.com/licenses/nlohmann-json-3.9.1.txt), a platform for developing and deploying mobile robotics solutions
|
||||
- [**Google gemma.cpp**](https://github.com/google/gemma.cpp), a lightweight C++ inference engine designed for running AI models from the Gemma family
|
||||
- [**llama.cpp**](https://github.com/ggerganov/llama.cpp), a C++ library designed for efficient inference of large language models (LLMs), enabling streamlined integration into applications
|
||||
- [**MLX**](https://github.com/ml-explore/mlx), an array framework for machine learning on Apple Silicon
|
||||
- [**Mozilla llamafile**](https://github.com/Mozilla-Ocho/llamafile), a tool designed for distributing and executing large language models (LLMs) efficiently using a single file format
|
||||
- [**NVIDIA ACE**](https://docs.nvidia.com/ace/latest/index.html), a suite of real-time AI solutions designed for the development of interactive avatars and digital human applications, enabling scalable and sophisticated user interactions
|
||||
- [**Peer**](https://support.peer.inc/hc/en-us/articles/17261335054235-Licenses): a platform offering personalized AI assistants for interactive learning and creative collaboration
|
||||
- [**stable-diffusion.cpp**](https://github.com/leejet/stable-diffusion.cpp): a C++ implementation of the Stable Diffusion image generation model
|
||||
- [**TanvasTouch**](https://tanvas.co/tanvastouch-sdk-third-party-acknowledgments): a software development kit (SDK) that enables developers to create tactile experiences on touchscreens, allowing users to feel textures and physical sensations in a digital environment
|
||||
- [**Peer**](https://support.peer.inc/hc/en-us/articles/17261335054235-Licenses), a platform offering personalized AI assistants for interactive learning and creative collaboration
|
||||
- [**stable-diffusion.cpp**](https://github.com/leejet/stable-diffusion.cpp), a C++ implementation of the Stable Diffusion image generation model
|
||||
- [**TanvasTouch**](https://tanvas.co/tanvastouch-sdk-third-party-acknowledgments), a software development kit (SDK) that enables developers to create tactile experiences on touchscreens, allowing users to feel textures and physical sensations in a digital environment
|
||||
- [**TensorFlow**](https://github.com/tensorflow/tensorflow), a machine learning framework that facilitates the development and training of models, supporting data serialization and efficient data exchange between components
|
||||
|
||||
## Scientific Research and Analysis
|
||||
|
||||
- [**BLACK**](https://www.black-sat.org/en/stable/installation/linux.html), a bounded linear temporal logic (LTL) satisfiability checker
|
||||
- [**CERN Atlas Athena**](https://gitlab.cern.ch/atlas/athena/-/blob/main/Control/PerformanceMonitoring/PerfMonComps/src/PerfMonMTSvc.h), a software framework used in the ATLAS experiment at the Large Hadron Collider (LHC) for performance monitoring
|
||||
- [**KAMERA**](https://github.com/Kitware/kamera): a platform for synchronized data collection and real-time deep learning to map marine species like polar bears and seals, aiding Arctic ecosystem research
|
||||
- [**KiCad**](https://gitlab.com/kicad/code/kicad/-/tree/master/thirdparty/nlohmann_json): a free and open-source software suite for electronic design automation
|
||||
- [**MeVisLab**](https://mevislabdownloads.mevis.de/docs/current/MeVis/ThirdParty/Documentation/Publish/ThirdPartyReference/index.html): a software framework for medical image processing and visualization.
|
||||
- [**OpenPMD API**](https://openpmd-api.readthedocs.io/en/0.8.0-alpha/backends/json.html): a versatile programming interface for accessing and managing scientific data, designed to facilitate the efficient storage, retrieval, and sharing of simulation data across various applications and platforms
|
||||
- [**ParaView**](https://github.com/Kitware/ParaView): an open-source tool for large-scale data visualization and analysis across various scientific domains
|
||||
- [**QGIS**](https://gitlab.b-data.ch/qgis/qgis/-/blob/backport-57658-to-release-3_34/external/nlohmann/json.hpp): a free and open-source geographic information system (GIS) application that allows users to create, edit, visualize, and analyze geospatial data across a variety of formats
|
||||
- [**VTK**](https://github.com/Kitware/VTK): a software library for 3D computer graphics, image processing, and visualization
|
||||
- [**VolView**](https://github.com/Kitware/VolView): a lightweight application for interactive visualization and analysis of 3D medical imaging data.
|
||||
- [**ICU**](https://github.com/unicode-org/icu), the International Components for Unicode, a mature library for software globalization and multilingual support
- [**KAMERA**](https://github.com/Kitware/kamera), a platform for synchronized data collection and real-time deep learning to map marine species like polar bears and seals, aiding Arctic ecosystem research
- [**KiCad**](https://gitlab.com/kicad/code/kicad/-/tree/master/thirdparty/nlohmann_json), a free and open-source software suite for electronic design automation
- [**Maple**](https://www.maplesoft.com/support/help/Maple/view.aspx?path=copyright), a symbolic and numeric computing environment for advanced mathematical modeling and analysis
- [**MeVisLab**](https://mevislabdownloads.mevis.de/docs/current/MeVis/ThirdParty/Documentation/Publish/ThirdPartyReference/index.html), a software framework for medical image processing and visualization.
- [**OpenPMD API**](https://openpmd-api.readthedocs.io/en/0.8.0-alpha/backends/json.html), a versatile programming interface for accessing and managing scientific data, designed to facilitate the efficient storage, retrieval, and sharing of simulation data across various applications and platforms
- [**ParaView**](https://github.com/Kitware/ParaView), an open-source tool for large-scale data visualization and analysis across various scientific domains
- [**QGIS**](https://gitlab.b-data.ch/qgis/qgis/-/blob/backport-57658-to-release-3_34/external/nlohmann/json.hpp), a free and open-source geographic information system (GIS) application that allows users to create, edit, visualize, and analyze geospatial data across a variety of formats
- [**VTK**](https://github.com/Kitware/VTK), a software library for 3D computer graphics, image processing, and visualization
- [**VolView**](https://github.com/Kitware/VolView), a lightweight application for interactive visualization and analysis of 3D medical imaging data.

## Business and Productivity Software

- [**ArcGIS PRO**](https://www.esri.com/content/dam/esrisites/en-us/media/legal/open-source-acknowledgements/arcgis-pro-2-8-attribution-report.html), a desktop geographic information system (GIS) application developed by Esri for mapping and spatial analysis
- [**Autodesk Desktop**](https://damassets.autodesk.net/content/dam/autodesk/www/Company/legal-notices-trademarks/autodesk-desktop-platform-components/internal-autodesk-components-web-page-2023.pdf), a software platform developed by Autodesk for creating and managing desktop applications and services
- [**Check Point**](https://www.checkpoint.com/about-us/copyright-and-trademarks/): a cybersecurity company specializing in threat prevention and network security solutions, offering a range of products designed to protect enterprises from cyber threats and ensure data integrity
- [**Check Point**](https://www.checkpoint.com/about-us/copyright-and-trademarks/), a cybersecurity company specializing in threat prevention and network security solutions, offering a range of products designed to protect enterprises from cyber threats and ensure data integrity
- [**Microsoft Office for Mac**](https://officecdnmac.microsoft.com/pr/legal/mac/OfficeforMacAttributions.html), a suite of productivity applications developed by Microsoft for macOS, including tools for word processing, spreadsheets, and presentations
- [**Microsoft Teams**](https://www.microsoft.com/microsoft-teams/), a team collaboration application offering workspace chat and video conferencing, file storage, and integration of proprietary and third-party applications and services
- [**Nexthink Infinity**](https://docs.nexthink.com/legal/services-terms/experience-open-source-software-licenses/infinity-2022.8-software-licenses): a digital employee experience management platform for monitoring and improving IT performance
- [**Sophos Connect Client**](https://docs.sophos.com/nsg/licenses/SophosConnect/SophosConnectAttribution.html): a secure VPN client from Sophos that allows remote users to connect to their corporate network, ensuring secure access to resources and data
- [**Stonebranch**](https://stonebranchdocs.atlassian.net/wiki/spaces/UA77/pages/799545647/Licenses+for+Third-Party+Libraries): a cloud-based cybersecurity solution that integrates backup, disaster recovery, and cybersecurity features to protect data and ensure business continuity for organizations
- [**Tablecruncher**](https://tablecruncher.com/): a data analysis tool that allows users to import, analyze, and visualize spreadsheet data, offering interactive features for better insights and decision-making
- [**Nexthink Infinity**](https://docs.nexthink.com/legal/services-terms/experience-open-source-software-licenses/infinity-2022.8-software-licenses), a digital employee experience management platform for monitoring and improving IT performance
- [**Sophos Connect Client**](https://docs.sophos.com/nsg/licenses/SophosConnect/SophosConnectAttribution.html), a secure VPN client from Sophos that allows remote users to connect to their corporate network, ensuring secure access to resources and data
- [**Stonebranch**](https://stonebranchdocs.atlassian.net/wiki/spaces/UA77/pages/799545647/Licenses+for+Third-Party+Libraries), a cloud-based cybersecurity solution that integrates backup, disaster recovery, and cybersecurity features to protect data and ensure business continuity for organizations
- [**Tablecruncher**](https://tablecruncher.com/), a data analysis tool that allows users to import, analyze, and visualize spreadsheet data, offering interactive features for better insights and decision-making
- [**magicplan**](https://help.magicplan.app/acknowledgments), a mobile application for creating floor plans and interior designs using augmented reality

## Databases and Big Data

- [**ADIOS2**](https://code.ornl.gov/ecpcitest/adios2/-/tree/pr4285_FFSUpstream/thirdparty/nlohmann_json?ref_type=heads): a data management framework designed for high-performance input and output operations
- [**Cribl Stream**](https://docs.cribl.io/stream/third-party-current-list/): a real-time data processing platform that enables organizations to collect, route, and transform observability data, enhancing visibility and insights into their systems
- [**ADIOS2**](https://code.ornl.gov/ecpcitest/adios2/-/tree/pr4285_FFSUpstream/thirdparty/nlohmann_json?ref_type=heads), a data management framework designed for high-performance input and output operations
- [**Cribl Stream**](https://docs.cribl.io/stream/third-party-current-list/), a real-time data processing platform that enables organizations to collect, route, and transform observability data, enhancing visibility and insights into their systems
- [**DB Browser for SQLite**](https://github.com/sqlitebrowser/sqlitebrowser), a visual open-source tool for creating, designing, and editing SQLite database files
- [**MySQL Connector/C++**](https://docs.oracle.com/cd/E17952_01/connector-cpp-9.1-license-com-en/license-opentelemetry-cpp-com.html), a C++ library for connecting and interacting with MySQL databases
- [**MySQL NDB Cluster**](https://downloads.mysql.com/docs/licenses/cluster-9.0-com-en.pdf), a distributed database system that provides high availability and scalability for MySQL databases
- [**MySQL Shell**](https://downloads.mysql.com/docs/licenses/mysql-shell-8.0-gpl-en.pdf), an advanced client and code editor for interacting with MySQL servers, supporting SQL, Python, and JavaScript
- [**PrestoDB**](https://github.com/prestodb/presto), a distributed SQL query engine designed for large-scale data analytics, originally developed by Facebook
- [**ROOT Data Analysis Framework**](https://root.cern/doc/v614/classnlohmann_1_1basic__json.html), an open-source data analysis framework widely used in high-energy physics and other fields for data processing and visualization
- [**WiredTiger**](https://github.com/wiredtiger/wiredtiger), a high-performance storage engine for databases, offering support for compression, concurrency, and checkpointing

## Simulation and Modeling

@ -134,30 +145,33 the result of an internet search. If you know further customers of the library,

- [**azul**](https://pure.tudelft.nl/ws/files/85338589/tgis.12673.pdf), a fast and efficient 3D city model viewer designed for visualizing urban environments and spatial data
- [**Blender**](https://projects.blender.org/blender/blender/search?q=nlohmann), a free and open-source 3D creation suite for modeling, animation, rendering, and more
- [**cpplot**](https://cpplot.readthedocs.io/en/latest/library_api/function_eigen_8h_1ac080eac0541014c5892a55e41bf785e6.html), a library for creating interactive graphs and charts in C++, which can be viewed in web browsers
- [**Foundry Nuke**](https://learn.foundry.com/nuke/content/misc/studio_third_party_libraries.html), a powerful node-based digital compositing and visual effects application used in film and television post-production
- [**GAMS**](https://www.gams.com/47/docs/THIRDPARTY.html), a high-performance mathematical modeling system for optimization and decision support
- [**Kitware SMTK**](https://github.com/Kitware/SMTK), a software toolkit for managing simulation models and workflows in scientific and engineering applications
- [**M-Star**](https://docs.mstarcfd.com/3_Licensing/thirdparty-licenses.html), a computational fluid dynamics software for simulating and analyzing fluid flow
- [**MapleSim CAD Toolbox**](https://www.maplesoft.com/support/help/MapleSim/view.aspx?path=CADToolbox/copyright), a software extension for MapleSim that integrates CAD models, allowing users to import, manipulate, and analyze 3D CAD data within the MapleSim environment for enhanced modeling and simulation
- [**NVIDIA Omniverse**](https://docs.omniverse.nvidia.com/composer/latest/common/product-licenses/usd-explorer/usd-explorer-2023.2.0-licenses-manifest.html), a platform for 3D content creation and collaboration that enables real-time simulations and interactive experiences across various industries
- [**Pixar Renderman**](https://rmanwiki-26.pixar.com/space/REN26/19662083/Legal+Notice), a photorealistic 3D rendering software developed by Pixar, widely used in the film industry for creating high-quality visual effects and animations
- [**ROS - Robot Operating System**](http://docs.ros.org/en/noetic/api/behaviortree_cpp/html/json_8hpp_source.html), a set of software libraries and tools that assist in developing robot applications
- [**UBS**](https://www.ubs.com/), a multinational financial services and banking company
- [**GAMS**](https://www.gams.com/47/docs/THIRDPARTY.html): a high-performance mathematical modeling system for optimization and decision support
- [**M-Star**](https://docs.mstarcfd.com/3_Licensing/thirdparty-licenses.html): a computational fluid dynamics software for simulating and analyzing fluid flow
- [**MapleSim CAD Toolbox**](https://www.maplesoft.com/support/help/MapleSim/view.aspx?path=CADToolbox/copyright): a software extension for MapleSim that integrates CAD models, allowing users to import, manipulate, and analyze 3D CAD data within the MapleSim environment for enhanced modeling and simulation
- [**Kitware SMTK**](https://github.com/Kitware/SMTK): a software toolkit for managing simulation models and workflows in scientific and engineering applications

## Enterprise and Cloud Applications

- [**Acronis Cyber Protect Cloud**](https://care.acronis.com/s/article/59533-Third-party-software-used-in-Acronis-Cyber-Protect-Cloud?language=en_US): an all-in-one data protection solution that combines backup, disaster recovery, and cybersecurity to safeguard business data from threats like ransomware
- [**Baereos**](https://gitlab.tiger-computing.co.uk/packages/bareos/-/blob/tiger/bullseye/third-party/CLI11/examples/json.cpp): a backup solution that provides data protection and recovery options for various environments, including physical and virtual systems
- [**Acronis Cyber Protect Cloud**](https://care.acronis.com/s/article/59533-Third-party-software-used-in-Acronis-Cyber-Protect-Cloud?language=en_US), an all-in-one data protection solution that combines backup, disaster recovery, and cybersecurity to safeguard business data from threats like ransomware
- [**Baereos**](https://gitlab.tiger-computing.co.uk/packages/bareos/-/blob/tiger/bullseye/third-party/CLI11/examples/json.cpp), a backup solution that provides data protection and recovery options for various environments, including physical and virtual systems
- [**Bitdefender Home Scanner**](https://www.bitdefender.de/site/Main/view/home-scanner-open-source.html), a tool from Bitdefender that scans devices for malware and security threats, providing a safeguard against potential online dangers
- [**Citrix Provisioning**](https://docs.citrix.com/en-us/provisioning/2203-ltsr/downloads/pvs-third-party-notices-2203.pdf): a solution that streamlines the delivery of virtual desktops and applications by allowing administrators to manage and provision resources efficiently across multiple environments
- [**Citrix Provisioning**](https://docs.citrix.com/en-us/provisioning/2203-ltsr/downloads/pvs-third-party-notices-2203.pdf), a solution that streamlines the delivery of virtual desktops and applications by allowing administrators to manage and provision resources efficiently across multiple environments
- [**Citrix Virtual Apps and Desktops**](https://docs.citrix.com/en-us/citrix-virtual-apps-desktops/2305/downloads/third-party-notices-apps-and-desktops.pdf), a solution from Citrix that delivers virtual apps and desktops
- [**Cyberarc**](https://docs.cyberark.com/Downloads/Legal/Privileged%20Session%20Manager%20for%20SSH%20Third-Party%20Notices.pdf): a security solution that specializes in privileged access management, enabling organizations to control and monitor access to critical systems and data, thereby enhancing overall cybersecurity posture
- [**Elster**](https://www.secunet.com/en/about-us/press/article/elstersecure-bietet-komfortablen-login-ohne-passwort-dank-secunet-protect4use): a digital platform developed by German tax authorities for secure and efficient electronic tax filing and management using secunet protect4use
- [**Egnyte Desktop**](https://helpdesk.egnyte.com/hc/en-us/articles/360007071732-Third-Party-Software-Acknowledgements): a secure cloud storage solution designed for businesses, enabling file sharing, collaboration, and data management across teams while ensuring compliance and data protection
- [**Cyberarc**](https://docs.cyberark.com/Downloads/Legal/Privileged%20Session%20Manager%20for%20SSH%20Third-Party%20Notices.pdf), a security solution that specializes in privileged access management, enabling organizations to control and monitor access to critical systems and data, thereby enhancing overall cybersecurity posture
- [**Egnyte Desktop**](https://helpdesk.egnyte.com/hc/en-us/articles/360007071732-Third-Party-Software-Acknowledgements), a secure cloud storage solution designed for businesses, enabling file sharing, collaboration, and data management across teams while ensuring compliance and data protection
- [**Elster**](https://www.secunet.com/en/about-us/press/article/elstersecure-bietet-komfortablen-login-ohne-passwort-dank-secunet-protect4use), a digital platform developed by German tax authorities for secure and efficient electronic tax filing and management using secunet protect4use
- [**Ethereum Solidity**](https://github.com/ethereum/solidity), a high-level, object-oriented programming language designed for implementing smart contracts on the Ethereum platform
- [**Inciga**](https://fossies.org/linux/icinga2/third-party/nlohmann_json/json.hpp): a monitoring tool for IT infrastructure, designed to provide insights into system performance and availability through customizable dashboards and alerts
- [**Intel Accelerator Management Daemon for VMware ESXi**](https://downloadmirror.intel.com/772507/THIRD-PARTY.txt): a management tool designed for monitoring and controlling Intel hardware accelerators within VMware ESXi environments, optimizing performance and resource allocation
- [**Inciga**](https://fossies.org/linux/icinga2/third-party/nlohmann_json/json.hpp), a monitoring tool for IT infrastructure, designed to provide insights into system performance and availability through customizable dashboards and alerts
- [**Intel Accelerator Management Daemon for VMware ESXi**](https://downloadmirror.intel.com/772507/THIRD-PARTY.txt), a management tool designed for monitoring and controlling Intel hardware accelerators within VMware ESXi environments, optimizing performance and resource allocation
- [**Juniper Identity Management Service**](https://www.juniper.net/documentation/us/en/software/jims/jims-guide/jims-guide.pdf)
- [**Microsoft Azure IoT SDK**](https://library.e.abb.com/public/2779c5f85f30484192eb3cb3f666a201/IP%20Gateway%20Open%20License%20Declaration_9AKK108467A4095_Rev_C.pdf), a collection of tools and libraries to help developers connect, build, and deploy Internet of Things (IoT) solutions on the Azure cloud platform
- [**Microsoft WinGet**](https://github.com/microsoft/winget-cli), a command-line utility included in the Windows Package Manager
- [**Pointr**](https://docs-dev.pointr.tech/docs/8.x/Developer%20Portal/Open%20Source%20Licenses/): a platform for indoor positioning and navigation solutions, offering tools and SDKs for developers to create location-based applications
- [**secunet protect4use**](https://www.secunet.com/en/about-us/press/article/elstersecure-bietet-komfortablen-login-ohne-passwort-dank-secunet-protect4use): a secure, passwordless multifactor authentication solution that transforms smartphones into digital keyrings, ensuring high security for online services and digital identities
- [**plexusAV**](https://www.sisme.com/media/10994/manual_plexusav-p-avn-4-form8244-c.pdf), a high-performance AV-over-IP transceiver device capable of video encoding and decoding using the IPMX standard
- [**Pointr**](https://docs-dev.pointr.tech/docs/8.x/Developer%20Portal/Open%20Source%20Licenses/), a platform for indoor positioning and navigation solutions, offering tools and SDKs for developers to create location-based applications
- [**secunet protect4use**](https://www.secunet.com/en/about-us/press/article/elstersecure-bietet-komfortablen-login-ohne-passwort-dank-secunet-protect4use), a secure, passwordless multifactor authentication solution that transforms smartphones into digital keyrings, ensuring high security for online services and digital identities
- [**Sencore MRD 7000**](https://www.foccusdigital.com/wp-content/uploads/2025/03/MRD-7000-Manual-8175V.pdf), a professional multi-channel receiver and decoder supporting UHD and HD stream decoding

@ -520,7 +520,7 @@ The order of object iterators cannot be compared, because JSON objects are unord

### json.exception.invalid_iterator.214

Cannot get value for iterator: Either the iterator belongs to a null value or it is an iterator to a primitive type (number, boolean, or string), but the iterator is different to `begin()`.

Cannot retrieve value from iterator: The iterator either refers to a null value, or it refers to a primitive type (number, boolean, or string), but does not match the iterator returned by `begin()`.

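A minimal snippet (not part of the original page) that can trigger this exception; it assumes the usual `nlohmann::json` alias:

```cpp
#include <iostream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main()
{
    json j = 17;            // a primitive (number) value
    auto it = j.begin();
    ++it;                   // the iterator no longer matches begin()

    try
    {
        std::cout << *it << '\n';   // dereferencing throws invalid_iterator.214
    }
    catch (const json::invalid_iterator& e)
    {
        std::cout << e.what() << '\n';
    }
}
```
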
!!! failure "Example message"
@ -67,7 +67,7 @@ The library supports **Unicode input** as follows:

- The strings stored in the library are UTF-8 encoded. When using the default string type (`std::string`), note that its length/size functions return the number of stored bytes rather than the number of characters or glyphs.
- When you store strings with different encodings in the library, calling [`dump()`](https://nlohmann.github.io/json/classnlohmann_1_1basic__json_a50ec80b02d0f3f51130d4abb5d1cfdc5.html#a50ec80b02d0f3f51130d4abb5d1cfdc5) may throw an exception unless `json::error_handler_t::replace` or `json::error_handler_t::ignore` are used as error handlers.

In most cases, the parser is right to complain, because the input is not UTF-8 encoded. This is especially true for Microsoft Windows where Latin-1 or ISO 8859-1 is often the standard encoding.

In most cases, the parser is right to complain, because the input is not UTF-8 encoded. This is especially true for Microsoft Windows, where Latin-1 or ISO 8859-1 is often the standard encoding.

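The following small example (not part of the original page) shows the effect of the error handlers mentioned above:

```cpp
#include <iostream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main()
{
    // "\xE4" is Latin-1 for 'ä' and therefore not valid UTF-8
    json j = "Sch\xE4rfe";

    try
    {
        std::cout << j.dump() << '\n';   // default (strict) handler: throws type_error.316
    }
    catch (const json::type_error& e)
    {
        std::cout << e.what() << '\n';
    }

    // replace the ill-formed byte with U+FFFD ...
    std::cout << j.dump(-1, ' ', false, json::error_handler_t::replace) << '\n';

    // ... or drop it silently
    std::cout << j.dump(-1, ' ', false, json::error_handler_t::ignore) << '\n';
}
```
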
### Wide string handling
@ -1071,7 +1071,7 @@ This release combines a lot of small fixes and improvements. The release is back

- Improved the performance of the serialization by avoiding the re-creation of a locale object.
- Fixed two MSVC warnings. Compiling the test suite with `/Wall` now only warns about non-inlined functions (C4710) and the deprecation of the constructor from input-stream (C4996).
- Some project internals:
- <img align="right" src="https://bestpractices.coreinfrastructure.org/assets/questions_page_badge-17b338c0e8528d695d8676e23f39f17ca2b89bb88176370803ee69aeebcb5be4.png"> The project has qualified for the [Core Infrastructure Initiative Best Practices Badge](https://bestpractices.coreinfrastructure.org/projects/289). While most requirements were already satisfied, some led to a more explicit documentation of quality-ensuring procedures. For instance, static analysis is now executed with every commit on the build server. Furthermore, the [contribution guidelines document](https://github.com/nlohmann/json/blob/develop/.github/CONTRIBUTING.md) how to communicate security issues privately.
- <img align="right" src="https://bestpractices.coreinfrastructure.org/assets/questions_page_badge-17b338c0e8528d695d8676e23f39f17ca2b89bb88176370803ee69aeebcb5be4.png"> The project has qualified for the [Core Infrastructure Initiative Best Practices Badge](https://bestpractices.coreinfrastructure.org/projects/289). While most requirements were already satisfied, some led to more explicit documentation of quality-ensuring procedures. For instance, static analysis is now executed with every commit on the build server. Furthermore, the [contribution guidelines document](https://github.com/nlohmann/json/blob/develop/.github/CONTRIBUTING.md) how to communicate security issues privately.
- The test suite has been overworked and split into several files to allow for faster compilation and analysis. To execute the test suite, simply execute `make check`.
- The continuous integration with [Travis](https://travis-ci.org/nlohmann/json) was extended with Clang versions 3.6.0 to 3.8.1 and now includes 18 different compiler/OS combinations.
- An 11-day run of [American fuzzy lop](http://lcamtuf.coredump.cx/afl/) checked 962 million inputs on the parser and found no issue.

Binary file not shown (image; size before: 997 KiB, after: 1.0 MiB).

@ -67,6 +67,7 @@ nav:

- features/binary_formats/ubjson.md
- features/binary_values.md
- features/comments.md
- features/trailing_commas.md
- Element Access:
- features/element_access/index.md
- features/element_access/unchecked_access.md

@ -1,8 +1,8 @@

wheel==0.45.1

mkdocs==1.6.1 # documentation framework
mkdocs-git-revision-date-localized-plugin==1.4.5 # plugin "git-revision-date-localized"
mkdocs-material==9.6.11 # theme for mkdocs
mkdocs-git-revision-date-localized-plugin==1.4.7 # plugin "git-revision-date-localized"
mkdocs-material==9.6.14 # theme for mkdocs
mkdocs-material-extensions==1.3.1 # extensions
mkdocs-minify-plugin==0.8.0 # plugin "minify"
mkdocs-redirects==1.2.2 # plugin "redirects"

@ -13,9 +13,6 @@
|
||||
#include <forward_list> // forward_list
|
||||
#include <iterator> // inserter, front_inserter, end
|
||||
#include <map> // map
|
||||
#ifdef JSON_HAS_CPP_17
|
||||
#include <optional> // optional
|
||||
#endif
|
||||
#include <string> // string
|
||||
#include <tuple> // tuple, make_tuple
|
||||
#include <type_traits> // is_arithmetic, is_same, is_enum, underlying_type, is_convertible
|
||||
@ -32,6 +29,15 @@
|
||||
#include <nlohmann/detail/string_concat.hpp>
|
||||
#include <nlohmann/detail/value_t.hpp>
|
||||
|
||||
// include after macro_scope.hpp
|
||||
#ifdef JSON_HAS_CPP_17
|
||||
#include <optional> // optional
|
||||
#endif
|
||||
|
||||
#if JSON_HAS_FILESYSTEM || JSON_HAS_EXPERIMENTAL_FILESYSTEM
|
||||
#include <string_view> // u8string_view
|
||||
#endif
|
||||
|
||||
NLOHMANN_JSON_NAMESPACE_BEGIN
|
||||
namespace detail
|
||||
{
|
||||
@ -47,7 +53,6 @@ inline void from_json(const BasicJsonType& j, typename std::nullptr_t& n)
|
||||
}
|
||||
|
||||
#ifdef JSON_HAS_CPP_17
|
||||
#ifndef JSON_USE_IMPLICIT_CONVERSIONS
|
||||
template<typename BasicJsonType, typename T>
|
||||
void from_json(const BasicJsonType& j, std::optional<T>& opt)
|
||||
{
|
||||
@ -60,8 +65,6 @@ void from_json(const BasicJsonType& j, std::optional<T>& opt)
|
||||
opt.emplace(j.template get<T>());
|
||||
}
|
||||
}
|
||||
|
||||
#endif // JSON_USE_IMPLICIT_CONVERSIONS
|
||||
#endif // JSON_HAS_CPP_17
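// Illustrative sketch (not part of this header): together with the matching
// to_json overload, std::optional<T> converts to and from JSON in C++17 builds:
//
//   nlohmann::json j = std::optional<int>{42};                         // j == 42
//   auto a = j.get<std::optional<int>>();                              // a == 42
//   auto b = nlohmann::json(nullptr).get<std::optional<int>>();        // b == std::nullopt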
|
||||
|
||||
// overloads for basic_json template parameters
|
||||
@ -395,7 +398,7 @@ inline void from_json(const BasicJsonType& j, ConstructibleObjectType& obj)
|
||||
}
|
||||
|
||||
// overload for arithmetic types, not chosen for basic_json template arguments
|
||||
// (BooleanType, etc..); note: Is it really necessary to provide explicit
|
||||
// (BooleanType, etc.); note: Is it really necessary to provide explicit
|
||||
// overloads for boolean_t etc. in case of a custom BooleanType which is not
|
||||
// an arithmetic type?
|
||||
template < typename BasicJsonType, typename ArithmeticType,
|
||||
@ -540,7 +543,10 @@ inline void from_json(const BasicJsonType& j, std_fs::path& p)
|
||||
JSON_THROW(type_error::create(302, concat("type must be string, but is ", j.type_name()), &j));
|
||||
}
|
||||
const auto& s = *j.template get_ptr<const typename BasicJsonType::string_t*>();
|
||||
#ifdef JSON_HAS_CPP_20
|
||||
// Checking for C++20 standard or later can be insufficient in case the
|
||||
// library support for char8_t is either incomplete or was disabled
|
||||
// altogether. Use the __cpp_lib_char8_t feature test instead.
|
||||
#if defined(__cpp_lib_char8_t) && (__cpp_lib_char8_t >= 201907L)
|
||||
p = std_fs::path(std::u8string_view(reinterpret_cast<const char8_t*>(s.data()), s.size()));
|
||||
#else
|
||||
p = std_fs::u8path(s); // accepts UTF-8 encoded std::string in C++17, deprecated in C++20
|
||||
|
@ -130,7 +130,7 @@ struct diyfp // f * 2^e
|
||||
// p_lo = p0_lo + (Q << 32)
|
||||
//
|
||||
// But in this particular case here, the full p_lo is not required.
|
||||
// Effectively we only need to add the highest bit in p_lo to p_hi (and
|
||||
// Effectively, we only need to add the highest bit in p_lo to p_hi (and
|
||||
// Q_hi + 1 does not overflow).
|
||||
|
||||
Q += std::uint64_t{1} << (64u - 32u - 1u); // round, ties up
|
||||
@ -220,7 +220,7 @@ boundaries compute_boundaries(FloatType value)
|
||||
// Compute the boundaries m- and m+ of the floating-point value
|
||||
// v = f * 2^e.
|
||||
//
|
||||
// Determine v- and v+, the floating-point predecessor and successor if v,
|
||||
// Determine v- and v+, the floating-point predecessor and successor of v,
|
||||
// respectively.
|
||||
//
|
||||
// v- = v - 2^e if f != 2^(p-1) or e == e_min (A)
|
||||
@ -375,7 +375,7 @@ inline cached_power get_cached_power_for_binary_exponent(int e)
|
||||
// (A smaller distance gamma-alpha would require a larger table.)
|
||||
|
||||
// NB:
|
||||
// Actually this function returns c, such that -60 <= e_c + e + 64 <= -34.
|
||||
// Actually, this function returns c, such that -60 <= e_c + e + 64 <= -34.
|
||||
|
||||
constexpr int kCachedPowersMinDecExp = -300;
|
||||
constexpr int kCachedPowersDecStep = 8;
|
||||
@ -687,8 +687,8 @@ inline void grisu2_digit_gen(char* buffer, int& length, int& decimal_exponent,
|
||||
|
||||
decimal_exponent += n;
|
||||
|
||||
// We may now just stop. But instead look if the buffer could be
|
||||
// decremented to bring V closer to w.
|
||||
// We may now just stop. But instead, it looks as if the buffer
|
||||
// could be decremented to bring V closer to w.
|
||||
//
|
||||
// pow10 = 10^n is now 1 ulp in the decimal representation V.
|
||||
// The rounding procedure works with diyfp's with an implicit
|
||||
@ -1095,7 +1095,7 @@ char* to_chars(char* first, const char* last, FloatType value)
|
||||
// Compute v = buffer * 10^decimal_exponent.
|
||||
// The decimal digits are stored in the buffer, which needs to be interpreted
|
||||
// as an unsigned decimal integer.
|
||||
// len is the length of the buffer, i.e. the number of decimal digits.
|
||||
// len is the length of the buffer, i.e., the number of decimal digits.
|
||||
int len = 0;
|
||||
int decimal_exponent = 0;
|
||||
dtoa_impl::grisu2(first, len, decimal_exponent, value);
|
||||
|
@ -15,7 +15,8 @@
|
||||
|
||||
#include <algorithm> // copy
|
||||
#include <iterator> // begin, end
|
||||
#include <string> // string
|
||||
#include <memory> // allocator_traits
|
||||
#include <string> // basic_string, char_traits
|
||||
#include <tuple> // tuple, get
|
||||
#include <type_traits> // is_same, is_constructible, is_floating_point, is_enum, underlying_type
|
||||
#include <utility> // move, forward, declval, pair
|
||||
@ -267,7 +268,7 @@ struct external_constructor<value_t::object>
|
||||
#ifdef JSON_HAS_CPP_17
|
||||
template<typename BasicJsonType, typename T,
|
||||
enable_if_t<std::is_constructible<BasicJsonType, T>::value, int> = 0>
|
||||
void to_json(BasicJsonType& j, const std::optional<T>& opt)
|
||||
void to_json(BasicJsonType& j, const std::optional<T>& opt) noexcept
|
||||
{
|
||||
if (opt.has_value())
|
||||
{
|
||||
@ -440,15 +441,21 @@ inline void to_json(BasicJsonType& j, const T& t)
|
||||
}
|
||||
|
||||
#if JSON_HAS_FILESYSTEM || JSON_HAS_EXPERIMENTAL_FILESYSTEM
|
||||
#if defined(__cpp_lib_char8_t)
|
||||
template<typename BasicJsonType, typename Tr, typename Allocator>
|
||||
inline void to_json(BasicJsonType& j, const std::basic_string<char8_t, Tr, Allocator>& s)
|
||||
{
|
||||
using OtherAllocator = typename std::allocator_traits<Allocator>::template rebind_alloc<char>;
|
||||
j = std::basic_string<char, std::char_traits<char>, OtherAllocator>(s.begin(), s.end(), s.get_allocator());
|
||||
}
|
||||
#endif
|
||||
|
||||
template<typename BasicJsonType>
|
||||
inline void to_json(BasicJsonType& j, const std_fs::path& p)
|
||||
{
|
||||
#ifdef JSON_HAS_CPP_20
|
||||
const std::u8string s = p.u8string();
|
||||
j = std::string(s.begin(), s.end());
|
||||
#else
|
||||
j = p.u8string(); // returns std::string in C++17
|
||||
#endif
|
||||
// Returns either a std::string or a std::u8string, depending on whether library
|
||||
// support for char8_t is enabled.
|
||||
j = p.u8string();
|
||||
}
|
||||
#endif
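// Illustrative sketch (not part of this header): std::filesystem::path values
// are stored as UTF-8 encoded JSON strings, e.g.
//
//   nlohmann::json j = std::filesystem::path{"logs/output.txt"};   // j == "logs/output.txt"
//   auto p = j.get<std::filesystem::path>();   // round-trips via the from_json overload shown earlier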
|
||||
|
||||
|
@ -30,7 +30,7 @@
|
||||
// emitted in every translation unit. This issue cannot be fixed with a
|
||||
// header-only library as there is no implementation file to move these
|
||||
// functions to. As a result, we suppress this warning here to avoid client
|
||||
// code to stumble over this. See https://github.com/nlohmann/json/issues/4087
|
||||
// code stumbling over this. See https://github.com/nlohmann/json/issues/4087
|
||||
// for a discussion.
|
||||
#if defined(__clang__)
|
||||
#pragma clang diagnostic push
|
||||
|
@ -53,7 +53,7 @@ enum class cbor_tag_handler_t
|
||||
|
||||
@note from https://stackoverflow.com/a/1001328/266378
|
||||
*/
|
||||
static inline bool little_endianness(int num = 1) noexcept
|
||||
inline bool little_endianness(int num = 1) noexcept
|
||||
{
|
||||
return *reinterpret_cast<char*>(&num) == 1;
|
||||
}
|
||||
@ -334,7 +334,7 @@ class binary_reader
|
||||
return get_number<std::uint64_t, true>(input_format_t::bson, value) && sax->number_unsigned(value);
|
||||
}
|
||||
|
||||
default: // anything else not supported (yet)
|
||||
default: // anything else is not supported (yet)
|
||||
{
|
||||
std::array<char, 3> cr{{}};
|
||||
static_cast<void>((std::snprintf)(cr.data(), cr.size(), "%.2hhX", static_cast<unsigned char>(element_type))); // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
|
||||
@ -731,7 +731,7 @@ class binary_reader
|
||||
case 0xD2:
|
||||
case 0xD3:
|
||||
case 0xD4:
|
||||
case 0xD8: // tagged item (1 bytes follow)
|
||||
case 0xD8: // tagged item (1 byte follows)
|
||||
case 0xD9: // tagged item (2 bytes follow)
|
||||
case 0xDA: // tagged item (4 bytes follow)
|
||||
case 0xDB: // tagged item (8 bytes follow)
|
||||
@ -783,7 +783,7 @@ class binary_reader
|
||||
case cbor_tag_handler_t::store:
|
||||
{
|
||||
binary_t b;
|
||||
// use binary subtype and store in binary container
|
||||
// use binary subtype and store in a binary container
|
||||
switch (current)
|
||||
{
|
||||
case 0xD8:
|
||||
@ -852,7 +852,7 @@ class binary_reader
|
||||
const auto byte1 = static_cast<unsigned char>(byte1_raw);
|
||||
const auto byte2 = static_cast<unsigned char>(byte2_raw);
|
||||
|
||||
// code from RFC 7049, Appendix D, Figure 3:
|
||||
// Code from RFC 7049, Appendix D, Figure 3:
|
||||
// As half-precision floating-point numbers were only added
|
||||
// to IEEE 754 in 2008, today's programming platforms often
|
||||
// still only have limited support for them. It is very
|
||||
@ -2159,7 +2159,7 @@ class binary_reader
|
||||
{
|
||||
break;
|
||||
}
|
||||
if (is_ndarray) // ndarray dimensional vector can only contain integers, and can not embed another array
|
||||
if (is_ndarray) // ndarray dimensional vector can only contain integers and cannot embed another array
|
||||
{
|
||||
return sax->parse_error(chars_read, get_token_string(), parse_error::create(113, chars_read, exception_message(input_format, "ndarray dimensional vector is not allowed", "size"), nullptr));
|
||||
}
|
||||
@ -2192,8 +2192,16 @@ class binary_reader
|
||||
result = 1;
|
||||
for (auto i : dim)
|
||||
{
|
||||
// Pre-multiplication overflow check: if i > 0 and result > SIZE_MAX/i, then result*i would overflow.
|
||||
// This check must happen before multiplication since overflow detection after the fact is unreliable
|
||||
// as modular arithmetic can produce any value, not just 0 or SIZE_MAX.
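// Hypothetical illustration: with a 64-bit std::size_t, dimensions
// {2^32 + 1, 2^32} multiply to 2^64 + 2^32, which wraps around to 2^32 -- a
// nonzero value that a result == 0 check alone would not catch.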
|
||||
if (JSON_HEDLEY_UNLIKELY(i > 0 && result > (std::numeric_limits<std::size_t>::max)() / i))
|
||||
{
|
||||
return sax->parse_error(chars_read, get_token_string(), out_of_range::create(408, exception_message(input_format, "excessive ndarray size caused overflow", "size"), nullptr));
|
||||
}
|
||||
result *= i;
|
||||
if (result == 0 || result == npos) // because dim elements shall not have zeros, result = 0 means overflow happened; it also can't be npos as it is used to initialize size in get_ubjson_size_type()
|
||||
// Additional post-multiplication check to catch any edge cases the pre-check might miss
|
||||
if (result == 0 || result == npos)
|
||||
{
|
||||
return sax->parse_error(chars_read, get_token_string(), out_of_range::create(408, exception_message(input_format, "excessive ndarray size caused overflow", "size"), nullptr));
|
||||
}
|
||||
@ -2409,7 +2417,7 @@ class binary_reader
|
||||
const auto byte1 = static_cast<unsigned char>(byte1_raw);
|
||||
const auto byte2 = static_cast<unsigned char>(byte2_raw);
|
||||
|
||||
// code from RFC 7049, Appendix D, Figure 3:
|
||||
// Code from RFC 7049, Appendix D, Figure 3:
|
||||
// As half-precision floating-point numbers were only added
|
||||
// to IEEE 754 in 2008, today's programming platforms often
|
||||
// still only have limited support for them. It is very
|
||||
@ -2697,7 +2705,7 @@ class binary_reader
|
||||
|
||||
bool get_ubjson_high_precision_number()
|
||||
{
|
||||
// get size of following number string
|
||||
// get the size of the following number string
|
||||
std::size_t size{};
|
||||
bool no_ndarray = true;
|
||||
auto res = get_ubjson_size_value(size, no_ndarray);
|
||||
@ -2795,7 +2803,7 @@ class binary_reader
|
||||
chars_read += new_chars_read;
|
||||
if (JSON_HEDLEY_UNLIKELY(new_chars_read < sizeof(T)))
|
||||
{
|
||||
// in case of failure, advance position by 1 to report failing location
|
||||
// in case of failure, advance position by 1 to report the failing location
|
||||
++chars_read;
|
||||
sax->parse_error(chars_read, "<end of file>", parse_error::create(110, chars_read, exception_message(format, "unexpected end of input", context), nullptr));
|
||||
return false;
|
||||
@ -2826,17 +2834,22 @@ class binary_reader
|
||||
{
|
||||
return;
|
||||
}
|
||||
if constexpr(std::is_integral_v<NumberType>)
|
||||
else if constexpr(std::is_integral_v<NumberType>)
|
||||
{
|
||||
number = std::byteswap(number);
|
||||
return;
|
||||
}
|
||||
#endif
|
||||
auto* ptr = reinterpret_cast<std::uint8_t*>(&number);
|
||||
for (std::size_t i = 0; i < sz / 2; ++i)
|
||||
else
|
||||
{
|
||||
std::swap(ptr[i], ptr[sz - i - 1]);
|
||||
#endif
|
||||
auto* ptr = reinterpret_cast<std::uint8_t*>(&number);
|
||||
for (std::size_t i = 0; i < sz / 2; ++i)
|
||||
{
|
||||
std::swap(ptr[i], ptr[sz - i - 1]);
|
||||
}
|
||||
#ifdef __cpp_lib_byteswap
|
||||
}
|
||||
#endif
|
||||
}
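// Illustration (hypothetical values): on a little-endian machine, the
// big-endian CBOR bytes 0x12 0x34 read into a std::uint16_t yield 0x3412;
// the swap above restores the intended value 0x1234.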
|
||||
|
||||
/*
|
||||
@ -2931,7 +2944,7 @@ class binary_reader
|
||||
success = false;
|
||||
break;
|
||||
}
|
||||
result.push_back(static_cast<std::uint8_t>(current));
|
||||
result.push_back(static_cast<typename binary_t::value_type>(current));
|
||||
}
|
||||
return success;
|
||||
}
|
||||
|
@ -108,7 +108,7 @@ class input_stream_adapter
|
||||
: is(&i), sb(i.rdbuf())
|
||||
{}
|
||||
|
||||
// delete because of pointer members
|
||||
// deleted because of pointer members
|
||||
input_stream_adapter(const input_stream_adapter&) = delete;
|
||||
input_stream_adapter& operator=(input_stream_adapter&) = delete;
|
||||
input_stream_adapter& operator=(input_stream_adapter&&) = delete;
|
||||
@ -122,7 +122,7 @@ class input_stream_adapter
|
||||
|
||||
// std::istream/std::streambuf use std::char_traits<char>::to_int_type, to
|
||||
// ensure that std::char_traits<char>::eof() and the character 0xFF do not
|
||||
// end up as the same value, e.g. 0xFFFFFFFF.
|
||||
// end up as the same value, e.g., 0xFFFFFFFF.
|
||||
std::char_traits<char>::int_type get_character()
|
||||
{
|
||||
auto res = sb->sbumpc();
|
||||
@ -344,7 +344,7 @@ class wide_string_input_adapter
|
||||
|
||||
typename std::char_traits<char>::int_type get_character() noexcept
|
||||
{
|
||||
// check if buffer needs to be filled
|
||||
// check if the buffer needs to be filled
|
||||
if (utf8_bytes_index == utf8_bytes_filled)
|
||||
{
|
||||
fill_buffer<sizeof(WideCharType)>();
|
||||
|
@ -260,7 +260,7 @@ class json_sax_dom_parser
|
||||
JSON_ASSERT(!ref_stack.empty());
|
||||
JSON_ASSERT(ref_stack.back()->is_object());
|
||||
|
||||
// add null at given key and store the reference for later
|
||||
// add null at the given key and store the reference for later
|
||||
object_element = &(ref_stack.back()->m_data.m_value.object->operator[](val));
|
||||
return true;
|
||||
}
|
||||
@ -576,11 +576,11 @@ class json_sax_dom_callback_parser
|
||||
{
|
||||
BasicJsonType k = BasicJsonType(val);
|
||||
|
||||
// check callback for key
|
||||
// check callback for the key
|
||||
const bool keep = callback(static_cast<int>(ref_stack.size()), parse_event_t::key, k);
|
||||
key_keep_stack.push_back(keep);
|
||||
|
||||
// add discarded value at given key and store the reference for later
|
||||
// add discarded value at the given key and store the reference for later
|
||||
if (keep && ref_stack.back())
|
||||
{
|
||||
object_element = &(ref_stack.back()->m_data.m_value.object->operator[](val) = discarded);
|
||||
|
@ -127,7 +127,7 @@ class lexer : public lexer_base<BasicJsonType>
|
||||
, decimal_point_char(static_cast<char_int_type>(get_decimal_point()))
|
||||
{}
|
||||
|
||||
// delete because of pointer members
|
||||
// deleted because of pointer members
|
||||
lexer(const lexer&) = delete;
|
||||
lexer(lexer&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
|
||||
lexer& operator=(lexer&) = delete;
|
||||
@ -262,10 +262,10 @@ class lexer : public lexer_base<BasicJsonType>
|
||||
|
||||
while (true)
|
||||
{
|
||||
// get next character
|
||||
// get the next character
|
||||
switch (get())
|
||||
{
|
||||
// end of file while parsing string
|
||||
// end of file while parsing the string
|
||||
case char_traits<char_type>::eof():
|
||||
{
|
||||
error_message = "invalid string: missing closing quote";
|
||||
@ -351,7 +351,7 @@ class lexer : public lexer_base<BasicJsonType>
|
||||
(static_cast<unsigned int>(codepoint1) << 10u)
|
||||
// low surrogate occupies the least significant 15 bits
|
||||
+ static_cast<unsigned int>(codepoint2)
|
||||
// there is still the 0xD800, 0xDC00 and 0x10000 noise
|
||||
// there is still the 0xD800, 0xDC00, and 0x10000 noise
|
||||
// in the result, so we have to subtract with:
|
||||
// (0xD800 << 10) + DC00 - 0x10000 = 0x35FDC00
|
||||
- 0x35FDC00u);
|
||||
@ -377,7 +377,7 @@ class lexer : public lexer_base<BasicJsonType>
|
||||
}
|
||||
}
|
||||
|
||||
// result of the above calculation yields a proper codepoint
|
||||
// the result of the above calculation yields a proper codepoint
|
||||
JSON_ASSERT(0x00 <= codepoint && codepoint <= 0x10FFFF);
|
||||
|
||||
// translate codepoint into bytes
|
||||
@ -828,7 +828,7 @@ class lexer : public lexer_base<BasicJsonType>
|
||||
break;
|
||||
}
|
||||
|
||||
// remaining bytes (80..C1 and F5..FF) are ill-formed
|
||||
// the remaining bytes (80..C1 and F5..FF) are ill-formed
|
||||
default:
|
||||
{
|
||||
error_message = "invalid string: ill-formed UTF-8 byte";
|
||||
@ -973,7 +973,7 @@ class lexer : public lexer_base<BasicJsonType>
|
||||
reset();
|
||||
|
||||
// the type of the parsed number; initially set to unsigned; will be
|
||||
// changed if minus sign, decimal point or exponent is read
|
||||
// changed if minus sign, decimal point, or exponent is read
|
||||
token_type number_type = token_type::value_unsigned;
|
||||
|
||||
// state (init): we just found out we need to scan a number
|
||||
@ -1345,7 +1345,7 @@ scan_number_done:
|
||||
|
||||
if (next_unget)
|
||||
{
|
||||
// just reset the next_unget variable and work with current
|
||||
// only reset the next_unget variable and work with current
|
||||
next_unget = false;
|
||||
}
|
||||
else
|
||||
@ -1524,7 +1524,7 @@ scan_number_done:
|
||||
return token_type::parse_error;
|
||||
}
|
||||
|
||||
// read next character and ignore whitespace
|
||||
// read the next character and ignore whitespace
|
||||
skip_whitespace();
|
||||
|
||||
// ignore comments
|
||||
|
@ -71,10 +71,12 @@ class parser
|
||||
explicit parser(InputAdapterType&& adapter,
|
||||
parser_callback_t<BasicJsonType> cb = nullptr,
|
||||
const bool allow_exceptions_ = true,
|
||||
const bool skip_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas_ = false)
|
||||
: callback(std::move(cb))
|
||||
, m_lexer(std::move(adapter), skip_comments)
|
||||
, m_lexer(std::move(adapter), ignore_comments)
|
||||
, allow_exceptions(allow_exceptions_)
|
||||
, ignore_trailing_commas(ignore_trailing_commas_)
|
||||
{
|
||||
// read first token
|
||||
get_token();
|
||||
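// Illustrative usage sketch (not part of this class; it assumes the public
// json::parse overload that forwards these flags):
//
//   auto j = nlohmann::json::parse(R"({"a": 1, "b": 2,})",
//                                  /* callback */ nullptr,
//                                  /* allow_exceptions */ true,
//                                  /* ignore_comments */ false,
//                                  /* ignore_trailing_commas */ true);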
@ -106,7 +108,7 @@ class parser
|
||||
exception_message(token_type::end_of_input, "value"), nullptr));
|
||||
}
|
||||
|
||||
// in case of an error, return discarded value
|
||||
// in case of an error, return a discarded value
|
||||
if (sdp.is_errored())
|
||||
{
|
||||
result = value_t::discarded;
|
||||
@ -133,7 +135,7 @@ class parser
|
||||
parse_error::create(101, m_lexer.get_position(), exception_message(token_type::end_of_input, "value"), nullptr));
|
||||
}
|
||||
|
||||
// in case of an error, return discarded value
|
||||
// in case of an error, return a discarded value
|
||||
if (sdp.is_errored())
|
||||
{
|
||||
result = value_t::discarded;
|
||||
@ -336,7 +338,7 @@ class parser
|
||||
|
||||
case token_type::parse_error:
|
||||
{
|
||||
// using "uninitialized" to avoid "expected" message
|
||||
// using "uninitialized" to avoid an "expected" message
|
||||
return sax->parse_error(m_lexer.get_position(),
|
||||
m_lexer.get_token_string(),
|
||||
parse_error::create(101, m_lexer.get_position(), exception_message(token_type::uninitialized, "value"), nullptr));
|
||||
@ -384,11 +386,17 @@ class parser
|
||||
if (states.back()) // array
|
||||
{
|
||||
// comma -> next value
|
||||
// or end of array (ignore_trailing_commas = true)
|
||||
if (get_token() == token_type::value_separator)
|
||||
{
|
||||
// parse a new value
|
||||
get_token();
|
||||
continue;
|
||||
|
||||
// if ignore_trailing_commas and last_token is ], we can continue to "closing ]"
|
||||
if (!(ignore_trailing_commas && last_token == token_type::end_array))
|
||||
{
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
// closing ]
|
||||
@ -417,32 +425,39 @@ class parser
|
||||
// states.back() is false -> object
|
||||
|
||||
// comma -> next value
|
||||
// or end of object (ignore_trailing_commas = true)
|
||||
if (get_token() == token_type::value_separator)
|
||||
{
|
||||
// parse key
|
||||
if (JSON_HEDLEY_UNLIKELY(get_token() != token_type::value_string))
|
||||
{
|
||||
return sax->parse_error(m_lexer.get_position(),
|
||||
m_lexer.get_token_string(),
|
||||
parse_error::create(101, m_lexer.get_position(), exception_message(token_type::value_string, "object key"), nullptr));
|
||||
}
|
||||
|
||||
if (JSON_HEDLEY_UNLIKELY(!sax->key(m_lexer.get_string())))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
// parse separator (:)
|
||||
if (JSON_HEDLEY_UNLIKELY(get_token() != token_type::name_separator))
|
||||
{
|
||||
return sax->parse_error(m_lexer.get_position(),
|
||||
m_lexer.get_token_string(),
|
||||
parse_error::create(101, m_lexer.get_position(), exception_message(token_type::name_separator, "object separator"), nullptr));
|
||||
}
|
||||
|
||||
// parse values
|
||||
get_token();
|
||||
continue;
|
||||
|
||||
// if ignore_trailing_commas and last_token is }, we can continue to "closing }"
|
||||
if (!(ignore_trailing_commas && last_token == token_type::end_object))
|
||||
{
|
||||
// parse key
|
||||
if (JSON_HEDLEY_UNLIKELY(last_token != token_type::value_string))
|
||||
{
|
||||
return sax->parse_error(m_lexer.get_position(),
|
||||
m_lexer.get_token_string(),
|
||||
parse_error::create(101, m_lexer.get_position(), exception_message(token_type::value_string, "object key"), nullptr));
|
||||
}
|
||||
|
||||
if (JSON_HEDLEY_UNLIKELY(!sax->key(m_lexer.get_string())))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
// parse separator (:)
|
||||
if (JSON_HEDLEY_UNLIKELY(get_token() != token_type::name_separator))
|
||||
{
|
||||
return sax->parse_error(m_lexer.get_position(),
|
||||
m_lexer.get_token_string(),
|
||||
parse_error::create(101, m_lexer.get_position(), exception_message(token_type::name_separator, "object separator"), nullptr));
|
||||
}
|
||||
|
||||
// parse values
|
||||
get_token();
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
// closing }
|
||||
@ -513,6 +528,8 @@ class parser
|
||||
lexer_t m_lexer;
|
||||
/// whether to throw exceptions in case of errors
|
||||
const bool allow_exceptions = true;
|
||||
/// whether trailing commas in objects and arrays should be ignored (true) or signaled as errors (false)
|
||||
const bool ignore_trailing_commas = false;
|
||||
};
|
||||
|
||||
} // namespace detail
|
||||
|
@ -23,7 +23,7 @@ NLOHMANN_JSON_NAMESPACE_BEGIN
|
||||
namespace detail
|
||||
{
|
||||
|
||||
// forward declare, to be able to friend it later on
|
||||
// forward declare to be able to friend it later on
|
||||
template<typename IteratorType> class iteration_proxy;
|
||||
template<typename IteratorType> class iteration_proxy_value;
|
||||
|
||||
|
@ -21,9 +21,9 @@ namespace detail
|
||||
@brief an iterator for primitive JSON types
|
||||
|
||||
This class models an iterator for primitive JSON types (boolean, number,
|
||||
string). It's only purpose is to allow the iterator/const_iterator classes
|
||||
string). Its only purpose is to allow the iterator/const_iterator classes
|
||||
to "iterate" over primitive values. Internally, the iterator is modeled by
|
||||
a `difference_type` variable. Value begin_value (`0`) models the begin,
|
||||
a `difference_type` variable. Value begin_value (`0`) models the begin and
|
||||
end_value (`1`) models past the end.
|
||||
*/
|
||||
class primitive_iterator_t
|
||||
|
@ -285,7 +285,7 @@ class json_pointer
|
||||
{
|
||||
if (reference_token == "0")
|
||||
{
|
||||
// start a new array if reference token is 0
|
||||
// start a new array if the reference token is 0
|
||||
result = &result->operator[](0);
|
||||
}
|
||||
else
|
||||
@ -314,7 +314,7 @@ class json_pointer
|
||||
The following code is only reached if there exists a reference
|
||||
token _and_ the current value is primitive. In this case, we have
|
||||
an error situation, because primitive values may only occur as
|
||||
single value; that is, with an empty list of reference tokens.
|
||||
a single value; that is, with an empty list of reference tokens.
|
||||
*/
|
||||
case detail::value_t::string:
|
||||
case detail::value_t::boolean:
|
||||
@ -358,7 +358,7 @@ class json_pointer
|
||||
// convert null values to arrays or objects before continuing
|
||||
if (ptr->is_null())
|
||||
{
|
||||
// check if reference token is a number
|
||||
// check if the reference token is a number
|
||||
const bool nums =
|
||||
std::all_of(reference_token.begin(), reference_token.end(),
|
||||
[](const unsigned char x)
|
||||
@ -366,7 +366,7 @@ class json_pointer
|
||||
return std::isdigit(x);
|
||||
});
|
||||
|
||||
// change value to array for numbers or "-" or to object otherwise
|
||||
// change value to an array for numbers or "-" or to object otherwise
|
||||
*ptr = (nums || reference_token == "-")
|
||||
? detail::value_t::array
|
||||
: detail::value_t::object;
|
||||
@ -609,7 +609,7 @@ class json_pointer
|
||||
{
|
||||
if (JSON_HEDLEY_UNLIKELY(!('1' <= reference_token[0] && reference_token[0] <= '9')))
|
||||
{
|
||||
// first char should be between '1' and '9'
|
||||
// the first char should be between '1' and '9'
|
||||
return false;
|
||||
}
|
||||
for (std::size_t i = 1; i < reference_token.size(); i++)
|
||||
@ -673,7 +673,7 @@ class json_pointer
|
||||
return result;
|
||||
}
|
||||
|
||||
// check if nonempty reference string begins with slash
|
||||
// check if a nonempty reference string begins with slash
|
||||
if (JSON_HEDLEY_UNLIKELY(reference_string[0] != '/'))
|
||||
{
|
||||
JSON_THROW(detail::parse_error::create(107, 1, detail::concat("JSON pointer must be empty or begin with '/' - was: '", reference_string, "'"), nullptr));
|
||||
@ -747,7 +747,7 @@ class json_pointer
|
||||
}
|
||||
else
|
||||
{
|
||||
// iterate array and use index as reference string
|
||||
// iterate array and use index as a reference string
|
||||
for (std::size_t i = 0; i < value.m_data.m_value.array->size(); ++i)
|
||||
{
|
||||
flatten(detail::concat<string_t>(reference_string, '/', std::to_string(i)),
|
||||
@ -785,7 +785,7 @@ class json_pointer
|
||||
case detail::value_t::discarded:
|
||||
default:
|
||||
{
|
||||
// add primitive value with its reference string
|
||||
// add a primitive value with its reference string
|
||||
result[reference_string] = value;
|
||||
break;
|
||||
}
|
||||
@ -821,17 +821,17 @@ class json_pointer
|
||||
JSON_THROW(detail::type_error::create(315, "values in object must be primitive", &element.second));
|
||||
}
|
||||
|
||||
// assign value to reference pointed to by JSON pointer; Note that if
|
||||
// the JSON pointer is "" (i.e., points to the whole value), function
|
||||
// get_and_create returns a reference to result itself. An assignment
|
||||
// will then create a primitive value.
|
||||
// Assign the value to the reference pointed to by JSON pointer. Note
|
||||
// that if the JSON pointer is "" (i.e., points to the whole value),
|
||||
// function get_and_create returns a reference to the result itself.
|
||||
// An assignment will then create a primitive value.
|
||||
json_pointer(element.first).get_and_create(result) = element.second;
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
// can't use conversion operator because of ambiguity
|
||||
// can't use the conversion operator because of ambiguity
|
||||
json_pointer<string_t> convert() const&
|
||||
{
|
||||
json_pointer<string_t> result;
|
||||
@ -926,7 +926,7 @@ class json_pointer
|
||||
};
|
||||
|
||||
#if !JSON_HAS_THREE_WAY_COMPARISON
|
||||
// functions cannot be defined inside class due to ODR violations
|
||||
// functions cannot be defined inside the class due to ODR violations
|
||||
template<typename RefStringTypeLhs, typename RefStringTypeRhs>
|
||||
inline bool operator==(const json_pointer<RefStringTypeLhs>& lhs,
|
||||
const json_pointer<RefStringTypeRhs>& rhs) noexcept
|
||||
|
@ -31,9 +31,15 @@
|
||||
#endif
|
||||
|
||||
// C++ language standard detection
|
||||
// if the user manually specified the used c++ version this is skipped
|
||||
#if !defined(JSON_HAS_CPP_23) && !defined(JSON_HAS_CPP_20) && !defined(JSON_HAS_CPP_17) && !defined(JSON_HAS_CPP_14) && !defined(JSON_HAS_CPP_11)
|
||||
#if (defined(__cplusplus) && __cplusplus > 202002L) || (defined(_MSVC_LANG) && _MSVC_LANG > 202002L)
|
||||
// if the user manually specified the used C++ version, this is skipped
|
||||
#if !defined(JSON_HAS_CPP_26) && !defined(JSON_HAS_CPP_23) && !defined(JSON_HAS_CPP_20) && !defined(JSON_HAS_CPP_17) && !defined(JSON_HAS_CPP_14) && !defined(JSON_HAS_CPP_11)
|
||||
#if (defined(__cplusplus) && __cplusplus > 202302L) || (defined(_MSVC_LANG) && _MSVC_LANG > 202302L)
|
||||
#define JSON_HAS_CPP_26
|
||||
#define JSON_HAS_CPP_23
|
||||
#define JSON_HAS_CPP_20
|
||||
#define JSON_HAS_CPP_17
|
||||
#define JSON_HAS_CPP_14
|
||||
#elif (defined(__cplusplus) && __cplusplus > 202002L) || (defined(_MSVC_LANG) && _MSVC_LANG > 202002L)
|
||||
#define JSON_HAS_CPP_23
|
||||
#define JSON_HAS_CPP_20
|
||||
#define JSON_HAS_CPP_17
|
||||
@ -128,7 +134,7 @@
|
||||
#endif
|
||||
|
||||
#ifndef JSON_HAS_RANGES
|
||||
// ranges header shipping in GCC 11.1.0 (released 2021-04-27) has syntax error
|
||||
// ranges header shipping in GCC 11.1.0 (released 2021-04-27) has a syntax error
|
||||
#if defined(__GLIBCXX__) && __GLIBCXX__ == 20210427
|
||||
#define JSON_HAS_RANGES 0
|
||||
#elif defined(__cpp_lib_ranges)
|
||||
@ -205,7 +211,7 @@
|
||||
#define JSON_ASSERT(x) assert(x)
|
||||
#endif
|
||||
|
||||
// allow to access some private functions (needed by the test suite)
|
||||
// allow accessing some private functions (needed by the test suite)
|
||||
#if defined(JSON_TESTS_PRIVATE)
|
||||
#define JSON_PRIVATE_UNLESS_TESTED public
|
||||
#else
|
||||
|
@ -35,6 +35,7 @@
|
||||
#undef JSON_HAS_CPP_17
|
||||
#undef JSON_HAS_CPP_20
|
||||
#undef JSON_HAS_CPP_23
|
||||
#undef JSON_HAS_CPP_26
|
||||
#undef JSON_HAS_FILESYSTEM
|
||||
#undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
|
||||
#undef JSON_HAS_THREE_WAY_COMPARISON
|
||||
|
@ -13,7 +13,9 @@
|
||||
#include <tuple> // tuple
|
||||
#include <type_traits> // false_type, is_constructible, is_integral, is_same, true_type
|
||||
#include <utility> // declval
|
||||
|
||||
#if defined(__cpp_lib_byte) && __cpp_lib_byte >= 201603L
|
||||
#include <cstddef> // byte
|
||||
#endif
|
||||
#include <nlohmann/detail/iterators/iterator_traits.hpp>
|
||||
#include <nlohmann/detail/macro_scope.hpp>
|
||||
#include <nlohmann/detail/meta/call_std/begin.hpp>
|
||||
@ -40,12 +42,12 @@ namespace detail
|
||||
|
||||
// Note to maintainers:
|
||||
//
|
||||
// Every trait in this file expects a non CV-qualified type.
|
||||
// Every trait in this file expects a non-CV-qualified type.
|
||||
// The only exceptions are in the 'aliases for detected' section
|
||||
// (i.e. those of the form: decltype(T::member_function(std::declval<T>())))
|
||||
// (i.e., those of the form: decltype(T::member_function(std::declval<T>())))
|
||||
//
|
||||
// In this case, T has to be properly CV-qualified to constrain the function arguments
|
||||
// (e.g. to_json(BasicJsonType&, const T&))
|
||||
// (e.g., to_json(BasicJsonType&, const T&))
|
||||
|
||||
template<typename> struct is_basic_json : std::false_type {};
|
||||
|
||||
@ -53,7 +55,7 @@ NLOHMANN_BASIC_JSON_TPL_DECLARATION
|
||||
struct is_basic_json<NLOHMANN_BASIC_JSON_TPL> : std::true_type {};
|
||||
|
||||
// used by exceptions create() member functions
|
||||
// true_type for pointer to possibly cv-qualified basic_json or std::nullptr_t
|
||||
// true_type for the pointer to possibly cv-qualified basic_json or std::nullptr_t
|
||||
// false_type otherwise
|
||||
template<typename BasicJsonContext>
|
||||
struct is_basic_json_context :
|
||||
@ -239,6 +241,30 @@ struct char_traits<signed char> : std::char_traits<char>
|
||||
}
|
||||
};
|
||||
|
||||
#if defined(__cpp_lib_byte) && __cpp_lib_byte >= 201603L
|
||||
template<>
|
||||
struct char_traits<std::byte> : std::char_traits<char>
|
||||
{
|
||||
using char_type = std::byte;
|
||||
using int_type = uint64_t;
|
||||
|
||||
static int_type to_int_type(char_type c) noexcept
|
||||
{
|
||||
return static_cast<int_type>(std::to_integer<unsigned char>(c));
|
||||
}
|
||||
|
||||
static char_type to_char_type(int_type i) noexcept
|
||||
{
|
||||
return std::byte(static_cast<unsigned char>(i));
|
||||
}
|
||||
|
||||
static constexpr int_type eof() noexcept
|
||||
{
|
||||
return static_cast<int_type>(std::char_traits<char>::eof());
|
||||
}
|
||||
};
|
||||
#endif
|
||||
|
||||
///////////////////
|
||||
// is_ functions //
|
||||
///////////////////
|
||||
@ -255,7 +281,7 @@ template<class B> struct negation : std::integral_constant < bool, !B::value > {
|
||||
|
||||
// Reimplementation of is_constructible and is_default_constructible, due to them being broken for
|
||||
// std::pair and std::tuple until LWG 2367 fix (see https://cplusplus.github.io/LWG/lwg-defects.html#2367).
|
||||
// This causes compile errors in e.g. clang 3.5 or gcc 4.9.
|
||||
// This causes compile errors in e.g., Clang 3.5 or GCC 4.9.
|
||||
template <typename T>
|
||||
struct is_default_constructible : std::is_default_constructible<T> {};
|
||||
|
||||
@ -335,7 +361,7 @@ using range_value_t = value_type_t<iterator_traits<iterator_t<T>>>;
|
||||
|
||||
// The following implementation of is_complete_type is taken from
|
||||
// https://blogs.msdn.microsoft.com/vcblog/2015/12/02/partial-support-for-expression-sfinae-in-vs-2015-update-1/
|
||||
// and is written by Xiang Fan who agreed to using it in this library.
|
||||
// and is written by Xiang Fan who agreed to use it in this library.
|
||||
|
||||
template<typename T, typename = void>
|
||||
struct is_complete_type : std::false_type {};
|
||||
@ -572,7 +598,7 @@ decltype(std::declval<Compare>()(std::declval<B>(), std::declval<A>()))
|
||||
template<typename T>
|
||||
using detect_is_transparent = typename T::is_transparent;
|
||||
|
||||
// type trait to check if KeyType can be used as object key (without a BasicJsonType)
|
||||
// type trait to check if KeyType can be used as an object key (without a BasicJsonType)
|
||||
// see is_usable_as_basic_json_key_type below
|
||||
template<typename Comparator, typename ObjectKeyType, typename KeyTypeCVRef, bool RequireTransparentComparator = true,
|
||||
bool ExcludeObjectKeyType = RequireTransparentComparator, typename KeyType = uncvref_t<KeyTypeCVRef>>
|
||||
@ -586,7 +612,7 @@ using is_usable_as_key_type = typename std::conditional <
|
||||
std::true_type,
|
||||
std::false_type >::type;
|
||||
|
||||
// type trait to check if KeyType can be used as object key
|
||||
// type trait to check if KeyType can be used as an object key
|
||||
// true if:
|
||||
// - KeyType is comparable with BasicJsonType::object_t::key_type
|
||||
// - if ExcludeObjectKeyType is true, KeyType is not BasicJsonType::object_t::key_type
|
||||
|
@ -973,9 +973,9 @@ class binary_writer
|
||||
if (JSON_HEDLEY_UNLIKELY(it != BasicJsonType::string_t::npos))
|
||||
{
|
||||
JSON_THROW(out_of_range::create(409, concat("BSON key cannot contain code point U+0000 (at byte ", std::to_string(it), ")"), &j));
|
||||
static_cast<void>(j);
|
||||
}
|
||||
|
||||
static_cast<void>(j);
|
||||
return /*id*/ 1ul + name.size() + /*zero-terminator*/1u;
|
||||
}
|
||||
|
||||
@ -1552,7 +1552,7 @@ class binary_writer
|
||||
{
|
||||
return 'L';
|
||||
}
|
||||
// anything else is treated as high-precision number
|
||||
// anything else is treated as a high-precision number
|
||||
return 'H'; // LCOV_EXCL_LINE
|
||||
}
|
||||
|
||||
@ -1590,7 +1590,7 @@ class binary_writer
|
||||
{
|
||||
return 'M';
|
||||
}
|
||||
// anything else is treated as high-precision number
|
||||
// anything else is treated as a high-precision number
|
||||
return 'H'; // LCOV_EXCL_LINE
|
||||
}
|
||||
|
||||
@ -1756,11 +1756,11 @@ class binary_writer
|
||||
template<typename NumberType>
|
||||
void write_number(const NumberType n, const bool OutputIsLittleEndian = false)
|
||||
{
|
||||
// step 1: write number to array of length NumberType
|
||||
// step 1: write the number to an array of length NumberType
|
||||
std::array<CharType, sizeof(NumberType)> vec{};
|
||||
std::memcpy(vec.data(), &n, sizeof(NumberType));
|
||||
|
||||
// step 2: write array to output (with possible reordering)
|
||||
// step 2: write the array to output (with possible reordering)
|
||||
if (is_little_endian != OutputIsLittleEndian)
|
||||
{
|
||||
// reverse byte order prior to conversion if necessary
|
||||
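Hedged illustration (the names here are not from the diff): write_number above memcpys the value into a byte array and reverses it when the host and target byte orders differ. The same idea in isolation:

#include <algorithm>
#include <array>
#include <cstdint>
#include <cstring>

// serialize n in big-endian order regardless of the host byte order
std::array<unsigned char, sizeof(std::uint32_t)> to_big_endian(std::uint32_t n)
{
    std::array<unsigned char, sizeof(n)> bytes{};
    std::memcpy(bytes.data(), &n, sizeof(n));   // step 1: raw copy of the value

    // step 2: detect a little-endian host and reverse the byte order if needed
    const std::uint16_t probe = 0x0001;
    unsigned char first = 0;
    std::memcpy(&first, &probe, 1);
    if (first == 0x01)
    {
        std::reverse(bytes.begin(), bytes.end());
    }
    return bytes;
}

int main()
{
    const auto bytes = to_big_endian(0x01020304u);
    return bytes[0] == 0x01 ? 0 : 1;   // most significant byte first
}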
@ -1776,9 +1776,9 @@ class binary_writer
|
||||
#pragma GCC diagnostic push
|
||||
#pragma GCC diagnostic ignored "-Wfloat-equal"
|
||||
#endif
|
||||
if (static_cast<double>(n) >= static_cast<double>(std::numeric_limits<float>::lowest()) &&
|
||||
static_cast<double>(n) <= static_cast<double>((std::numeric_limits<float>::max)()) &&
|
||||
static_cast<double>(static_cast<float>(n)) == static_cast<double>(n))
|
||||
if (!std::isfinite(n) || ((static_cast<double>(n) >= static_cast<double>(std::numeric_limits<float>::lowest()) &&
|
||||
static_cast<double>(n) <= static_cast<double>((std::numeric_limits<float>::max)()) &&
|
||||
static_cast<double>(static_cast<float>(n)) == static_cast<double>(n))))
|
||||
{
|
||||
oa->write_character(format == detail::input_format_t::cbor
|
||||
? get_cbor_float_prefix(static_cast<float>(n))
|
||||
@ -1813,8 +1813,21 @@ class binary_writer
|
||||
enable_if_t < std::is_signed<C>::value && std::is_unsigned<char>::value > * = nullptr >
|
||||
static CharType to_char_type(std::uint8_t x) noexcept
|
||||
{
|
||||
static_assert(sizeof(std::uint8_t) == sizeof(CharType), "size of CharType must be equal to std::uint8_t");
|
||||
// The std::is_trivial trait is deprecated in C++26. The replacement is to use
|
||||
// std::is_trivially_copyable and std::is_trivially_default_constructible.
|
||||
// However, some older library implementations support std::is_trivial
|
||||
// but not all the std::is_trivially_* traits.
|
||||
// Since detecting full support across all libraries is difficult,
|
||||
// we use std::is_trivial unless we are using a standard where it has been deprecated.
|
||||
// For more details, see: https://github.com/nlohmann/json/pull/4775#issuecomment-2884361627
|
||||
#ifdef JSON_HAS_CPP_26
|
||||
static_assert(std::is_trivially_copyable<CharType>::value, "CharType must be trivially copyable");
|
||||
static_assert(std::is_trivially_default_constructible<CharType>::value, "CharType must be trivially default constructible");
|
||||
#else
|
||||
static_assert(std::is_trivial<CharType>::value, "CharType must be trivial");
|
||||
#endif
|
||||
|
||||
static_assert(sizeof(std::uint8_t) == sizeof(CharType), "size of CharType must be equal to std::uint8_t");
|
||||
CharType result;
|
||||
std::memcpy(&result, &x, sizeof(x));
|
||||
return result;
|
||||
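As a hedged aside on why the comment block above distinguishes the traits (the example type is illustrative): a type with a user-provided default constructor is still trivially copyable but no longer trivial, so the C++26 path has to assert both replacement traits:

#include <type_traits>

struct counter
{
    counter() {}   // user-provided default constructor
    int value;
};

static_assert(std::is_trivially_copyable<counter>::value, "copies need no custom logic");
static_assert(!std::is_trivially_default_constructible<counter>::value, "default construction is user-provided");
static_assert(!std::is_trivial<counter>::value, "therefore not trivial");

int main() {}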
|
@ -75,7 +75,7 @@ class serializer
|
||||
, error_handler(error_handler_)
|
||||
{}
|
||||
|
||||
// delete because of pointer members
|
||||
// deleted because of pointer members
|
||||
serializer(const serializer&) = delete;
|
||||
serializer& operator=(const serializer&) = delete;
|
||||
serializer(serializer&&) = delete;
|
||||
@ -573,7 +573,7 @@ class serializer
|
||||
break;
|
||||
}
|
||||
|
||||
default: // decode found yet incomplete multi-byte code point
|
||||
default: // decode found yet incomplete multibyte code point
|
||||
{
|
||||
if (!ensure_ascii)
|
||||
{
|
||||
@ -762,7 +762,7 @@ class serializer
|
||||
|
||||
// jump to the end to generate the string from backward,
|
||||
// so we later avoid reversing the result
|
||||
buffer_ptr += n_chars;
|
||||
buffer_ptr += static_cast<typename decltype(number_buffer)::difference_type>(n_chars);
|
||||
|
||||
// Fast int2ascii implementation inspired by "Fastware" talk by Andrei Alexandrescu
|
||||
// See: https://www.youtube.com/watch?v=o4-CwDo2zpg
|
||||
@ -827,7 +827,7 @@ class serializer
|
||||
|
||||
void dump_float(number_float_t x, std::false_type /*is_ieee_single_or_double*/)
|
||||
{
|
||||
// get number of digits for a float -> text -> float round-trip
|
||||
// get the number of digits for a float -> text -> float round-trip
|
||||
static constexpr auto d = std::numeric_limits<number_float_t>::max_digits10;
|
||||
|
||||
// the actual conversion
|
||||
@ -836,10 +836,10 @@ class serializer
|
||||
|
||||
// negative value indicates an error
|
||||
JSON_ASSERT(len > 0);
|
||||
// check if buffer was large enough
|
||||
// check if the buffer was large enough
|
||||
JSON_ASSERT(static_cast<std::size_t>(len) < number_buffer.size());
|
||||
|
||||
// erase thousands separator
|
||||
// erase thousands separators
|
||||
if (thousands_sep != '\0')
|
||||
{
|
||||
// NOLINTNEXTLINE(readability-qualified-auto,llvm-qualified-auto): std::remove returns an iterator, see https://github.com/nlohmann/json/issues/3081
|
||||
@ -947,8 +947,8 @@ class serializer
|
||||
* Helper function for dump_integer
|
||||
*
|
||||
* This function takes a negative signed integer and returns its absolute
|
||||
* value as unsigned integer. The plus/minus shuffling is necessary as we can
|
||||
* not directly remove the sign of an arbitrary signed integer as the
|
||||
* value as an unsigned integer. The plus/minus shuffling is necessary as we
|
||||
* cannot directly remove the sign of an arbitrary signed integer as the
|
||||
* absolute values of INT_MIN and INT_MAX are usually not the same. See
|
||||
* #1708 for details.
|
||||
*/
|
||||
|
@ -32,10 +32,10 @@ inline void replace_substring(StringType& s, const StringType& f,
|
||||
const StringType& t)
|
||||
{
|
||||
JSON_ASSERT(!f.empty());
|
||||
for (auto pos = s.find(f); // find first occurrence of f
|
||||
for (auto pos = s.find(f); // find the first occurrence of f
|
||||
pos != StringType::npos; // make sure f was found
|
||||
s.replace(pos, f.size(), t), // replace with t, and
|
||||
pos = s.find(f, pos + t.size())) // find next occurrence of f
|
||||
pos = s.find(f, pos + t.size())) // find the next occurrence of f
|
||||
{}
|
||||
}
|
||||
|
||||
@ -62,7 +62,7 @@ inline StringType escape(StringType s)
|
||||
* Note the order of escaping "~1" to "/" and "~0" to "~" is important.
|
||||
*/
|
||||
template<typename StringType>
|
||||
static void unescape(StringType& s)
|
||||
inline void unescape(StringType& s)
|
||||
{
|
||||
replace_substring(s, StringType{"~1"}, StringType{"/"});
|
||||
replace_substring(s, StringType{"~0"}, StringType{"~"});
|
||||
|
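A sketch of why the unescaping order noted above matters, using a hypothetical stand-in for the replace-substring helper: the token "~01" is the escaped form of the key "~1", and swapping the two passes would wrongly turn it into "/":

#include <cassert>
#include <string>

// illustrative copy of the replace-all loop from the diff
void replace_all(std::string& s, const std::string& f, const std::string& t)
{
    for (auto pos = s.find(f);                     // find the first occurrence of f
         pos != std::string::npos;                 // make sure f was found
         s.replace(pos, f.size(), t),              // replace with t, and
         pos = s.find(f, pos + t.size()))          // find the next occurrence of f
    {}
}

int main()
{
    std::string token = "~01";
    replace_all(token, "~1", "/");   // must run first: no match in "~01"
    replace_all(token, "~0", "~");   // then unescape the tilde
    assert(token == "~1");
}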
@ -134,11 +134,12 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
InputAdapterType adapter,
|
||||
detail::parser_callback_t<basic_json>cb = nullptr,
|
||||
const bool allow_exceptions = true,
|
||||
const bool ignore_comments = false
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false
|
||||
)
|
||||
{
|
||||
return ::nlohmann::detail::parser<basic_json, InputAdapterType>(std::move(adapter),
|
||||
std::move(cb), allow_exceptions, ignore_comments);
|
||||
std::move(cb), allow_exceptions, ignore_comments, ignore_trailing_commas);
|
||||
}
|
||||
|
||||
private:
|
||||
@ -563,7 +564,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
(t == value_t::binary && binary == nullptr)
|
||||
)
|
||||
{
|
||||
//not initialized (e.g. due to exception in the ctor)
|
||||
// not initialized (e.g., due to exception in the ctor)
|
||||
return;
|
||||
}
|
||||
if (t == value_t::array || t == value_t::object)
|
||||
@ -588,7 +589,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
|
||||
while (!stack.empty())
|
||||
{
|
||||
// move the last item to local variable to be processed
|
||||
// move the last item to a local variable to be processed
|
||||
basic_json current_item(std::move(stack.back()));
|
||||
stack.pop_back();
|
||||
|
||||
@ -610,7 +611,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
current_item.m_data.m_value.object->clear();
|
||||
}
|
||||
|
||||
// it's now safe that current_item get destructed
|
||||
// it's now safe that current_item gets destructed
|
||||
// since it doesn't have any children
|
||||
}
|
||||
}
|
||||
@ -918,20 +919,20 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
{
|
||||
// The cast is to ensure op[size_type] is called, bearing in mind size_type may not be int;
|
||||
// (many string types can be constructed from 0 via its null-pointer guise, so we get a
|
||||
// broken call to op[key_type], the wrong semantics and a 4804 warning on Windows)
|
||||
// broken call to op[key_type], the wrong semantics, and a 4804 warning on Windows)
|
||||
return element_ref->is_array() && element_ref->size() == 2 && (*element_ref)[static_cast<size_type>(0)].is_string();
|
||||
});
|
||||
|
||||
// adjust type if type deduction is not wanted
|
||||
if (!type_deduction)
|
||||
{
|
||||
// if array is wanted, do not create an object though possible
|
||||
// if an array is wanted, do not create an object though possible
|
||||
if (manual_type == value_t::array)
|
||||
{
|
||||
is_an_object = false;
|
||||
}
|
||||
|
||||
// if object is wanted but impossible, throw an exception
|
||||
// if an object is wanted but impossible, throw an exception
|
||||
if (JSON_HEDLEY_UNLIKELY(manual_type == value_t::object && !is_an_object))
|
||||
{
|
||||
JSON_THROW(type_error::create(301, "cannot create object from initializer list", nullptr));
|
||||
@ -940,7 +941,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
|
||||
if (is_an_object)
|
||||
{
|
||||
// the initializer list is a list of pairs -> create object
|
||||
// the initializer list is a list of pairs -> create an object
|
||||
m_data.m_type = value_t::object;
|
||||
m_data.m_value = value_t::object;
|
||||
|
||||
@ -954,7 +955,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
}
|
||||
else
|
||||
{
|
||||
// the initializer list describes an array -> create array
|
||||
// the initializer list describes an array -> create an array
|
||||
m_data.m_type = value_t::array;
|
||||
m_data.m_value.array = create<array_t>(init.begin(), init.end());
|
||||
}
|
||||
@ -1042,16 +1043,16 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
JSON_ASSERT(first.m_object != nullptr);
|
||||
JSON_ASSERT(last.m_object != nullptr);
|
||||
|
||||
// make sure iterator fits the current value
|
||||
// make sure the iterator fits the current value
|
||||
if (JSON_HEDLEY_UNLIKELY(first.m_object != last.m_object))
|
||||
{
|
||||
JSON_THROW(invalid_iterator::create(201, "iterators are not compatible", nullptr));
|
||||
}
|
||||
|
||||
// copy type from first iterator
|
||||
// copy type from the first iterator
|
||||
m_data.m_type = first.m_object->m_data.m_type;
|
||||
|
||||
// check if iterator range is complete for primitive values
|
||||
// check if the iterator range is complete for primitive values
|
||||
switch (m_data.m_type)
|
||||
{
|
||||
case value_t::boolean:
|
||||
@ -1231,7 +1232,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
, end_position(other.end_position) // cppcheck-suppress[accessForwarded] TODO check
|
||||
#endif
|
||||
{
|
||||
// check that passed value is valid
|
||||
// check that the passed value is valid
|
||||
other.assert_invariant(false); // cppcheck-suppress[accessForwarded]
|
||||
|
||||
// invalidate payload
|
||||
@ -1257,7 +1258,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
std::is_nothrow_move_assignable<json_base_class_t>::value
|
||||
)
|
||||
{
|
||||
// check that passed value is valid
|
||||
// check that the passed value is valid
|
||||
other.assert_invariant();
|
||||
|
||||
using std::swap;
|
||||
@ -1973,7 +1974,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
}
|
||||
JSON_CATCH (std::out_of_range&)
|
||||
{
|
||||
// create better exception explanation
|
||||
// create a better exception explanation
|
||||
JSON_THROW(out_of_range::create(401, detail::concat("array index ", std::to_string(idx), " is out of range"), this));
|
||||
} // cppcheck-suppress[missingReturn]
|
||||
}
|
||||
@ -1996,7 +1997,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
}
|
||||
JSON_CATCH (std::out_of_range&)
|
||||
{
|
||||
// create better exception explanation
|
||||
// create a better exception explanation
|
||||
JSON_THROW(out_of_range::create(401, detail::concat("array index ", std::to_string(idx), " is out of range"), this));
|
||||
} // cppcheck-suppress[missingReturn]
|
||||
}
|
||||
@ -2086,7 +2087,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
/// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
|
||||
reference operator[](size_type idx)
|
||||
{
|
||||
// implicitly convert null value to an empty array
|
||||
// implicitly convert a null value to an empty array
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::array;
|
||||
@ -2097,7 +2098,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// operator[] only works for arrays
|
||||
if (JSON_HEDLEY_LIKELY(is_array()))
|
||||
{
|
||||
// fill up array with null values if given idx is outside range
|
||||
// fill up the array with null values if given idx is outside the range
|
||||
if (idx >= m_data.m_value.array->size())
|
||||
{
|
||||
#if JSON_DIAGNOSTICS
|
||||
@ -2145,7 +2146,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
/// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
|
||||
reference operator[](typename object_t::key_type key) // NOLINT(performance-unnecessary-value-param)
|
||||
{
|
||||
// implicitly convert null value to an empty object
|
||||
// implicitly convert a null value to an empty object
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::object;
|
||||
@ -2198,7 +2199,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
detail::is_usable_as_basic_json_key_type<basic_json_t, KeyType>::value, int > = 0 >
|
||||
reference operator[](KeyType && key)
|
||||
{
|
||||
// implicitly convert null value to an empty object
|
||||
// implicitly convert a null value to an empty object
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::object;
|
||||
@ -2255,7 +2256,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// value only works for objects
|
||||
if (JSON_HEDLEY_LIKELY(is_object()))
|
||||
{
|
||||
// if key is found, return value and given default value otherwise
|
||||
// If 'key' is found, return its value. Otherwise, return `default_value'.
|
||||
const auto it = find(key);
|
||||
if (it != end())
|
||||
{
|
||||
@ -2280,7 +2281,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// value only works for objects
|
||||
if (JSON_HEDLEY_LIKELY(is_object()))
|
||||
{
|
||||
// if key is found, return value and given default value otherwise
|
||||
// If 'key' is found, return its value. Otherwise, return `default_value'.
|
||||
const auto it = find(key);
|
||||
if (it != end())
|
||||
{
|
||||
@ -2306,7 +2307,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// value only works for objects
|
||||
if (JSON_HEDLEY_LIKELY(is_object()))
|
||||
{
|
||||
// if key is found, return value and given default value otherwise
|
||||
// If 'key' is found, return its value. Otherwise, return `default_value'.
|
||||
const auto it = find(std::forward<KeyType>(key));
|
||||
if (it != end())
|
||||
{
|
||||
@ -2333,7 +2334,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// value only works for objects
|
||||
if (JSON_HEDLEY_LIKELY(is_object()))
|
||||
{
|
||||
// if key is found, return value and given default value otherwise
|
||||
// If 'key' is found, return its value. Otherwise, return `default_value'.
|
||||
const auto it = find(std::forward<KeyType>(key));
|
||||
if (it != end())
|
||||
{
|
||||
@ -2356,7 +2357,8 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// value only works for objects
|
||||
if (JSON_HEDLEY_LIKELY(is_object()))
|
||||
{
|
||||
// if pointer resolves a value, return it or use default value
|
||||
// If the pointer resolves to a value, return it. Otherwise, return
|
||||
// 'default_value'.
|
||||
JSON_TRY
|
||||
{
|
||||
return ptr.get_checked(this).template get<ValueType>();
|
||||
@ -2381,7 +2383,8 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// value only works for objects
|
||||
if (JSON_HEDLEY_LIKELY(is_object()))
|
||||
{
|
||||
// if pointer resolves a value, return it or use default value
|
||||
// If the pointer resolves to a value, return it. Otherwise, return
|
||||
// 'default_value'.
|
||||
JSON_TRY
|
||||
{
|
||||
return ptr.get_checked(this).template get<ReturnType>();
|
||||
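For readers skimming the reworded comments: this is the documented value() behavior they describe, shown with the public API:

#include <nlohmann/json.hpp>
#include <cassert>
#include <string>

int main()
{
    const nlohmann::json j = {{"port", 8080}};

    // key found: return its value
    assert(j.value("port", 0) == 8080);

    // key missing: return the given default value instead
    assert(j.value("host", std::string("localhost")) == "localhost");

    // same idea when a JSON pointer does not resolve
    assert(j.value(nlohmann::json::json_pointer("/missing"), 42) == 42);
}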
@ -2455,7 +2458,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
std::is_same<IteratorType, typename basic_json_t::const_iterator>::value, int > = 0 >
|
||||
IteratorType erase(IteratorType pos) // NOLINT(performance-unnecessary-value-param)
|
||||
{
|
||||
// make sure iterator fits the current value
|
||||
// make sure the iterator fits the current value
|
||||
if (JSON_HEDLEY_UNLIKELY(this != pos.m_object))
|
||||
{
|
||||
JSON_THROW(invalid_iterator::create(202, "iterator does not fit current value", this));
|
||||
@ -2525,7 +2528,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
std::is_same<IteratorType, typename basic_json_t::const_iterator>::value, int > = 0 >
|
||||
IteratorType erase(IteratorType first, IteratorType last) // NOLINT(performance-unnecessary-value-param)
|
||||
{
|
||||
// make sure iterator fits the current value
|
||||
// make sure the iterator fits the current value
|
||||
if (JSON_HEDLEY_UNLIKELY(this != first.m_object || this != last.m_object))
|
||||
{
|
||||
JSON_THROW(invalid_iterator::create(203, "iterators do not fit current value", this));
|
||||
@ -3120,7 +3123,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
JSON_THROW(type_error::create(308, detail::concat("cannot use push_back() with ", type_name()), this));
|
||||
}
|
||||
|
||||
// transform null object into an array
|
||||
// transform a null object into an array
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::array;
|
||||
@ -3128,7 +3131,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
assert_invariant();
|
||||
}
|
||||
|
||||
// add element to array (move semantics)
|
||||
// add the element to the array (move semantics)
|
||||
const auto old_capacity = m_data.m_value.array->capacity();
|
||||
m_data.m_value.array->push_back(std::move(val));
|
||||
set_parent(m_data.m_value.array->back(), old_capacity);
|
||||
@ -3153,7 +3156,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
JSON_THROW(type_error::create(308, detail::concat("cannot use push_back() with ", type_name()), this));
|
||||
}
|
||||
|
||||
// transform null object into an array
|
||||
// transform a null object into an array
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::array;
|
||||
@ -3161,7 +3164,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
assert_invariant();
|
||||
}
|
||||
|
||||
// add element to array
|
||||
// add the element to the array
|
||||
const auto old_capacity = m_data.m_value.array->capacity();
|
||||
m_data.m_value.array->push_back(val);
|
||||
set_parent(m_data.m_value.array->back(), old_capacity);
|
||||
@ -3185,7 +3188,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
JSON_THROW(type_error::create(308, detail::concat("cannot use push_back() with ", type_name()), this));
|
||||
}
|
||||
|
||||
// transform null object into an object
|
||||
// transform a null object into an object
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::object;
|
||||
@ -3193,7 +3196,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
assert_invariant();
|
||||
}
|
||||
|
||||
// add element to object
|
||||
// add the element to the object
|
||||
auto res = m_data.m_value.object->insert(val);
|
||||
set_parent(res.first->second);
|
||||
}
|
||||
@ -3241,7 +3244,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
JSON_THROW(type_error::create(311, detail::concat("cannot use emplace_back() with ", type_name()), this));
|
||||
}
|
||||
|
||||
// transform null object into an array
|
||||
// transform a null object into an array
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::array;
|
||||
@ -3249,7 +3252,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
assert_invariant();
|
||||
}
|
||||
|
||||
// add element to array (perfect forwarding)
|
||||
// add the element to the array (perfect forwarding)
|
||||
const auto old_capacity = m_data.m_value.array->capacity();
|
||||
m_data.m_value.array->emplace_back(std::forward<Args>(args)...);
|
||||
return set_parent(m_data.m_value.array->back(), old_capacity);
|
||||
@ -3266,7 +3269,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
JSON_THROW(type_error::create(311, detail::concat("cannot use emplace() with ", type_name()), this));
|
||||
}
|
||||
|
||||
// transform null object into an object
|
||||
// transform a null object into an object
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::object;
|
||||
@ -3274,11 +3277,11 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
assert_invariant();
|
||||
}
|
||||
|
||||
// add element to array (perfect forwarding)
|
||||
// add the element to the array (perfect forwarding)
|
||||
auto res = m_data.m_value.object->emplace(std::forward<Args>(args)...);
|
||||
set_parent(res.first->second);
|
||||
|
||||
// create result iterator and set iterator to the result of emplace
|
||||
// create a result iterator and set iterator to the result of emplace
|
||||
auto it = begin();
|
||||
it.m_it.object_iterator = res.first;
|
||||
|
||||
@ -3442,7 +3445,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
/// @sa https://json.nlohmann.me/api/basic_json/update/
|
||||
void update(const_iterator first, const_iterator last, bool merge_objects = false) // NOLINT(performance-unnecessary-value-param)
|
||||
{
|
||||
// implicitly convert null value to an empty object
|
||||
// implicitly convert a null value to an empty object
|
||||
if (is_null())
|
||||
{
|
||||
m_data.m_type = value_t::object;
|
||||
@ -4002,7 +4005,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
/// @sa https://json.nlohmann.me/api/basic_json/operator_ltlt/
|
||||
friend std::ostream& operator<<(std::ostream& o, const basic_json& j)
|
||||
{
|
||||
// read width member and use it as indentation parameter if nonzero
|
||||
// read width member and use it as the indentation parameter if nonzero
|
||||
const bool pretty_print = o.width() > 0;
|
||||
const auto indentation = pretty_print ? o.width() : 0;
|
||||
|
||||
@ -4043,10 +4046,11 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
static basic_json parse(InputType&& i,
|
||||
parser_callback_t cb = nullptr,
|
||||
const bool allow_exceptions = true,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
basic_json result;
|
||||
parser(detail::input_adapter(std::forward<InputType>(i)), std::move(cb), allow_exceptions, ignore_comments).parse(true, result); // cppcheck-suppress[accessMoved,accessForwarded]
|
||||
parser(detail::input_adapter(std::forward<InputType>(i)), std::move(cb), allow_exceptions, ignore_comments, ignore_trailing_commas).parse(true, result); // cppcheck-suppress[accessMoved,accessForwarded]
|
||||
return result;
|
||||
}
|
||||
|
||||
@ -4058,10 +4062,11 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
IteratorType last,
|
||||
parser_callback_t cb = nullptr,
|
||||
const bool allow_exceptions = true,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
basic_json result;
|
||||
parser(detail::input_adapter(std::move(first), std::move(last)), std::move(cb), allow_exceptions, ignore_comments).parse(true, result); // cppcheck-suppress[accessMoved]
|
||||
parser(detail::input_adapter(std::move(first), std::move(last)), std::move(cb), allow_exceptions, ignore_comments, ignore_trailing_commas).parse(true, result); // cppcheck-suppress[accessMoved]
|
||||
return result;
|
||||
}
|
||||
|
||||
@ -4070,10 +4075,11 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
static basic_json parse(detail::span_input_adapter&& i,
|
||||
parser_callback_t cb = nullptr,
|
||||
const bool allow_exceptions = true,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
basic_json result;
|
||||
parser(i.get(), std::move(cb), allow_exceptions, ignore_comments).parse(true, result); // cppcheck-suppress[accessMoved]
|
||||
parser(i.get(), std::move(cb), allow_exceptions, ignore_comments, ignore_trailing_commas).parse(true, result); // cppcheck-suppress[accessMoved]
|
||||
return result;
|
||||
}
|
||||
|
||||
@ -4081,26 +4087,29 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
/// @sa https://json.nlohmann.me/api/basic_json/accept/
|
||||
template<typename InputType>
|
||||
static bool accept(InputType&& i,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
return parser(detail::input_adapter(std::forward<InputType>(i)), nullptr, false, ignore_comments).accept(true);
|
||||
return parser(detail::input_adapter(std::forward<InputType>(i)), nullptr, false, ignore_comments, ignore_trailing_commas).accept(true);
|
||||
}
|
||||
|
||||
/// @brief check if the input is valid JSON
|
||||
/// @sa https://json.nlohmann.me/api/basic_json/accept/
|
||||
template<typename IteratorType>
|
||||
static bool accept(IteratorType first, IteratorType last,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
return parser(detail::input_adapter(std::move(first), std::move(last)), nullptr, false, ignore_comments).accept(true);
|
||||
return parser(detail::input_adapter(std::move(first), std::move(last)), nullptr, false, ignore_comments, ignore_trailing_commas).accept(true);
|
||||
}
|
||||
|
||||
JSON_HEDLEY_WARN_UNUSED_RESULT
|
||||
JSON_HEDLEY_DEPRECATED_FOR(3.8.0, accept(ptr, ptr + len))
|
||||
static bool accept(detail::span_input_adapter&& i,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
return parser(i.get(), nullptr, false, ignore_comments).accept(true);
|
||||
return parser(i.get(), nullptr, false, ignore_comments, ignore_trailing_commas).accept(true);
|
||||
}
|
||||
|
||||
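Hedged usage sketch of the flag these overloads thread through to the parser (assumes a library build that includes this change):

#include <nlohmann/json.hpp>
#include <cassert>
#include <string>

int main()
{
    const std::string text = R"({"list": [1, 2, 3,],})";

    // default parser: trailing commas are a parse error
    assert(!nlohmann::json::accept(text));

    // parse(i, cb, allow_exceptions, ignore_comments, ignore_trailing_commas)
    const auto j = nlohmann::json::parse(text, nullptr, true, false, true);
    assert(j.at("list").size() == 3);

    // accept() gains the same trailing flag after ignore_comments
    assert(nlohmann::json::accept(text, false, true));
}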
/// @brief generate SAX events
|
||||
@ -4110,11 +4119,12 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
static bool sax_parse(InputType&& i, SAX* sax,
|
||||
input_format_t format = input_format_t::json,
|
||||
const bool strict = true,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
auto ia = detail::input_adapter(std::forward<InputType>(i));
|
||||
return format == input_format_t::json
|
||||
? parser(std::move(ia), nullptr, true, ignore_comments).sax_parse(sax, strict)
|
||||
? parser(std::move(ia), nullptr, true, ignore_comments, ignore_trailing_commas).sax_parse(sax, strict)
|
||||
: detail::binary_reader<basic_json, decltype(ia), SAX>(std::move(ia), format).sax_parse(format, sax, strict);
|
||||
}
|
||||
|
||||
@ -4125,11 +4135,12 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
static bool sax_parse(IteratorType first, IteratorType last, SAX* sax,
|
||||
input_format_t format = input_format_t::json,
|
||||
const bool strict = true,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
auto ia = detail::input_adapter(std::move(first), std::move(last));
|
||||
return format == input_format_t::json
|
||||
? parser(std::move(ia), nullptr, true, ignore_comments).sax_parse(sax, strict)
|
||||
? parser(std::move(ia), nullptr, true, ignore_comments, ignore_trailing_commas).sax_parse(sax, strict)
|
||||
: detail::binary_reader<basic_json, decltype(ia), SAX>(std::move(ia), format).sax_parse(format, sax, strict);
|
||||
}
|
||||
|
||||
@ -4144,12 +4155,13 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
static bool sax_parse(detail::span_input_adapter&& i, SAX* sax,
|
||||
input_format_t format = input_format_t::json,
|
||||
const bool strict = true,
|
||||
const bool ignore_comments = false)
|
||||
const bool ignore_comments = false,
|
||||
const bool ignore_trailing_commas = false)
|
||||
{
|
||||
auto ia = i.get();
|
||||
return format == input_format_t::json
|
||||
// NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
|
||||
? parser(std::move(ia), nullptr, true, ignore_comments).sax_parse(sax, strict)
|
||||
? parser(std::move(ia), nullptr, true, ignore_comments, ignore_trailing_commas).sax_parse(sax, strict)
|
||||
// NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
|
||||
: detail::binary_reader<basic_json, decltype(ia), SAX>(std::move(ia), format).sax_parse(format, sax, strict);
|
||||
}
|
||||
@ -4204,8 +4216,9 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
case value_t::number_integer:
|
||||
case value_t::number_unsigned:
|
||||
case value_t::number_float:
|
||||
default:
|
||||
return "number";
|
||||
default:
|
||||
return "invalid";
|
||||
}
|
||||
}
|
||||
|
||||
@ -4797,7 +4810,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
result.at(top_pointer);
|
||||
}
|
||||
|
||||
// get reference to parent of JSON pointer ptr
|
||||
// get reference to the parent of the JSON pointer ptr
|
||||
const auto last_path = ptr.back();
|
||||
ptr.pop_back();
|
||||
// parent must exist when performing patch add per RFC6902 specs
|
||||
@ -4835,7 +4848,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
break;
|
||||
}
|
||||
|
||||
// if there exists a parent it cannot be primitive
|
||||
// if there exists a parent, it cannot be primitive
|
||||
case value_t::string: // LCOV_EXCL_LINE
|
||||
case value_t::boolean: // LCOV_EXCL_LINE
|
||||
case value_t::number_integer: // LCOV_EXCL_LINE
|
||||
@ -4851,7 +4864,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// wrapper for "remove" operation; remove value at ptr
|
||||
const auto operation_remove = [this, & result](json_pointer & ptr)
|
||||
{
|
||||
// get reference to parent of JSON pointer ptr
|
||||
// get reference to the parent of the JSON pointer ptr
|
||||
const auto last_path = ptr.back();
|
||||
ptr.pop_back();
|
||||
basic_json& parent = result.at(ptr);
|
||||
@ -4897,14 +4910,14 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// context-sensitive error message
|
||||
const auto error_msg = (op == "op") ? "operation" : detail::concat("operation '", op, '\''); // NOLINT(bugprone-unused-local-non-trivial-variable)
|
||||
|
||||
// check if desired value is present
|
||||
// check if the desired value is present
|
||||
if (JSON_HEDLEY_UNLIKELY(it == val.m_data.m_value.object->end()))
|
||||
{
|
||||
// NOLINTNEXTLINE(performance-inefficient-string-concatenation)
|
||||
JSON_THROW(parse_error::create(105, 0, detail::concat(error_msg, " must have member '", member, "'"), &val));
|
||||
}
|
||||
|
||||
// check if result is of type string
|
||||
// check if the result is of type string
|
||||
if (JSON_HEDLEY_UNLIKELY(string_type && !it->second.is_string()))
|
||||
{
|
||||
// NOLINTNEXTLINE(performance-inefficient-string-concatenation)
|
||||
@ -4993,7 +5006,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// ignore out of range errors: success remains false
|
||||
}
|
||||
|
||||
// throw an exception if test fails
|
||||
// throw an exception if the test fails
|
||||
if (JSON_HEDLEY_UNLIKELY(!success))
|
||||
{
|
||||
JSON_THROW(other_error::create(501, detail::concat("unsuccessful: ", val.dump()), &val));
|
||||
@ -5031,7 +5044,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
// the patch
|
||||
basic_json result(value_t::array);
|
||||
|
||||
// if the values are the same, return empty patch
|
||||
// if the values are the same, return an empty patch
|
||||
if (source == target)
|
||||
{
|
||||
return result;
|
||||
@ -5145,7 +5158,7 @@ class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-spec
|
||||
case value_t::discarded:
|
||||
default:
|
||||
{
|
||||
// both primitive type: replace value
|
||||
// both primitive types: replace value
|
||||
result.push_back(
|
||||
{
|
||||
{"op", "replace"}, {"path", path}, {"value", target}
|
||||
|
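For orientation, the public round trip that the patch/diff hunks above serve (RFC 6902 JSON Patch):

#include <nlohmann/json.hpp>
#include <cassert>

int main()
{
    const nlohmann::json source = {{"name", "Ada"}, {"tags", {"a", "b"}}};
    const nlohmann::json target = {{"name", "Ada"}, {"tags", {"a", "c"}}};

    // identical documents yield an empty patch
    assert(nlohmann::json::diff(source, source).empty());

    // otherwise diff() emits operations that patch() can replay
    const auto p = nlohmann::json::diff(source, target);
    assert(source.patch(p) == target);
}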
@ -226,7 +226,7 @@ template <class Key, class T, class IgnoredLess = std::less<Key>,
|
||||
|
||||
// Since we cannot move const Keys, we re-construct them in place.
|
||||
// We start at first and re-construct (viz. copy) the elements from
|
||||
// the back of the vector. Example for first iteration:
|
||||
// the back of the vector. Example for the first iteration:
|
||||
|
||||
// ,--------.
|
||||
// v | destroy e and re-construct with h
|
||||
|
File diff suppressed because it is too large

12	tests/module_cpp20/CMakeLists.txt	Normal file
@ -0,0 +1,12 @@
cmake_minimum_required(VERSION 3.28)

project(json_test CXX)

add_executable(json_test)

target_sources(json_test
    PRIVATE main.cpp
    PUBLIC FILE_SET cxx_modules TYPE CXX_MODULES FILES json.cpp)

target_compile_features(json_test PUBLIC cxx_std_20)
target_include_directories(json_test PRIVATE ../../include)

17	tests/module_cpp20/json.cpp	Normal file
@ -0,0 +1,17 @@
module;
#include <nlohmann/json.hpp>
export module json;

export namespace nlohmann
{
using ::nlohmann::adl_serializer;

using ::nlohmann::basic_json;
using ::nlohmann::json_pointer;

using ::nlohmann::json;
using ::nlohmann::ordered_json;
using ::nlohmann::ordered_map;

using ::nlohmann::json_pointer;
} // namespace nlohmann

6	tests/module_cpp20/main.cpp	Normal file
@ -0,0 +1,6 @@
import json;

int main()
{
    nlohmann::json j;
}
@ -207,49 +207,49 @@ TEST_CASE("alternative string type")
|
||||
{
|
||||
alt_json doc;
|
||||
doc["pi"] = 3.141;
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"pi":3.141})");
|
||||
}
|
||||
|
||||
{
|
||||
alt_json doc;
|
||||
doc["happy"] = true;
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"happy":true})");
|
||||
}
|
||||
|
||||
{
|
||||
alt_json doc;
|
||||
doc["name"] = "I'm Batman";
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"name":"I'm Batman"})");
|
||||
}
|
||||
|
||||
{
|
||||
alt_json doc;
|
||||
doc["nothing"] = nullptr;
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"nothing":null})");
|
||||
}
|
||||
|
||||
{
|
||||
alt_json doc;
|
||||
doc["answer"]["everything"] = 42;
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"answer":{"everything":42}})");
|
||||
}
|
||||
|
||||
{
|
||||
alt_json doc;
|
||||
doc["list"] = { 1, 0, 2 };
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"list":[1,0,2]})");
|
||||
}
|
||||
|
||||
{
|
||||
alt_json doc;
|
||||
doc["object"] = { {"currency", "USD"}, {"value", 42.99} };
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"object":{"currency":"USD","value":42.99}})");
|
||||
}
|
||||
}
|
||||
@ -257,7 +257,7 @@ TEST_CASE("alternative string type")
|
||||
SECTION("parse")
|
||||
{
|
||||
auto doc = alt_json::parse(R"({"foo": "bar"})");
|
||||
alt_string dump = doc.dump();
|
||||
const alt_string dump = doc.dump();
|
||||
CHECK(dump == R"({"foo":"bar"})");
|
||||
}
|
||||
|
||||
|
@ -1226,21 +1226,21 @@ TEST_CASE("BJData")
|
||||
SECTION("0 (0 00000 0000000000)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x00, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == 0.0);
|
||||
}
|
||||
|
||||
SECTION("-0 (1 00000 0000000000)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x00, 0x80}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == -0.0);
|
||||
}
|
||||
|
||||
SECTION("2**-24 (0 00000 0000000001)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x01, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == std::pow(2.0, -24.0));
|
||||
}
|
||||
}
|
||||
@ -1250,7 +1250,7 @@ TEST_CASE("BJData")
|
||||
SECTION("infinity (0 11111 0000000000)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x00, 0x7c}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == std::numeric_limits<json::number_float_t>::infinity());
|
||||
CHECK(j.dump() == "null");
|
||||
}
|
||||
@ -1258,7 +1258,7 @@ TEST_CASE("BJData")
|
||||
SECTION("-infinity (1 11111 0000000000)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x00, 0xfc}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == -std::numeric_limits<json::number_float_t>::infinity());
|
||||
CHECK(j.dump() == "null");
|
||||
}
|
||||
@ -1269,21 +1269,21 @@ TEST_CASE("BJData")
|
||||
SECTION("1 (0 01111 0000000000)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x00, 0x3c}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == 1);
|
||||
}
|
||||
|
||||
SECTION("-2 (1 10000 0000000000)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0x00, 0xc0}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == -2);
|
||||
}
|
||||
|
||||
SECTION("65504 (0 11110 1111111111)")
|
||||
{
|
||||
json const j = json::from_bjdata(std::vector<uint8_t>({'h', 0xff, 0x7b}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == 65504);
|
||||
}
|
||||
}
|
||||
@ -2815,6 +2815,129 @@ TEST_CASE("BJData")
|
||||
#endif
|
||||
}
|
||||
|
||||
SECTION("overflow detection in dimension multiplication")
|
||||
{
|
||||
// Simple SAX handler just to monitor if overflow is detected
|
||||
struct SimpleOverflowSaxHandler : public nlohmann::json_sax<json>
|
||||
{
|
||||
bool overflow_detected = false;
|
||||
|
||||
// Implement all required virtual methods with minimal implementation
|
||||
bool null() override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool boolean(bool /*val*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool number_integer(json::number_integer_t /*val*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool number_unsigned(json::number_unsigned_t /*val*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool number_float(json::number_float_t /*val*/, const std::string& /*s*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool string(std::string& /*val*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool binary(json::binary_t& /*val*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool start_object(std::size_t /*elements*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool key(std::string& /*val*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool end_object() override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool start_array(std::size_t /*elements*/) override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
bool end_array() override
|
||||
{
|
||||
return true;
|
||||
}
|
||||
|
||||
// This is the only method we care about - detecting error 408
|
||||
bool parse_error(std::size_t /*position*/, const std::string& /*last_token*/, const json::exception& ex) override
|
||||
{
|
||||
if (ex.id == 408)
|
||||
{
|
||||
overflow_detected = true;
|
||||
}
|
||||
return false;
|
||||
}
|
||||
};
|
||||
|
||||
// Create BJData payload with overflow-causing dimensions (2^32+1) × (2^32)
|
||||
const std::vector<uint8_t> bjdata_payload =
|
||||
{
|
||||
0x5B, // '[' start array
|
||||
0x24, 0x55, // '$', 'U' (type uint8)
|
||||
0x23, 0x5B, // '#', '[' (dimensions array)
|
||||
0x4D, // 'M' (uint64)
|
||||
0x01, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, // 2^32 + 1 (4294967297) as little-endian
|
||||
0x4D, // 'M' (uint64)
|
||||
0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, // 2^32 (4294967296) as little-endian
|
||||
0x5D // ']' end dimensions
|
||||
// No data - we don't need it for this test, we just want to hit the overflow check
|
||||
};
|
||||
|
||||
// Test with overflow dimensions using SAX parser
|
||||
{
|
||||
SimpleOverflowSaxHandler handler;
|
||||
const auto result = json::sax_parse(bjdata_payload, &handler,
|
||||
nlohmann::detail::input_format_t::bjdata, false);
|
||||
|
||||
// Should detect overflow
|
||||
CHECK(handler.overflow_detected == true);
|
||||
CHECK(result == false);
|
||||
}
|
||||
|
||||
// Test with DOM parser (should throw)
|
||||
{
|
||||
json _;
|
||||
CHECK_THROWS_AS(_ = json::from_bjdata(bjdata_payload), json::out_of_range);
|
||||
}
|
||||
|
||||
// Test with normal dimensions
|
||||
const std::vector<uint8_t> normal_payload =
|
||||
{
|
||||
0x5B, // '[' start array
|
||||
0x24, 0x55, // '$', 'U' (type uint8)
|
||||
0x23, 0x5B, // '#', '[' (dimensions array)
|
||||
0x55, 0x02, // 'U', 2 (uint8)
|
||||
0x55, 0x03, // 'U', 3 (uint8)
|
||||
0x5D, // ']' end dimensions
|
||||
// 6 data bytes for a 2×3 array (enough to avoid EOF but not entire array)
|
||||
0x01, 0x02, 0x03, 0x04, 0x05, 0x06
|
||||
};
|
||||
|
||||
// For normal dimensions, overflow should not be detected
|
||||
{
|
||||
SimpleOverflowSaxHandler handler;
|
||||
const auto result = json::sax_parse(normal_payload, &handler,
|
||||
nlohmann::detail::input_format_t::bjdata, false);
|
||||
|
||||
CHECK(handler.overflow_detected == false);
|
||||
CHECK(result == true);
|
||||
}
|
||||
}
|
||||
|
||||
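The dimensions in the payload above are chosen so their product no longer fits into 64 bits. A minimal sketch of the overflow guard such a reader needs (illustrative, not the library's exact code):

#include <cassert>
#include <cstdint>
#include <limits>

// returns false when a * b would overflow 64 bits, otherwise stores the product
bool safe_mul(std::uint64_t a, std::uint64_t b, std::uint64_t& out)
{
    if (a != 0 && b > std::numeric_limits<std::uint64_t>::max() / a)
    {
        return false;
    }
    out = a * b;
    return true;
}

int main()
{
    std::uint64_t n = 0;
    assert(safe_mul(2, 3, n) && n == 6);                 // the 2x3 ndarray above is fine
    assert(!safe_mul(4294967297ULL, 4294967296ULL, n));  // (2^32 + 1) * 2^32 overflows
}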
SECTION("do not accept NTFZ markers in ndarray optimized type (with count)")
|
||||
{
|
||||
json _;
|
||||
@ -3666,7 +3789,7 @@ TEST_CASE("BJData roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": std::vector<uint8_t>");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse BJData file
|
||||
auto packed = utils::read_binary_file(filename + ".bjdata");
|
||||
@ -3681,7 +3804,7 @@ TEST_CASE("BJData roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": std::ifstream");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse BJData file
|
||||
std::ifstream f_bjdata(filename + ".bjdata", std::ios::binary);
|
||||
|
@ -565,10 +565,10 @@ TEST_CASE("BSON")
|
||||
SECTION("Example 1")
|
||||
{
|
||||
std::vector<std::uint8_t> input = {0x16, 0x00, 0x00, 0x00, 0x02, 'h', 'e', 'l', 'l', 'o', 0x00, 0x06, 0x00, 0x00, 0x00, 'w', 'o', 'r', 'l', 'd', 0x00, 0x00};
|
||||
json parsed = json::from_bson(input);
|
||||
json expected = {{"hello", "world"}};
|
||||
const json parsed = json::from_bson(input);
|
||||
const json expected = {{"hello", "world"}};
|
||||
CHECK(parsed == expected);
|
||||
auto dumped = json::to_bson(parsed);
|
||||
const auto dumped = json::to_bson(parsed);
|
||||
CHECK(dumped == input);
|
||||
CHECK(json::from_bson(dumped) == expected);
|
||||
}
|
||||
@ -576,10 +576,10 @@ TEST_CASE("BSON")
|
||||
SECTION("Example 2")
|
||||
{
|
||||
std::vector<std::uint8_t> input = {0x31, 0x00, 0x00, 0x00, 0x04, 'B', 'S', 'O', 'N', 0x00, 0x26, 0x00, 0x00, 0x00, 0x02, 0x30, 0x00, 0x08, 0x00, 0x00, 0x00, 'a', 'w', 'e', 's', 'o', 'm', 'e', 0x00, 0x01, 0x31, 0x00, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x14, 0x40, 0x10, 0x32, 0x00, 0xc2, 0x07, 0x00, 0x00, 0x00, 0x00};
|
||||
json parsed = json::from_bson(input);
|
||||
json expected = {{"BSON", {"awesome", 5.05, 1986}}};
|
||||
const json parsed = json::from_bson(input);
|
||||
const json expected = {{"BSON", {"awesome", 5.05, 1986}}};
|
||||
CHECK(parsed == expected);
|
||||
auto dumped = json::to_bson(parsed);
|
||||
const auto dumped = json::to_bson(parsed);
|
||||
CHECK(dumped == input);
|
||||
CHECK(json::from_bson(dumped) == expected);
|
||||
}
|
||||
@ -588,7 +588,7 @@ TEST_CASE("BSON")
|
||||
|
||||
TEST_CASE("BSON input/output_adapters")
|
||||
{
|
||||
json json_representation =
|
||||
const json json_representation =
|
||||
{
|
||||
{"double", 42.5},
|
||||
{"entry", 4.2},
|
||||
@ -596,7 +596,7 @@ TEST_CASE("BSON input/output_adapters")
|
||||
{"object", {{ "string", "value" }}}
|
||||
};
|
||||
|
||||
std::vector<std::uint8_t> const bson_representation =
|
||||
const std::vector<std::uint8_t> bson_representation =
|
||||
{
|
||||
/*size */ 0x4f, 0x00, 0x00, 0x00,
|
||||
/*entry*/ 0x01, 'd', 'o', 'u', 'b', 'l', 'e', 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x40, 0x45, 0x40,
|
||||
@ -621,7 +621,7 @@ TEST_CASE("BSON input/output_adapters")
|
||||
{
|
||||
std::basic_ostringstream<char> ss;
|
||||
json::to_bson(json_representation, ss);
|
||||
json j3 = json::from_bson(ss.str());
|
||||
const json j3 = json::from_bson(ss.str());
|
||||
CHECK(json_representation == j3);
|
||||
}
|
||||
|
||||
@ -629,7 +629,7 @@ TEST_CASE("BSON input/output_adapters")
|
||||
{
|
||||
std::string s;
|
||||
json::to_bson(json_representation, s);
|
||||
json j3 = json::from_bson(s);
|
||||
const json j3 = json::from_bson(s);
|
||||
CHECK(json_representation == j3);
|
||||
}
|
||||
|
||||
@ -637,7 +637,7 @@ TEST_CASE("BSON input/output_adapters")
|
||||
{
|
||||
std::vector<std::uint8_t> v;
|
||||
json::to_bson(json_representation, v);
|
||||
json j3 = json::from_bson(v);
|
||||
const json j3 = json::from_bson(v);
|
||||
CHECK(json_representation == j3);
|
||||
}
|
||||
}
|
||||
@ -1227,7 +1227,7 @@ TEST_CASE("BSON roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": std::vector<std::uint8_t>");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse BSON file
|
||||
auto packed = utils::read_binary_file(filename + ".bson");
|
||||
@ -1242,7 +1242,7 @@ TEST_CASE("BSON roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": std::ifstream");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse BSON file
|
||||
std::ifstream f_bson(filename + ".bson", std::ios::binary);
|
||||
@ -1257,7 +1257,7 @@ TEST_CASE("BSON roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": uint8_t* and size");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse BSON file
|
||||
auto packed = utils::read_binary_file(filename + ".bson");
|
||||
|
@ -1022,21 +1022,21 @@ TEST_CASE("CBOR")
|
||||
SECTION("0 (0 00000 0000000000)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0x00, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == 0.0);
|
||||
}
|
||||
|
||||
SECTION("-0 (1 00000 0000000000)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0x80, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == -0.0);
|
||||
}
|
||||
|
||||
SECTION("2**-24 (0 00000 0000000001)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0x00, 0x01}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == std::pow(2.0, -24.0));
|
||||
}
|
||||
}
|
||||
@ -1046,7 +1046,7 @@ TEST_CASE("CBOR")
|
||||
SECTION("infinity (0 11111 0000000000)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0x7c, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == std::numeric_limits<json::number_float_t>::infinity());
|
||||
CHECK(j.dump() == "null");
|
||||
}
|
||||
@ -1054,7 +1054,7 @@ TEST_CASE("CBOR")
|
||||
SECTION("-infinity (1 11111 0000000000)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0xfc, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == -std::numeric_limits<json::number_float_t>::infinity());
|
||||
CHECK(j.dump() == "null");
|
||||
}
|
||||
@ -1065,21 +1065,21 @@ TEST_CASE("CBOR")
|
||||
SECTION("1 (0 01111 0000000000)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0x3c, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == 1);
|
||||
}
|
||||
|
||||
SECTION("-2 (1 10000 0000000000)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0xc0, 0x00}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == -2);
|
||||
}
|
||||
|
||||
SECTION("65504 (0 11110 1111111111)")
|
||||
{
|
||||
json const j = json::from_cbor(std::vector<uint8_t>({0xf9, 0x7b, 0xff}));
|
||||
json::number_float_t d{j};
|
||||
const json::number_float_t d{j};
|
||||
CHECK(d == 65504);
|
||||
}
|
||||
}
|
||||
@ -1942,7 +1942,7 @@ TEST_CASE("CBOR regressions")
|
||||
{
|
||||
// parse CBOR file
|
||||
auto vec1 = utils::read_binary_file(filename);
|
||||
json j1 = json::from_cbor(vec1);
|
||||
const json j1 = json::from_cbor(vec1);
|
||||
|
||||
try
|
||||
{
|
||||
@ -2143,7 +2143,7 @@ TEST_CASE("CBOR roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": std::vector<uint8_t>");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse CBOR file
|
||||
const auto packed = utils::read_binary_file(filename + ".cbor");
|
||||
@ -2158,7 +2158,7 @@ TEST_CASE("CBOR roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": std::ifstream");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse CBOR file
|
||||
std::ifstream f_cbor(filename + ".cbor", std::ios::binary);
|
||||
@ -2173,7 +2173,7 @@ TEST_CASE("CBOR roundtrips" * doctest::skip())
|
||||
INFO_WITH_TEMP(filename + ": uint8_t* and size");
|
||||
// parse JSON file
|
||||
std::ifstream f_json(filename);
|
||||
json j1 = json::parse(f_json);
|
||||
const json j1 = json::parse(f_json);
|
||||
|
||||
// parse CBOR file
|
||||
const auto packed = utils::read_binary_file(filename + ".cbor");
|
||||
|
@ -206,6 +206,7 @@ class SaxCountdown : public nlohmann::json::json_sax_t
|
||||
json parser_helper(const std::string& s);
|
||||
bool accept_helper(const std::string& s);
|
||||
void comments_helper(const std::string& s);
|
||||
void trailing_comma_helper(const std::string& s);
|
||||
|
||||
json parser_helper(const std::string& s)
|
||||
{
|
||||
@ -225,6 +226,8 @@ json parser_helper(const std::string& s)
|
||||
|
||||
comments_helper(s);
|
||||
|
||||
trailing_comma_helper(s);
|
||||
|
||||
return j;
|
||||
}
|
||||
|
||||
@ -259,10 +262,11 @@ bool accept_helper(const std::string& s)
|
||||
// 6. check if this approach came to the same result
|
||||
CHECK(ok_noexcept == ok_noexcept_cb);
|
||||
|
||||
// 7. check if comments are properly ignored
|
||||
// 7. check if comments or trailing commas are properly ignored
|
||||
if (ok_accept)
|
||||
{
|
||||
comments_helper(s);
|
||||
trailing_comma_helper(s);
|
||||
}
|
||||
|
||||
// 8. return result
|
||||
@ -302,6 +306,38 @@ void comments_helper(const std::string& s)
|
||||
}
|
||||
}
|
||||
|
||||
void trailing_comma_helper(const std::string& s)
{
    json _;

    // parse/accept with default parser
    CHECK_NOTHROW(_ = json::parse(s));
    CHECK(json::accept(s));

    // parse/accept while allowing trailing commas
    CHECK_NOTHROW(_ = json::parse(s, nullptr, false, false, true));
    CHECK(json::accept(s, false, true));

    // note: [,] and {,} are not allowed
    if (s.size() > 1 && (s.back() == ']' || s.back() == '}') && !_.empty())
    {
        std::vector<std::string> json_with_trailing_commas;
        json_with_trailing_commas.push_back(s.substr(0, s.size() - 1) + " ," + s.back());
        json_with_trailing_commas.push_back(s.substr(0, s.size() - 1) + "," + s.back());
        json_with_trailing_commas.push_back(s.substr(0, s.size() - 1) + ", " + s.back());

        for (const auto& json_with_trailing_comma : json_with_trailing_commas)
        {
            CAPTURE(json_with_trailing_comma)
            CHECK_THROWS_AS(_ = json::parse(json_with_trailing_comma), json::parse_error);
            CHECK(!json::accept(json_with_trailing_comma));

            CHECK_NOTHROW(_ = json::parse(json_with_trailing_comma, nullptr, true, false, true));
            CHECK(json::accept(json_with_trailing_comma, false, true));
        }
    }
}
|
||||
|
||||
} // namespace
|
||||
|
||||
TEST_CASE("parser class")
|
||||
@@ -1366,7 +1402,7 @@ TEST_CASE("parser class")
 return event != json::parse_event_t::key;
 };

-json x = json::parse("{\"key\": false}", cb);
+const json x = json::parse("{\"key\": false}", cb);
 CHECK(x == json::object());
 }
 }
@@ -1400,14 +1436,14 @@ TEST_CASE("parser class")

 SECTION("filter nothing")
 {
-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
 {
 return true;
 });

 CHECK (j_object == json({{"foo", 2}, {"bar", {{"baz", 1}}}}));

-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
 {
 return true;
 });
@@ -1436,7 +1472,7 @@ TEST_CASE("parser class")

 SECTION("filter specific element")
 {
-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
 {
 // filter all number(2) elements
 return event != json::parse_event_t::value || j != json(2);
@@ -1444,7 +1480,7 @@ TEST_CASE("parser class")

 CHECK (j_object == json({{"bar", {{"baz", 1}}}}));

-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
 {
 return event != json::parse_event_t::value || j != json(2);
 });
@@ -1454,7 +1490,7 @@ TEST_CASE("parser class")

 SECTION("filter object in array")
 {
-json j_filtered1 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json & parsed)
+const json j_filtered1 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json & parsed)
 {
 return !(e == json::parse_event_t::object_end && parsed.contains("foo"));
 });
@@ -1463,7 +1499,7 @@ TEST_CASE("parser class")

 CHECK (j_filtered1.size() == 2);
 CHECK (j_filtered1 == json({1, {{"qux", "baz"}}}));

-json j_filtered2 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json& /*parsed*/) noexcept
+const json j_filtered2 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json& /*parsed*/) noexcept
 {
 return e != json::parse_event_t::object_end;
 });
@@ -1478,7 +1514,7 @@ TEST_CASE("parser class")
 SECTION("first closing event")
 {
 {
-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 static bool first = true;
 if (e == json::parse_event_t::object_end && first)
@@ -1495,7 +1531,7 @@ TEST_CASE("parser class")
 }

 {
-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 static bool first = true;
 if (e == json::parse_event_t::array_end && first)
@@ -1519,13 +1555,13 @@ TEST_CASE("parser class")
 // object and array is discarded only after the closing character
 // has been read

-json j_empty_object = json::parse("{}", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_empty_object = json::parse("{}", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 return e != json::parse_event_t::object_end;
 });
 CHECK(j_empty_object == json());

-json j_empty_array = json::parse("[]", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_empty_array = json::parse("[]", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 return e != json::parse_event_t::array_end;
 });
@@ -1413,7 +1413,7 @@ TEST_CASE("parser class")
 return event != json::parse_event_t::key;
 };

-json x = json::parse("{\"key\": false}", cb);
+const json x = json::parse("{\"key\": false}", cb);
 CHECK(x == json::object());
 }
 }
@@ -1447,14 +1447,14 @@ TEST_CASE("parser class")

 SECTION("filter nothing")
 {
-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
 {
 return true;
 });

 CHECK (j_object == json({{"foo", 2}, {"bar", {{"baz", 1}}}}));

-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t /*unused*/, const json& /*unused*/) noexcept
 {
 return true;
 });
@@ -1483,7 +1483,7 @@ TEST_CASE("parser class")

 SECTION("filter specific element")
 {
-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
 {
 // filter all number(2) elements
 return event != json::parse_event_t::value || j != json(2);
@@ -1491,7 +1491,7 @@ TEST_CASE("parser class")

 CHECK (j_object == json({{"bar", {{"baz", 1}}}}));

-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t event, const json & j) noexcept
 {
 return event != json::parse_event_t::value || j != json(2);
 });
@@ -1501,7 +1501,7 @@ TEST_CASE("parser class")

 SECTION("filter object in array")
 {
-json j_filtered1 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json & parsed)
+const json j_filtered1 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json & parsed)
 {
 return !(e == json::parse_event_t::object_end && parsed.contains("foo"));
 });
@@ -1510,7 +1510,7 @@ TEST_CASE("parser class")

 CHECK (j_filtered1.size() == 2);
 CHECK (j_filtered1 == json({1, {{"qux", "baz"}}}));

-json j_filtered2 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json& /*parsed*/) noexcept
+const json j_filtered2 = json::parse(structured_array, [](int /*unused*/, json::parse_event_t e, const json& /*parsed*/) noexcept
 {
 return e != json::parse_event_t::object_end;
 });
@@ -1525,7 +1525,7 @@ TEST_CASE("parser class")
 SECTION("first closing event")
 {
 {
-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 static bool first = true;
 if (e == json::parse_event_t::object_end && first)
@@ -1542,7 +1542,7 @@ TEST_CASE("parser class")
 }

 {
-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 static bool first = true;
 if (e == json::parse_event_t::array_end && first)
@@ -1566,13 +1566,13 @@ TEST_CASE("parser class")
 // object and array is discarded only after the closing character
 // has been read

-json j_empty_object = json::parse("{}", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_empty_object = json::parse("{}", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 return e != json::parse_event_t::object_end;
 });
 CHECK(j_empty_object == json());

-json j_empty_array = json::parse("[]", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
+const json j_empty_array = json::parse("[]", [](int /*unused*/, json::parse_event_t e, const json& /*unused*/) noexcept
 {
 return e != json::parse_event_t::array_end;
 });
@@ -576,7 +576,7 @@ TEST_CASE("lexicographical comparison operators")
 [1,2,[3,4,5],4,5]
 )";

-json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t /*unused*/, const json & j) noexcept
+const json j_object = json::parse(s_object, [](int /*unused*/, json::parse_event_t /*unused*/, const json & j) noexcept
 {
 // filter all number(2) elements
 return j != json(2);
@@ -584,7 +584,7 @@ TEST_CASE("lexicographical comparison operators")

 CHECK (j_object == json({{"bar", {{"baz", 1}}}}));

-json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t /*unused*/, const json & j) noexcept
+const json j_array = json::parse(s_array, [](int /*unused*/, json::parse_event_t /*unused*/, const json & j) noexcept
 {
 return j != json(2);
 });
@@ -779,7 +779,7 @@ TEST_CASE("constructors")

 SECTION("integer literal with u suffix")
 {
-json j(42u);
+const json j(42u);
 CHECK(j.type() == json::value_t::number_unsigned);
 CHECK(j == j_unsigned_reference);
 }
@@ -793,7 +793,7 @@ TEST_CASE("constructors")

 SECTION("integer literal with ul suffix")
 {
-json j(42ul);
+const json j(42ul);
 CHECK(j.type() == json::value_t::number_unsigned);
 CHECK(j == j_unsigned_reference);
 }
@@ -807,7 +807,7 @@ TEST_CASE("constructors")

 SECTION("integer literal with ull suffix")
 {
-json j(42ull);
+const json j(42ull);
 CHECK(j.type() == json::value_t::number_unsigned);
 CHECK(j == j_unsigned_reference);
 }
@@ -1362,7 +1362,7 @@ TEST_CASE("constructors")
 {
 {
 json jarray = {1, 2, 3, 4, 5};
-json j_new(jarray.begin(), jarray.begin());
+const json j_new(jarray.begin(), jarray.begin());
 CHECK(j_new == json::array());
 }
 {
@@ -17,63 +17,63 @@ TEST_CASE("other constructors and destructor")
 {
 SECTION("object")
 {
-json j {{"foo", 1}, {"bar", false}};
+const json j {{"foo", 1}, {"bar", false}};
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("array")
 {
-json j {"foo", 1, 42.23, false};
+const json j {"foo", 1, 42.23, false};
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("null")
 {
-json j(nullptr);
+const json j(nullptr);
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("boolean")
 {
-json j(true);
+const json j(true);
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("string")
 {
-json j("Hello world");
+const json j("Hello world");
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("number (integer)")
 {
-json j(42);
+const json j(42);
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("number (unsigned)")
 {
-json j(42u);
+const json j(42u);
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("number (floating-point)")
 {
-json j(42.23);
+const json j(42.23);
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }

 SECTION("binary")
 {
-json j = json::binary({1, 2, 3});
+const json j = json::binary({1, 2, 3});
 json k(j); // NOLINT(performance-unnecessary-copy-initialization)
 CHECK(j == k);
 }
@@ -92,7 +92,7 @@ TEST_CASE("other constructors and destructor")
 {
 SECTION("object")
 {
-json j {{"foo", 1}, {"bar", false}};
+const json j {{"foo", 1}, {"bar", false}};
 json k;
 k = j;
 CHECK(j == k);
@@ -100,7 +100,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("array")
 {
-json j {"foo", 1, 42.23, false};
+const json j {"foo", 1, 42.23, false};
 json k;
 k = j;
 CHECK(j == k);
@@ -108,7 +108,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("null")
 {
-json j(nullptr);
+const json j(nullptr);
 json k;
 k = j;
 CHECK(j == k);
@@ -116,7 +116,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("boolean")
 {
-json j(true);
+const json j(true);
 json k;
 k = j;
 CHECK(j == k);
@@ -124,7 +124,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("string")
 {
-json j("Hello world");
+const json j("Hello world");
 json k;
 k = j;
 CHECK(j == k);
@@ -132,7 +132,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("number (integer)")
 {
-json j(42);
+const json j(42);
 json k;
 k = j;
 CHECK(j == k);
@@ -140,7 +140,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("number (unsigned)")
 {
-json j(42u);
+const json j(42u);
 json k;
 k = j;
 CHECK(j == k);
@@ -148,7 +148,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("number (floating-point)")
 {
-json j(42.23);
+const json j(42.23);
 json k;
 k = j;
 CHECK(j == k);
@@ -156,7 +156,7 @@ TEST_CASE("other constructors and destructor")

 SECTION("binary")
 {
-json j = json::binary({1, 2, 3});
+const json j = json::binary({1, 2, 3});
 json k;
 k = j;
 CHECK(j == k);
@@ -179,9 +179,9 @@ TEST_CASE("convenience functions")

 SECTION("std::string")
 {
-std::string str1 = concat(hello_iter, world, '!');
-std::string str2 = concat(hello_data, world, '!');
-std::string str3 = concat("Hello, ", world, '!');
+const std::string str1 = concat(hello_iter, world, '!');
+const std::string str2 = concat(hello_data, world, '!');
+const std::string str3 = concat("Hello, ", world, '!');

 CHECK(str1 == expected);
 CHECK(str2 == expected);
@@ -190,14 +190,14 @@ TEST_CASE("convenience functions")

 SECTION("alt_string_iter")
 {
-alt_string_iter str = concat<alt_string_iter>(hello_iter, world, '!');
+const alt_string_iter str = concat<alt_string_iter>(hello_iter, world, '!');

 CHECK(str.impl == expected);
 }

 SECTION("alt_string_data")
 {
-alt_string_data str = concat<alt_string_data>(hello_data, world, '!');
+const alt_string_data str = concat<alt_string_data>(hello_data, world, '!');

 CHECK(str.impl == expected);
 }
@@ -1514,7 +1514,7 @@ TEST_CASE("value conversion")

 SECTION("std::map (array of pairs)")
 {
-std::map<int, int> m{{0, 1}, {1, 2}, {2, 3}};
+const std::map<int, int> m{{0, 1}, {1, 2}, {2, 3}};
 json const j6 = m;

 auto m2 = j6.get<std::map<int, int>>();
@@ -1539,7 +1539,7 @@ TEST_CASE("value conversion")

 SECTION("std::unordered_map (array of pairs)")
 {
-std::unordered_map<int, int> m{{0, 1}, {1, 2}, {2, 3}};
+const std::unordered_map<int, int> m{{0, 1}, {1, 2}, {2, 3}};
 json const j6 = m;

 auto m2 = j6.get<std::unordered_map<int, int>>();
@@ -227,7 +227,7 @@ TEST_CASE("deserialization")
 ss1 << R"(["foo",1,2,3,false,{"one":1}])";
 ss2 << R"(["foo",1,2,3,false,{"one":1}])";
 ss3 << R"(["foo",1,2,3,false,{"one":1}])";
-json j = json::parse(ss1);
+const json j = json::parse(ss1);
 CHECK(json::accept(ss2));
 CHECK(j == json({"foo", 1, 2, 3, false, {{"one", 1}}}));

@@ -246,7 +246,7 @@ TEST_CASE("deserialization")
 SECTION("string literal")
 {
 const auto* s = R"(["foo",1,2,3,false,{"one":1}])";
-json j = json::parse(s);
+const json j = json::parse(s);
 CHECK(json::accept(s));
 CHECK(j == json({"foo", 1, 2, 3, false, {{"one", 1}}}));

@@ -265,7 +265,7 @@ TEST_CASE("deserialization")
 SECTION("string_t")
 {
 json::string_t const s = R"(["foo",1,2,3,false,{"one":1}])";
-json j = json::parse(s);
+const json j = json::parse(s);
 CHECK(json::accept(s));
 CHECK(j == json({"foo", 1, 2, 3, false, {{"one", 1}}}));

@@ -1134,9 +1134,10 @@ TEST_CASE("deserialization")
 }
 }

-// select the types to test - char8_t is only available in C++20
+// select the types to test - char8_t is only available since C++20 if and only
+// if __cpp_char8_t is defined.
 #define TYPE_LIST(...) __VA_ARGS__
 #ifdef JSON_HAS_CPP_20
+#if defined(__cpp_char8_t) && (__cpp_char8_t >= 201811L)
 #define ASCII_TYPES TYPE_LIST(char, wchar_t, char16_t, char32_t, char8_t)
 #else
 #define ASCII_TYPES TYPE_LIST(char, wchar_t, char16_t, char32_t)
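The `__cpp_char8_t` guard added above matters because selecting C++20 does not guarantee that `char8_t` is usable (compilers can be run with that feature disabled). A small hedged sketch of the same feature-test pattern; `test_char_type` is a hypothetical alias for illustration, not part of the library or its tests.

```cpp
// Feature-test sketch: only use char8_t when the compiler actually reports it.
#if defined(__cpp_char8_t) && (__cpp_char8_t >= 201811L)
using test_char_type = char8_t;   // C++20 with char8_t available
#else
using test_char_type = char;      // fallback when char8_t is not provided
#endif

int main()
{
    return sizeof(test_char_type) == 1 ? 0 : 1;   // both branches are single-byte types
}
```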
@@ -16,7 +16,7 @@ TEST_CASE("element access 1")
 SECTION("array")
 {
 json j = {1, 1u, true, nullptr, "string", 42.23, json::object(), {1, 2, 3}};
-const json j_const = j;
+const json j_const = j; // NOLINT(performance-unnecessary-copy-initialization)

 SECTION("access specified element with bounds checking")
 {
@@ -289,13 +289,13 @@ TEST_CASE("element access 1")
 {
 {
 json jarray = {1, 1u, true, nullptr, "string", 42.23, json::object(), {1, 2, 3}};
-json::iterator it2 = jarray.erase(jarray.begin(), jarray.end());
+const json::iterator it2 = jarray.erase(jarray.begin(), jarray.end());
 CHECK(jarray == json::array());
 CHECK(it2 == jarray.end());
 }
 {
 json jarray = {1, 1u, true, nullptr, "string", 42.23, json::object(), {1, 2, 3}};
-json::const_iterator it2 = jarray.erase(jarray.cbegin(), jarray.cend());
+const json::const_iterator it2 = jarray.erase(jarray.cbegin(), jarray.cend());
 CHECK(jarray == json::array());
 CHECK(it2 == jarray.cend());
 }
@@ -537,13 +537,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = "foo";
-json::iterator it = j.erase(j.begin());
+const json::iterator it = j.erase(j.begin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = "bar";
-json::const_iterator it = j.erase(j.cbegin());
+const json::const_iterator it = j.erase(j.cbegin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -553,13 +553,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = false;
-json::iterator it = j.erase(j.begin());
+const json::iterator it = j.erase(j.begin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = true;
-json::const_iterator it = j.erase(j.cbegin());
+const json::const_iterator it = j.erase(j.cbegin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -569,13 +569,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = 17;
-json::iterator it = j.erase(j.begin());
+const json::iterator it = j.erase(j.begin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = 17;
-json::const_iterator it = j.erase(j.cbegin());
+const json::const_iterator it = j.erase(j.cbegin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -585,13 +585,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = 17u;
-json::iterator it = j.erase(j.begin());
+const json::iterator it = j.erase(j.begin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = 17u;
-json::const_iterator it = j.erase(j.cbegin());
+const json::const_iterator it = j.erase(j.cbegin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -601,13 +601,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = 23.42;
-json::iterator it = j.erase(j.begin());
+const json::iterator it = j.erase(j.begin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = 23.42;
-json::const_iterator it = j.erase(j.cbegin());
+const json::const_iterator it = j.erase(j.cbegin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -617,13 +617,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = json::binary({1, 2, 3});
-json::iterator it = j.erase(j.begin());
+const json::iterator it = j.erase(j.begin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = json::binary({1, 2, 3});
-json::const_iterator it = j.erase(j.cbegin());
+const json::const_iterator it = j.erase(j.cbegin());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -711,13 +711,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = "foo";
-json::iterator it = j.erase(j.begin(), j.end());
+const json::iterator it = j.erase(j.begin(), j.end());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = "bar";
-json::const_iterator it = j.erase(j.cbegin(), j.cend());
+const json::const_iterator it = j.erase(j.cbegin(), j.cend());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -727,13 +727,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = false;
-json::iterator it = j.erase(j.begin(), j.end());
+const json::iterator it = j.erase(j.begin(), j.end());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = true;
-json::const_iterator it = j.erase(j.cbegin(), j.cend());
+const json::const_iterator it = j.erase(j.cbegin(), j.cend());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -743,13 +743,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = 17;
-json::iterator it = j.erase(j.begin(), j.end());
+const json::iterator it = j.erase(j.begin(), j.end());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = 17;
-json::const_iterator it = j.erase(j.cbegin(), j.cend());
+const json::const_iterator it = j.erase(j.cbegin(), j.cend());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -759,13 +759,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = 17u;
-json::iterator it = j.erase(j.begin(), j.end());
+const json::iterator it = j.erase(j.begin(), j.end());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = 17u;
-json::const_iterator it = j.erase(j.cbegin(), j.cend());
+const json::const_iterator it = j.erase(j.cbegin(), j.cend());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -775,13 +775,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = 23.42;
-json::iterator it = j.erase(j.begin(), j.end());
+const json::iterator it = j.erase(j.begin(), j.end());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = 23.42;
-json::const_iterator it = j.erase(j.cbegin(), j.cend());
+const json::const_iterator it = j.erase(j.cbegin(), j.cend());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -791,13 +791,13 @@ TEST_CASE("element access 1")
 {
 {
 json j = json::binary({1, 2, 3});
-json::iterator it = j.erase(j.begin(), j.end());
+const json::iterator it = j.erase(j.begin(), j.end());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
 {
 json j = json::binary({1, 2, 3});
-json::const_iterator it = j.erase(j.cbegin(), j.cend());
+const json::const_iterator it = j.erase(j.cbegin(), j.cend());
 CHECK(j.type() == json::value_t::null);
 CHECK(it == j.end());
 }
@@ -847,13 +847,13 @@ TEST_CASE_TEMPLATE("element access 2", Json, nlohmann::json, nlohmann::ordered_json)
 {
 {
 Json jobject = {{"a", "a"}, {"b", 1}, {"c", 17u}};
-typename Json::iterator it2 = jobject.erase(jobject.begin(), jobject.end());
+const typename Json::iterator it2 = jobject.erase(jobject.begin(), jobject.end());
 CHECK(jobject == Json::object());
 CHECK(it2 == jobject.end());
 }
 {
 Json jobject = {{"a", "a"}, {"b", 1}, {"c", 17u}};
-typename Json::const_iterator it2 = jobject.erase(jobject.cbegin(), jobject.cend());
+const typename Json::const_iterator it2 = jobject.erase(jobject.cbegin(), jobject.cend());
 CHECK(jobject == Json::object());
 CHECK(it2 == jobject.cend());
 }
@@ -278,8 +278,8 @@ TEST_CASE("object inspection")
 std::ifstream f_escaped(TEST_DATA_DIRECTORY "/json_nlohmann_tests/all_unicode_ascii.json");
 std::ifstream f_unescaped(TEST_DATA_DIRECTORY "/json_nlohmann_tests/all_unicode.json");

-json j1 = json::parse(f_escaped);
-json j2 = json::parse(f_unescaped);
+const json j1 = json::parse(f_escaped);
+const json j2 = json::parse(f_unescaped);
 CHECK(j1 == j2);
 }

@@ -289,10 +289,10 @@ TEST_CASE("object inspection")
 std::ifstream f_unescaped(TEST_DATA_DIRECTORY "/json_nlohmann_tests/all_unicode.json");

 json const value = json::parse(f_unescaped);
-std::string text = value.dump(4, ' ', true);
+const std::string text = value.dump(4, ' ', true);

-std::string expected((std::istreambuf_iterator<char>(f_escaped)),
-                     std::istreambuf_iterator<char>());
+const std::string expected((std::istreambuf_iterator<char>(f_escaped)),
+                           std::istreambuf_iterator<char>());
 CHECK(text == expected);
 }
 }
@@ -333,7 +333,7 @@ TEST_CASE("object inspection")
 })
 {
 json const j1 = json::parse(s);
-std::string s1 = j1.dump();
+const std::string s1 = j1.dump();
 json const j2 = json::parse(s1);
 std::string s2 = j2.dump();
 CHECK(s1 == s2);
@@ -396,63 +396,63 @@ TEST_CASE("object inspection")
 SECTION("null")
 {
 json const j = nullptr;
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("object")
 {
 json const j = {{"foo", "bar"}};
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("array")
 {
 json const j = {1, 2, 3, 4};
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("boolean")
 {
 json const j = true;
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("string")
 {
 json const j = "Hello world";
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("number (integer)")
 {
 json const j = 23;
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("number (unsigned)")
 {
 json const j = 23u;
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("number (floating-point)")
 {
 json const j = 42.23;
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }

 SECTION("binary")
 {
 json const j = json::binary({});
-json::value_t t = j;
+const json::value_t t = j;
 CHECK(t == j.type());
 }
 }
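The dump calls touched above use the three-argument form `dump(indent, indent_char, ensure_ascii)`. A small usage sketch of the escaping behaviour the all_unicode tests compare against (the output comments assume UTF-8 source and terminal encoding):

```cpp
// dump() sketch: the third argument escapes all non-ASCII characters as \uXXXX.
#include <nlohmann/json.hpp>
#include <iostream>

int main()
{
    using nlohmann::json;

    const json j = "Grüße";
    std::cout << j.dump() << '\n';               // "Grüße"           (raw UTF-8)
    std::cout << j.dump(-1, ' ', true) << '\n';  // "Gr\u00fc\u00dfe" (ASCII only)
    return 0;
}
```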
@@ -18,10 +18,10 @@ TEST_CASE("iterators 1")
 {
 SECTION("uninitialized")
 {
-json::iterator it;
+json::iterator it; // NOLINT(misc-const-correctness)
 CHECK(it.m_object == nullptr);

-json::const_iterator cit;
+json::const_iterator cit; // NOLINT(misc-const-correctness)
 CHECK(cit.m_object == nullptr);
 }

@@ -1498,46 +1498,46 @@ TEST_CASE("iterators 1")

 SECTION("json + begin/end")
 {
-json::iterator it = j.begin();
+const json::iterator it = j.begin();
 CHECK(it == j.end());
 }

 SECTION("const json + begin/end")
 {
-json::const_iterator it_begin = j_const.begin();
+const json::const_iterator it_begin = j_const.begin();
 json::const_iterator it_end = j_const.end();
 CHECK(it_begin == it_end);
 }

 SECTION("json + cbegin/cend")
 {
-json::const_iterator it_begin = j.cbegin();
+const json::const_iterator it_begin = j.cbegin();
 json::const_iterator it_end = j.cend();
 CHECK(it_begin == it_end);
 }

 SECTION("const json + cbegin/cend")
 {
-json::const_iterator it_begin = j_const.cbegin();
+const json::const_iterator it_begin = j_const.cbegin();
 json::const_iterator it_end = j_const.cend();
 CHECK(it_begin == it_end);
 }

 SECTION("json + rbegin/rend")
 {
-json::reverse_iterator it = j.rbegin();
+const json::reverse_iterator it = j.rbegin();
 CHECK(it == j.rend());
 }

 SECTION("json + crbegin/crend")
 {
-json::const_reverse_iterator it = j.crbegin();
+const json::const_reverse_iterator it = j.crbegin();
 CHECK(it == j.crend());
 }

 SECTION("const json + crbegin/crend")
 {
-json::const_reverse_iterator it = j_const.crbegin();
+const json::const_reverse_iterator it = j_const.crbegin();
 CHECK(it == j_const.crend());
 }

@@ -955,7 +955,8 @@ TEST_CASE("iterators 2")
 };
 json j_expected{"a_key", "b_key", "c_key"};

-auto transformed = j.items() | std::views::transform([](const auto & item)
+// NOLINTNEXTLINE(fuchsia-trailing-return)
+auto transformed = j.items() | std::views::transform([](const auto & item) -> std::string_view
 {
 return item.key();
 });
@@ -24,9 +24,9 @@ TEST_CASE("JSON patch")
 SECTION("4. Operations")
 {
 // the ordering of members in JSON objects is not significant:
-json op1 = R"({ "op": "add", "path": "/a/b/c", "value": "foo" })"_json;
-json op2 = R"({ "path": "/a/b/c", "op": "add", "value": "foo" })"_json;
-json op3 = R"({ "value": "foo", "path": "/a/b/c", "op": "add" })"_json;
+const json op1 = R"({ "op": "add", "path": "/a/b/c", "value": "foo" })"_json;
+const json op2 = R"({ "path": "/a/b/c", "op": "add", "value": "foo" })"_json;
+const json op3 = R"({ "value": "foo", "path": "/a/b/c", "op": "add" })"_json;

 // check if the operation objects are equivalent
 CHECK(op1 == op2);
@@ -642,12 +642,12 @@ TEST_CASE("JSON patch")
 )"_json;

 // apply the patch
-json target = source.patch(p1);
+const json target = source.patch(p1);
 // target = { "D": "Berlin", "F": "Paris", "GB": "London" }
 CHECK(target == R"({ "D": "Berlin", "F": "Paris", "GB": "London" })"_json);

 // create a diff from two JSONs
-json p2 = json::diff(target, source); // NOLINT(readability-suspicious-call-argument)
+const json p2 = json::diff(target, source); // NOLINT(readability-suspicious-call-argument)
 // p2 = [{"op": "delete", "path": "/GB"}]
 CHECK(p2 == R"([{"op":"remove","path":"/GB"}])"_json);
 }
@@ -666,7 +666,7 @@ TEST_CASE("JSON patch")
 j["/2/en"_json_pointer] = "ugly";
 CHECK(j == R"(["good","bad",{"en":"ugly","it":"cattivo"}])"_json);

-json flat = j.flatten();
+const json flat = j.flatten();
 CHECK(flat == R"({"/0":"good","/1":"bad","/2/en":"ugly","/2/it":"cattivo"})"_json);
 }
 }
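As the patch/diff hunks above show, `json::diff` produces an RFC 6902 patch that `patch()` can apply. A compact round-trip sketch of that relationship:

```cpp
// diff/patch round trip: applying json::diff(a, b) to a reproduces b.
#include <nlohmann/json.hpp>

int main()
{
    using nlohmann::json;

    const json a = R"({ "D": "Berlin", "F": "Paris" })"_json;
    const json b = R"({ "D": "Berlin", "F": "Paris", "GB": "London" })"_json;

    const json p = json::diff(a, b);   // e.g. [{"op":"add","path":"/GB","value":"London"}]
    return a.patch(p) == b ? 0 : 1;    // the round trip holds
}
```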
@@ -139,7 +139,12 @@ TEST_CASE("locale-dependent test (LC_NUMERIC=de_DE)")
 {
 std::array<char, 6> buffer = {};
 CHECK(std::snprintf(buffer.data(), buffer.size(), "%.2f", 12.34) == 5); // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
-CHECK(std::string(buffer.data()) == "12,34");
+const auto snprintf_result = std::string(buffer.data());
+if (snprintf_result != "12,34")
+{
+CAPTURE(snprintf_result)
+MESSAGE("To test if number parsing is locale-independent, we set the locale to de_DE. However, on this system, the decimal separator doesn't change to `,` potentially due to a known musl issue (https://github.com/nlohmann/json/issues/4767).");
+}
 }

 SECTION("parsing")
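The relaxed check above exists because the decimal separator printed by `snprintf` depends on whether the C library honours the requested `LC_NUMERIC` locale at all. A standalone sketch of that behaviour; it assumes the de_DE locale is installed, and on musl-based systems the separator may simply stay `.`:

```cpp
// Locale sketch: "%.2f" of 12.34 prints "12,34" only if the de_DE numeric
// locale is actually applied; otherwise the "C" locale keeps the '.' separator.
#include <clocale>
#include <cstdio>

int main()
{
    if (std::setlocale(LC_NUMERIC, "de_DE") == nullptr)
    {
        return 0;   // locale not installed: nothing to demonstrate
    }
    char buffer[8] = {};
    std::snprintf(buffer, sizeof(buffer), "%.2f", 12.34);
    std::puts(buffer);   // "12,34" when honoured, "12.34" otherwise (e.g. musl)
    return 0;
}
```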
@@ -1606,7 +1606,7 @@ TEST_CASE("single MessagePack roundtrip")

 // parse JSON file
 std::ifstream f_json(filename);
-json j1 = json::parse(f_json);
+const json j1 = json::parse(f_json);

 // parse MessagePack file
 auto packed = utils::read_binary_file(filename + ".msgpack");
@@ -1817,7 +1817,7 @@ TEST_CASE("MessagePack roundtrips" * doctest::skip())
 INFO_WITH_TEMP(filename + ": std::vector<uint8_t>");
 // parse JSON file
 std::ifstream f_json(filename);
-json j1 = json::parse(f_json);
+const json j1 = json::parse(f_json);

 // parse MessagePack file
 auto packed = utils::read_binary_file(filename + ".msgpack");
@@ -1832,7 +1832,7 @@ TEST_CASE("MessagePack roundtrips" * doctest::skip())
 INFO_WITH_TEMP(filename + ": std::ifstream");
 // parse JSON file
 std::ifstream f_json(filename);
-json j1 = json::parse(f_json);
+const json j1 = json::parse(f_json);

 // parse MessagePack file
 std::ifstream f_msgpack(filename + ".msgpack", std::ios::binary);
@@ -1847,7 +1847,7 @@ TEST_CASE("MessagePack roundtrips" * doctest::skip())
 INFO_WITH_TEMP(filename + ": uint8_t* and size");
 // parse JSON file
 std::ifstream f_json(filename);
-json j1 = json::parse(f_json);
+const json j1 = json::parse(f_json);

 // parse MessagePack file
 auto packed = utils::read_binary_file(filename + ".msgpack");
@@ -1880,3 +1880,82 @@ TEST_CASE("MessagePack roundtrips" * doctest::skip())
 }
 }
 }
+
+#ifdef JSON_HAS_CPP_17
+// Test suite for verifying MessagePack handling with std::byte input
+TEST_CASE("MessagePack with std::byte")
+{
+
+    SECTION("std::byte compatibility")
+    {
+        SECTION("vector roundtrip")
+        {
+            json original =
+            {
+                {"name", "test"},
+                {"value", 42},
+                {"array", {1, 2, 3}}
+            };
+
+            std::vector<uint8_t> temp = json::to_msgpack(original);
+            // Convert the uint8_t vector to std::byte vector
+            std::vector<std::byte> msgpack_data(temp.size());
+            for (size_t i = 0; i < temp.size(); ++i)
+            {
+                msgpack_data[i] = std::byte(temp[i]);
+            }
+            // Deserialize from std::byte vector back to JSON
+            json from_bytes;
+            CHECK_NOTHROW(from_bytes = json::from_msgpack(msgpack_data));
+
+            CHECK(from_bytes == original);
+        }
+
+        SECTION("empty vector")
+        {
+            const std::vector<std::byte> empty_data;
+            CHECK_THROWS_WITH_AS([&]()
+            {
+                [[maybe_unused]] auto result = json::from_msgpack(empty_data);
+                return true;
+            }
+            (),
+            "[json.exception.parse_error.110] parse error at byte 1: syntax error while parsing MessagePack value: unexpected end of input",
+            json::parse_error&);
+        }

+        SECTION("comparison with workaround")
+        {
+            json original =
+            {
+                {"string", "hello"},
+                {"integer", 42},
+                {"float", 3.14},
+                {"boolean", true},
+                {"null", nullptr},
+                {"array", {1, 2, 3}},
+                {"object", {{"key", "value"}}}
+            };
+
+            std::vector<uint8_t> temp = json::to_msgpack(original);
+
+            std::vector<std::byte> msgpack_data(temp.size());
+            for (size_t i = 0; i < temp.size(); ++i)
+            {
+                msgpack_data[i] = std::byte(temp[i]);
+            }
+            // Attempt direct deserialization using std::byte input
+            const json direct_result = json::from_msgpack(msgpack_data);
+
+            // Test the workaround approach: reinterpret as unsigned char* and use iterator range
+            const auto* const char_start = reinterpret_cast<unsigned char const*>(msgpack_data.data());
+            const auto* const char_end = char_start + msgpack_data.size();
+            json workaround_result = json::from_msgpack(char_start, char_end);
+
+            // Verify that the final deserialized JSON matches the original JSON
+            CHECK(direct_result == workaround_result);
+            CHECK(direct_result == original);
+        }
+    }
+}
+#endif
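The new std::byte test converts the `to_msgpack` output element by element. Below is an equivalent, slightly more compact conversion using `std::transform`; it is an illustrative alternative, assuming the same C++17 `std::byte` container overload exercised by the test above.

```cpp
// std::byte sketch: serialize to MessagePack, copy the bytes into a
// std::vector<std::byte>, and feed that container directly to from_msgpack.
#include <nlohmann/json.hpp>
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

int main()
{
    using nlohmann::json;

    const json original = {{"name", "test"}, {"value", 42}};
    const std::vector<std::uint8_t> temp = json::to_msgpack(original);

    std::vector<std::byte> msgpack_data(temp.size());
    std::transform(temp.begin(), temp.end(), msgpack_data.begin(),
                   [](std::uint8_t b) { return std::byte{b}; });

    return json::from_msgpack(msgpack_data) == original ? 0 : 1;
}
```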
@@ -36,7 +36,7 @@ TEST_CASE("ordered_map")
 {
 std::map<std::string, std::string> m {{"eins", "one"}, {"zwei", "two"}, {"drei", "three"}};
 ordered_map<std::string, std::string> om(m.begin(), m.end());
-const auto com = om;
+const auto com = om; // NOLINT(performance-unnecessary-copy-initialization)

 SECTION("with Key&&")
 {
@@ -69,7 +69,7 @@ TEST_CASE("ordered_map")
 {
 std::map<std::string, std::string> m {{"eins", "one"}, {"zwei", "two"}, {"drei", "three"}};
 ordered_map<std::string, std::string> om(m.begin(), m.end());
-const auto com = om;
+const auto com = om; // NOLINT(performance-unnecessary-copy-initialization)

 SECTION("with Key&&")
 {
Some files were not shown because too many files have changed in this diff.