Hacker News | nlohmann's comments

JSON for Modern C++ (nlohmann/json) is also accepting comments if you opt in, see https://json.nlohmann.me/features/comments/


I've used this for some personal projects (not a C++ dev at work) - it has a really nice interface for interacting with JSON data. I just wish C++ had a better package system than copy-pasting the header file from GitHub.


It's certainly not perfect (I'm currently reworking that section), but still: https://json.nlohmann.me/integration/package_managers/


Is there an example where nlohmann/json is not compliant? Would like to know (and fix) this.


Have you ever played with SQLite virtual tables (https://sqlite.org/vtab.html)? They could let you provide an SQLite interface while keeping the same structure on disk. Implementing the interface can be a bit tedious, but it avoids the conversion in the first place.


Good point. CommonCrawl actually provides Parquet files for their archives, too.

And there's this virtual-table extension for Parquet: https://github.com/cldellow/sqlite-parquet-vtable

But for my use case virtual tables would be too complicated.


DuckDB would probably be a way better option and works amazingly well on top of parquet (https://duckdb.org/docs/data/parquet)


Then again, do you need virtual tables? The .warc structure won't change, so the tables won't change either. You could instead define SQL views for common queries.


I would be happy to have a different way to decide if a member variable exists or not without touching anything else.


Depending on the context of that question, concepts and/or plain SFINAE could help with that.

I'm not shitting on the library in general, I'm saying the second paragraph disproves the title.


The library aims to make anything related to JSON straightforward to implement. For some applications, this is a good compromise. The comment in the README is simply the answer to the FAQ "How fast is it?"


> This is a good compromise

You're making the false assumption that there is a compromise to be made - but there isn't. A modern JSON library should be the fastest (or about the same speed as the fastest), and vice versa.


As I tried to describe earlier: "modern C++" is not necessarily "using the latest standard", but rather "C++ since it was updated with C++11 (and later)".


Aiming at C++11 is not a reasonable definition of "modern": it was C++'s second ISO standard, published a decade ago, and it has since seen two or three major updates (depending on one's opinion of C++14).


It's not about "aiming at C++11", but rather about writing code that does not look odd in a code base using constructs from C++11, C++14, C++17, etc. The library uses C++11 to implement an API that should not feel weird when used together with other containers from the STL.


C++11 was a huge shift in how C++ is written, and the term coined for "code written using the new techniques" was "Modern C++". Whether you think that term should instead mean "the latest C++ version" is a different matter altogether.


In C++ land, "modern" has become synonymous with post-11: effectively a domain-specific definition, which is reasonable considering the difference between pre- and post-11 code. Pre- and post-20 will probably be treated similarly in a decade.


There is also a SAX parser.

But to be honest, I have not yet played around with PMR.


I know that when I was benchmarking JSON Link, I saw a 30-50% increase in performance at the time from using better allocators, and PMR can help a lot there with things like bump allocators.


The development started in 2013, when C++11 was still modern. Since then, the term "modern C++" has, to my understanding, become a synonym for "C++11 and later". Of course, some code may look dated compared to newer C++ constructs, but the main goal is to integrate JSON into any C++ code base without making it look odd.

The string_view lookup is nearly done, but I did not want to wait for it, because it would have delayed the release even more.

I'm also working on supporting unordered_map - using it as the container for objects would be easy if we just broke the existing API; the hard part is supporting it with the current (probably badly designed) template API.


Great to hear that you had good experiences with user-defined allocators. It would be helpful if you could provide a pointer to an example, because we always fall short in testing allocator usage. So if you had a small example, you could really improve the status quo :)


Thanks a lot to you for writing such an awesome library! :)

This is briefly how I use C++ Allocators with the ESP32 and Nlohmann/JSON (GCC8, C++17 mode):

I have a series of "extmem" headers which define aliases for STL containers which use my allocator (ext::allocator). The allocator is a simple allocator that just uses IDF's heap_caps_malloc() to allocate memory on the SPIRAM of the ESP-WROVER SoC.

I then define in <extmem/json.hpp>:

    namespace ext {
        using json = nlohmann::basic_json<std::map, std::vector, ext::string, bool, long long, unsigned long long, double, ext::allocator, nlohmann::adl_serializer>;
    }
where `ext::string` is just `std::basic_string<char, std::char_traits<char>, ext::allocator<char>>`. In order to be able to define generic from/to_json functions in an ergonomic way, I had to reexport the following internal macros in a separate header:

    #define JSON_TEMPLATE_PARAMS \
        template<typename, typename, typename...> class ObjectType,   \
        template<typename, typename...> class ArrayType,              \
        class StringType, class BooleanType, class NumberIntegerType, \
        class NumberUnsignedType, class NumberFloatType,              \
        template<typename> class AllocatorType,                       \
        template<typename, typename = void> class JSONSerializer

    #define JSON_TEMPLATE template<JSON_TEMPLATE_PARAMS>

    #define GENERIC_JSON                                            \
        nlohmann::basic_json<ObjectType, ArrayType, StringType, BooleanType,             \
        NumberIntegerType, NumberUnsignedType, NumberFloatType,                \
        AllocatorType, JSONSerializer>
I am now able to just write stuff like the following:

    JSON_TEMPLATE
    inline void from_json(const GENERIC_JSON &j, my_type &t) {
        // ... 
    }

    JSON_TEMPLATE
    inline void to_json(GENERIC_JSON &j, const my_type &t) {
        // ...
    }
And it works fine with both nlohmann::json and ext::json.

In the rest of the code, everything stays the same; I simply use ext::json (and catch const ext::json::exception&) as if it were the default version, and it works great. FYI, I'm currently using nlohmann/json v3.9.1.


Can you elaborate on your minuses? I don't understand what you mean by "at once" in that context.

