
Friday, May 26, 2017

Node js C++ Addon Overload Resolution - v1.2.0

Major feature: namespace and class wrapping. There is no longer any need to call SetMethod/SetPrototypeMethod or to track targets manually; it is all done automatically.


Since it's inefficient to write so much code while making sure SetMethod and SetPrototypeMethod are called on each overload, and to manually track the namespaces used to store the callbacks, an experimental namespace_wrap and class_wrap were introduced. These two classes wrap the needed calls as well as track the targets (v8::Object).
The wrappers do add overhead to function execution time, but it didn't even register when I did a performance analysis.
To use the wrappers you'll need to initialize an overload_resolution instance as usual, but also start tracking a root target:
void init(v8::Handle<v8::Object> target) {
    auto overload = std::make_shared<overload_resolution>();
    auto base_overload = overload->register_module(target);
    ...
}
NODE_MODULE(tester, init);
Once you have the base_overload you can start attaching methods:
base_overload->add_overload("standalone_function_construct", {}, standalone_function_construct);
Nest methods:
auto nested = base_overload->add_namespace("namespace_construct");
nested->add_overload("nc_standalone_function_construct", {}, nc_standalone_function_construct_nc);
Add classes:
auto class_def = base_overload->add_class("class_constructs");
class_def->add_overload_constructor({}, New);
class_def->add_static_overload("test_static", {}, test_static);
class_def->add_overload("test_member", {}, test_member);

constructor.Reset(class_def->done<class_constructs>());
Note: you must call done<T>() when you're done defining the class methods, otherwise the class won't be registered with either the overload_resolution instance or V8.
For now classes can be nested inside namespaces but not the other way around, though the reverse should be theoretically possible.
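
For illustration, nesting a class inside a namespace might look roughly like this. The names here are made up, and it assumes the nested namespace wrapper exposes the same add_class call as base_overload:
auto some_namespace = base_overload->add_namespace("some_namespace");
auto inner_class = some_namespace->add_class("inner_class");
inner_class->add_overload_constructor({}, New);
inner_class->add_overload("inner_member", {}, inner_member);
inner_constructor.Reset(inner_class->done<inner_class_impl>());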

Minor build and bug fixes:
- Integration with the node-addon-tracer npm package, so the build is simpler.
- Expose node-overload-resolution as an npm module, so require("node-overload-resolution") works in parent module gyp files.
- Remove support for Node 0.11.
- Fix Travis CI builds.


Tuesday, December 13, 2016

Node js C++ Addon Overload Resolution - Refactor & Add C++ Type Converter

Remember node-overload-resolution?

I've been working on it for a while to figure out how I can support automatic async calls.

I've always believed that if you can work a little harder now to save a lot of time later, it's most likely going to pay off. One of the biggest motivations for writing the node-overload-resolution project was to easily convert C++ libraries to Node.js without changing the API too much.

One of the major roadblocks to this goal in the last version of node-overload-resolution is that v8 objects are not accessible from any thread other than the Node.js main thread, since Node doesn't release the locker on the v8 VM. To solve this issue I devised and implemented a way to parse and store the v8 objects as their projected C++ objects, in essence copying v8::String to std::string, v8::Number to double, and so on.

Some issues I've encountered with this approach are how to store v8 Objects, C++ class wrappers and Arrays, but if I can tell the resolution module which C++ type each v8 type translates to, it could work.

I've been working on this premise and implemented a converter and value holder on top of the working overload resolution module; currently it supports Number, Function, String, Buffer, a typed struct and ObjectWrap.

Currently there are issues with Function: it's implemented as or::Callback, but the actual function is not called yet.

Another issue is that I couldn't find a translation for Promise, Proxy and RegExp.

I think this module is on the correct path at the moment. It should be relatively easy to support async function calls by checking whether the matched function has an additional function callback parameter, caching the parameter values as C++ types, pushing the function call to the libuv threadpool, and executing the return value conversion and callback activation when the thread finishes execution.
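
As a rough sketch of that flow (this is not the module's actual code; the baton fields and the call into the native overload are placeholders), the libuv side could look something like this:
#include <uv.h>

// work item prepared on the main thread: arguments already copied to C++ types
struct async_call_baton {
    uv_work_t request;
    // cached C++ arguments and a reference to the matched native overload go here
    double result = 0;
};

// runs on a libuv threadpool thread - no v8 access is allowed here
void execute_work(uv_work_t* req) {
    auto baton = static_cast<async_call_baton*>(req->data);
    // call the native overload with the cached C++ arguments
    baton->result = 42.0;
}

// runs back on the Node.js main thread - safe to convert the result to v8 and call the JS callback
void after_work(uv_work_t* req, int status) {
    auto baton = static_cast<async_call_baton*>(req->data);
    // convert baton->result to a v8::Value and invoke the stored JS callback here
    delete baton;
}

void queue_async_call(async_call_baton* baton) {
    baton->request.data = baton;
    uv_queue_work(uv_default_loop(), &baton->request, execute_work, after_work);
}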

You can find the code on the native_types branch.

The code is concentrated in three main components: the value_holder, the value_converter and the generic_value_holder.

The value_holder is a derived template that stores the Value as the template type.

template<typename T>
class value_holder : public value_holder_base {
public:
    T Value;
    value_holder() : value_holder_base() {}
    value_holder(T val) : value_holder_base(), Value(val) {}
    virtual ~value_holder() {}
};



The value_converter is a derived template with a template specialization for each major type handled. The specializations cover both primitives (including v8 basic types) and classes derived from IStructuredObject and ObjectWrap, allowing more specific behavior for structures and C++ classes, such as parsing/creating new v8 objects as well as ObjectWrap::Wrap and ObjectWrap::Unwrap.

For example, this template specialization is for all derived classes of ObjectWrap; it wraps/unwraps to/from v8::Object:
template<typename T>
class value_converter<T*, typename std::enable_if<std::is_base_of<ObjectWrap, T>::value>::type> : public value_converter_base {
public:

    virtual T* convert(v8::Local<v8::Value> from) {
        return or::ObjectWrap::Unwrap<T>(from.As<v8::Object>());
    }


    virtual v8::Local<v8::Value> convert(T* from) {
        return from->Wrap();
    }

    virtual v8::Local<v8::Value> convert(std::shared_ptr<value_holder_base> from) {
        auto from_value = std::dynamic_pointer_cast<value_holder<T*>>(from);
        return from_value->Value->Wrap();
    }

    virtual std::shared_ptr<value_holder_base> read(v8::Local<v8::Value> val) {
        auto parsed_value = std::make_shared<value_holder<T*>>();
        parsed_value->Value = convert(val);
        return parsed_value;
    }

};
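
For comparison, a primitive specialization could look roughly like the following sketch for double. It mirrors the ObjectWrap specialization above but copies the number by value; this is not the exact code in the branch, and it assumes Nan is available and that the primary template's second (SFINAE) parameter is defaulted:
template<>
class value_converter<double> : public value_converter_base {
public:

    virtual double convert(v8::Local<v8::Value> from) {
        // copy the numeric value out of v8 into a plain double
        return Nan::To<double>(from).FromJust();
    }

    virtual v8::Local<v8::Value> convert(double from) {
        // create a new v8 number from the C++ value
        return Nan::New<v8::Number>(from);
    }

    virtual v8::Local<v8::Value> convert(std::shared_ptr<value_holder_base> from) {
        auto from_value = std::dynamic_pointer_cast<value_holder<double>>(from);
        return Nan::New<v8::Number>(from_value->Value);
    }

    virtual std::shared_ptr<value_holder_base> read(v8::Local<v8::Value> val) {
        auto parsed_value = std::make_shared<value_holder<double>>();
        parsed_value->Value = convert(val);
        return parsed_value;
    }

};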

Lastly, the generic_value_holder stores a pair of a value_holder and a value_converter, and by that it can act as a sort of type-erased variant that can return a v8 object from the stored C++ value.

class generic_value_holder {
private:
    std::shared_ptr< or ::value_converter_base> _prefetcher;
    std::shared_ptr< or ::value_holder_base> _value;
public:
    void Set(std::shared_ptr< or ::value_converter_base> value_converter, std::shared_ptr< or ::value_holder_base> value) {
        _prefetcher = value_converter;
        _value = value;
    }

    template<typename T>
    void Set(T returnValue) {
        //store value_converter type
        auto returnPrefetcher = std::make_shared < or ::value_converter<T>>();

        //store value inside a valueholder
        auto valueHolder = std::make_shared < or ::value_holder<T>>();
        valueHolder->Value = returnValue;

        Set(returnPrefetcher, valueHolder);
    }

    v8::Local<v8::Value> Get() {
        return _prefetcher->convert(_value);
    }
};
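
As a quick usage sketch (the value here is made up), a C++ double can be stashed and later materialized as a v8 value on the main thread:
generic_value_holder holder;
holder.Set(3.14);                               // picks value_converter<double> and wraps the value in a value_holder<double>
v8::Local<v8::Value> js_value = holder.Get();   // converts the stored double back into a v8 value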

Update 2016-12-14:
The automatic async has been partially implemented, and at this moment the tests are working as far as I can see. What happens is that the overload resolution checks the last argument passed to the function; if it is not part of the function's declared parameters and its type is a function, it is assumed to be an async request. In that case the overload resolution engine stores all the function arguments in memory as C++ objects, calls the function, post-processes the return value and lastly calls the supplied callback function.

For example, assume you have a function test(arg1 : string, arg2 : number). If a match is found that contains test("a",1), the function will execute synchronously. But if a match is found as test("a",1,function(err,val){}), it will be executed asynchronously (note that there is no way to know which parameters are defined for function(err,val), so the example above is for clarity's sake).

The function implementations have to use info.at<T>(index) to read the parameters and info.SetReturnValue<T>(value) to return a value; otherwise the implementation will fail, because Node.js keeps the VM locked and there is no access to v8 objects through the normal info[index] mechanism. If the implementer insists on using info[index], an access violation will occur and the application will crash when executing as async.
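
A minimal implementation following that rule might look something like this; the callback type name overload_info is a placeholder for illustration, and only the at<T> and SetReturnValue<T> calls reflect the API described above:
// sketch only: the exact callback signature in the module may differ
void test(const overload_info& info) {
    auto arg1 = info.at<std::string>(0);     // read the cached C++ copy of the string argument
    auto arg2 = info.at<double>(1);          // read the cached C++ copy of the number argument
    info.SetReturnValue<double>(arg2 * 2);   // return a value through the converter, safe when running async
}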

TODO:
- Finish the callback implementation.
- Implement async call detection and execution - partial.
- This() access from inside a class member function.
- Cleanup.