From 00219e7e9b24ee2298a7d0d977afdf123a6de88b Mon Sep 17 00:00:00 2001 From: Julien Date: Wed, 19 Jul 2017 21:40:40 +0800 Subject: rename all READM.rst to README.md Change-Id: I95930de9fefd0897bd0b75d2aeb5a1d731332dad Signed-off-by: Julien --- apigateway/README.md | 25 ++ apigateway/README.rst | 25 -- apigateway/setup.cfg | 4 +- tosca2heat/heat-translator/README.md | 56 +++ tosca2heat/heat-translator/README.rst | 56 --- tosca2heat/heat-translator/setup.cfg | 2 +- tosca2heat/tosca-parser/README.md | 54 +++ tosca2heat/tosca-parser/README.rst | 54 --- tosca2heat/tosca-parser/setup.cfg | 2 +- verigraph/README.md | 258 ++++++++++++ verigraph/README.rst | 258 ------------ .../src/tests/j-verigraph-generator/README.md | 54 +++ .../src/tests/j-verigraph-generator/README.rst | 54 --- verigraph/src/main/java/it/polito/grpc/README.md | 442 +++++++++++++++++++++ verigraph/src/main/java/it/polito/grpc/README.rst | 442 --------------------- verigraph/tester/README.md | 38 ++ verigraph/tester/README.rst | 38 -- 17 files changed, 931 insertions(+), 931 deletions(-) create mode 100644 apigateway/README.md delete mode 100644 apigateway/README.rst create mode 100644 tosca2heat/heat-translator/README.md delete mode 100644 tosca2heat/heat-translator/README.rst create mode 100644 tosca2heat/tosca-parser/README.md delete mode 100644 tosca2heat/tosca-parser/README.rst create mode 100644 verigraph/README.md delete mode 100644 verigraph/README.rst create mode 100644 verigraph/service/src/tests/j-verigraph-generator/README.md delete mode 100644 verigraph/service/src/tests/j-verigraph-generator/README.rst create mode 100644 verigraph/src/main/java/it/polito/grpc/README.md delete mode 100644 verigraph/src/main/java/it/polito/grpc/README.rst create mode 100644 verigraph/tester/README.md delete mode 100644 verigraph/tester/README.rst diff --git a/apigateway/README.md b/apigateway/README.md new file mode 100644 index 0000000..48eba40 --- /dev/null +++ b/apigateway/README.md @@ -0,0 +1,25 @@ + +Apigateway +========== +A apigateway of web application for parser sub-projects, such as tosca2heat and policy2tosca. + +Building +-------- + +First you need some dependencies: + +.. code-block:: bash + + pip install bindep + apt-get install $(bindep -b) + pip install -f requirements.txt + pip install grpcio-tools + go get -u github.com/golang/protobuf/protoc-gen-go + +Then you can build the code: + +.. code-block:: bash + + autoreconf -fi + ./configure + make \ No newline at end of file diff --git a/apigateway/README.rst b/apigateway/README.rst deleted file mode 100644 index 48eba40..0000000 --- a/apigateway/README.rst +++ /dev/null @@ -1,25 +0,0 @@ - -Apigateway -========== -A apigateway of web application for parser sub-projects, such as tosca2heat and policy2tosca. - -Building --------- - -First you need some dependencies: - -.. code-block:: bash - - pip install bindep - apt-get install $(bindep -b) - pip install -f requirements.txt - pip install grpcio-tools - go get -u github.com/golang/protobuf/protoc-gen-go - -Then you can build the code: - -.. 
code-block:: bash - - autoreconf -fi - ./configure - make \ No newline at end of file diff --git a/apigateway/setup.cfg b/apigateway/setup.cfg index 920094a..e26e0bd 100644 --- a/apigateway/setup.cfg +++ b/apigateway/setup.cfg @@ -2,7 +2,7 @@ name = apigateway url = https://opnfv.org/parser summary = API Application -description-file = README.rst +description-file = README.md author = parser author-email = openstack-dev@lists.openstack.org classifier = @@ -51,4 +51,4 @@ input_file = apigateway/locale/apigateway.pot [extract_messages] keywords = _ gettext ngettext l_ lazy_gettext mapping_file = babel.cfg -output_file = apigateway/locale/apigateway.pot \ No newline at end of file +output_file = apigateway/locale/apigateway.pot diff --git a/tosca2heat/heat-translator/README.md b/tosca2heat/heat-translator/README.md new file mode 100644 index 0000000..c8af42a --- /dev/null +++ b/tosca2heat/heat-translator/README.md @@ -0,0 +1,56 @@ +======================== +Team and repository tags +======================== + +.. image:: http://governance.openstack.org/badges/heat-translator.svg + :target: http://governance.openstack.org/reference/tags/index.html + +.. Change things from this point on + +=============== +Heat-Translator +=============== + +Overview +-------- + +Heat-Translator is an Openstack project and licensed under Apache 2. It is a +command line tool which takes non-Heat templates as an input and produces a +Heat Orchestration Template (HOT) which can be deployed by Heat. Currently the +development and testing is done with an aim to translate OASIS Topology and +Orchestration Specification for Cloud Applications (TOSCA) templates to +HOT. However, the tool is designed to be easily extended to use with any +format other than TOSCA. + +Architecture +------------ + +Heat-Translator project takes a non-Heat template (e.g. TOSCA flat YAML +template or template embedded in TOSCA Cloud Service Archive (CSAR) format) as +an input, calls an appropriate Parser (e.g. TOSCA Parser) per the type of input +template to parse it and create an in-memory graph, maps it to Heat resources +and then produces a Heat Orchestration Template (HOT) as an output. + +How To Use +---------- +Please refer to `doc/source/usage.rst `_ + +Directory Structure +------------------- + +Three main directories related to the heat-translator are: + +1. hot: It is the generator, that has logic of converting TOSCA in memory graph to HOT YAML files. +2. common: It has all the file that can support the execution of parser and generator. +3. tests: It contains test programs and more importantly several templates which are used for testing. + +Project Info +------------ + +* License: Apache License, Version 2.0 +* Documentation: http://docs.openstack.org/developer/heat-translator/ +* Launchpad: https://launchpad.net/heat-translator +* Blueprints: https://blueprints.launchpad.net/heat-translator +* Bugs: https://bugs.launchpad.net/heat-translator +* Source: http://git.openstack.org/cgit/openstack/heat-translator/ +* IRC Channel: #openstack-heat-translator diff --git a/tosca2heat/heat-translator/README.rst b/tosca2heat/heat-translator/README.rst deleted file mode 100644 index c8af42a..0000000 --- a/tosca2heat/heat-translator/README.rst +++ /dev/null @@ -1,56 +0,0 @@ -======================== -Team and repository tags -======================== - -.. image:: http://governance.openstack.org/badges/heat-translator.svg - :target: http://governance.openstack.org/reference/tags/index.html - -.. 
Change things from this point on - -=============== -Heat-Translator -=============== - -Overview --------- - -Heat-Translator is an Openstack project and licensed under Apache 2. It is a -command line tool which takes non-Heat templates as an input and produces a -Heat Orchestration Template (HOT) which can be deployed by Heat. Currently the -development and testing is done with an aim to translate OASIS Topology and -Orchestration Specification for Cloud Applications (TOSCA) templates to -HOT. However, the tool is designed to be easily extended to use with any -format other than TOSCA. - -Architecture ------------- - -Heat-Translator project takes a non-Heat template (e.g. TOSCA flat YAML -template or template embedded in TOSCA Cloud Service Archive (CSAR) format) as -an input, calls an appropriate Parser (e.g. TOSCA Parser) per the type of input -template to parse it and create an in-memory graph, maps it to Heat resources -and then produces a Heat Orchestration Template (HOT) as an output. - -How To Use ----------- -Please refer to `doc/source/usage.rst `_ - -Directory Structure -------------------- - -Three main directories related to the heat-translator are: - -1. hot: It is the generator, that has logic of converting TOSCA in memory graph to HOT YAML files. -2. common: It has all the file that can support the execution of parser and generator. -3. tests: It contains test programs and more importantly several templates which are used for testing. - -Project Info ------------- - -* License: Apache License, Version 2.0 -* Documentation: http://docs.openstack.org/developer/heat-translator/ -* Launchpad: https://launchpad.net/heat-translator -* Blueprints: https://blueprints.launchpad.net/heat-translator -* Bugs: https://bugs.launchpad.net/heat-translator -* Source: http://git.openstack.org/cgit/openstack/heat-translator/ -* IRC Channel: #openstack-heat-translator diff --git a/tosca2heat/heat-translator/setup.cfg b/tosca2heat/heat-translator/setup.cfg index 21d0c6f..ebea57b 100644 --- a/tosca2heat/heat-translator/setup.cfg +++ b/tosca2heat/heat-translator/setup.cfg @@ -2,7 +2,7 @@ name = heat-translator summary = Tool to translate non-heat templates to Heat Orchestration Template. description-file = - README.rst + README.md author = OpenStack author-email = openstack-dev@lists.openstack.org home-page = http://docs.openstack.org/developer/heat-translator/ diff --git a/tosca2heat/tosca-parser/README.md b/tosca2heat/tosca-parser/README.md new file mode 100644 index 0000000..0f94072 --- /dev/null +++ b/tosca2heat/tosca-parser/README.md @@ -0,0 +1,54 @@ +======================== +Team and repository tags +======================== + +.. image:: http://governance.openstack.org/badges/tosca-parser.svg + :target: http://governance.openstack.org/reference/tags/index.html + +.. Change things from this point on + +=============== +TOSCA Parser +=============== + +Overview +-------- + +The TOSCA Parser is an OpenStack project and licensed under Apache 2. It is +developed to parse TOSCA Simple Profile in YAML. It reads the TOSCA templates +and creates an in-memory graph of TOSCA nodes and their relationship. + +Architecture +------------ + +The TOSCA Parser takes TOSCA YAML template or TOSCA Cloud Service Archive (CSAR) +file as an input, with optional input of dictionary of needed parameters with their +values, and produces in-memory objects of different TOSCA elements with their +relationship to each other. It also creates a graph of TOSCA node templates and their +relationship. 
+ +The ToscaTemplate class located in the toscaparser/tosca_template.py is an entry +class of the parser and various functionality of parser can be used by initiating +this class. In order to see an example usage of TOSCA Parser from a separate tool, +refer to the OpenStack heat-translator class TranslateTemplate located in the +translator/osc/v1/translate.py module. The toscaparser/shell.py module of tosca-parser +also provides a good reference on how to invoke TOSCA Parser from Command Line Interface. + +The toscaparser/elements sub-directory contains various modules to handle +various TOSCA type elements like node type, relationship type etc. The +entity_type.py module is a parent of all type elements. The toscaparser +directory contains various python module to handle service template including +topology template, node templates, relationship templates etc. The +entity_template.py is a parent of all template elements. + + +How To Use +---------- +Please refer to `doc/source/usage.rst `_ + +Project Info +------------ + +* License: Apache License, Version 2.0 +* Source: http://git.openstack.org/cgit/openstack/tosca-parser/ + diff --git a/tosca2heat/tosca-parser/README.rst b/tosca2heat/tosca-parser/README.rst deleted file mode 100644 index 0f94072..0000000 --- a/tosca2heat/tosca-parser/README.rst +++ /dev/null @@ -1,54 +0,0 @@ -======================== -Team and repository tags -======================== - -.. image:: http://governance.openstack.org/badges/tosca-parser.svg - :target: http://governance.openstack.org/reference/tags/index.html - -.. Change things from this point on - -=============== -TOSCA Parser -=============== - -Overview --------- - -The TOSCA Parser is an OpenStack project and licensed under Apache 2. It is -developed to parse TOSCA Simple Profile in YAML. It reads the TOSCA templates -and creates an in-memory graph of TOSCA nodes and their relationship. - -Architecture ------------- - -The TOSCA Parser takes TOSCA YAML template or TOSCA Cloud Service Archive (CSAR) -file as an input, with optional input of dictionary of needed parameters with their -values, and produces in-memory objects of different TOSCA elements with their -relationship to each other. It also creates a graph of TOSCA node templates and their -relationship. - -The ToscaTemplate class located in the toscaparser/tosca_template.py is an entry -class of the parser and various functionality of parser can be used by initiating -this class. In order to see an example usage of TOSCA Parser from a separate tool, -refer to the OpenStack heat-translator class TranslateTemplate located in the -translator/osc/v1/translate.py module. The toscaparser/shell.py module of tosca-parser -also provides a good reference on how to invoke TOSCA Parser from Command Line Interface. - -The toscaparser/elements sub-directory contains various modules to handle -various TOSCA type elements like node type, relationship type etc. The -entity_type.py module is a parent of all type elements. The toscaparser -directory contains various python module to handle service template including -topology template, node templates, relationship templates etc. The -entity_template.py is a parent of all template elements. 
- - -How To Use ----------- -Please refer to `doc/source/usage.rst `_ - -Project Info ------------- - -* License: Apache License, Version 2.0 -* Source: http://git.openstack.org/cgit/openstack/tosca-parser/ - diff --git a/tosca2heat/tosca-parser/setup.cfg b/tosca2heat/tosca-parser/setup.cfg index 77e1b2e..1196aa1 100644 --- a/tosca2heat/tosca-parser/setup.cfg +++ b/tosca2heat/tosca-parser/setup.cfg @@ -3,7 +3,7 @@ name = tosca-parser url = https://launchpad.net/tosca-parser summary = Parser for TOSCA Simple Profile in YAML. description-file = - README.rst + README.md author = OpenStack author-email = openstack-dev@lists.openstack.org home-page = http://docs.openstack.org/developer/tosca-parser/ diff --git a/verigraph/README.md b/verigraph/README.md new file mode 100644 index 0000000..947e893 --- /dev/null +++ b/verigraph/README.md @@ -0,0 +1,258 @@ +.. This work is licensed under a Creative Commons Attribution 4.0 International License. +.. http://creativecommons.org/licenses/by/4.0 + +| Let’s look at how to deploy **VeriGraph** on Apache Tomcat. If you’re + only interested in creating gRPC API and ``neo4jmanager`` is already + deployed, you can skip this section and go straight to the + `documentation `__ +| (though you might find it useful if Tomcat is not yet installed!). + +**Windows** + +- install ``jdk1.8.X_YY`` + `here `__ +- set ambient variable ``JAVA_HOME`` to where you installed the jdk + (e.g. ``C:\Program Files\Java\jdk1.8.X_YY``) +- install Apache Tomcat 8 + `here `__ +- install a pre-compiled distribution of Z3 from + `here `__ + and save the ``[z3_root]/bin`` content under ``[verigraph]/service/build`` +- create the ``mcnet.jar`` of the ``mcnet.*`` packages and put into the ``[verigraph]/service/build`` directory +- download the qjutils library + `here `__ + and create a jar file (i.e. qjutils.jat) in ``[verigrap]/service/build`` +- set ambient variable ``CATALINA_HOME`` to the directory where you + installed Apache (e.g. + ``C:\Program Files\Java\apache-tomcat-8.0.30``) +- create ``shared`` folder under ``%CATALINA_HOME%`` +- add previously created folder to the Windows ``Path`` system variable + (i.e. append the following string at the end: + ``;%CATALINA_HOME%\shared``) +- copy ``[verigraph]/lib/mcnet.jar``, ``[verigraph]/service/build/com.microsoft.z3.jar`` and ``[verigraph]/service/build/qjutils.jar`` + to ``%CATALINA_HOME%\shared`` +- to correctly compile the code you have to put the path of ``com.microsoft.z3.jar`` + and the libraries it refers to as environment variable. i.e. is enough + to add the project subfolder ``build`` to the PATH environment variable (i.e., ``[verigraph]/build``) +- create custom file setenv.bat under ``%CATALINA_HOME%\bin`` with the + following content: + + .. 
code:: bat + + set CLASSPATH=%CLASSPATH%;%CATALINA_HOME%\shared\qjutils.jar;%CATALINA_HOME%\shared\mcnet.jar;%CATALINA_HOME%\shared\com.microsoft.z3.jar;.;%CATALINA_HOME%\webapps\verify\WEB-INF\classes\tests + +- download ``neo4jmanager.war`` from + `here `__ + and copy into into ``%CATALINA_HOME%\webapps`` +- export the ``verify.war`` file from the project and copy into ``%CATALINA_HOME%\webapps`` +- (optional) configure Tomcat Manager: +- open the file ``%CATALINA_HOME%\conf\tomcat-users.xml`` +- under the ``tomcat-users`` tag place the following content: + ``xml `` +- launch Tomcat 8 with the startup script + ``%CATALINA_HOME%\bin\startup.bat`` +- (optional) if you previously configured Tomcat Manager you can open a + browser and navigate to `this link `__ + and login using ``tomcat/tomcat`` as username/password +- (optional) you can deploy/undeploy/redeploy the downloaded WARs + through the web interface + +**Unix** + +- install ``jdk1.8.X_YY`` from the command line: +- go to `this + link `__ + to check the appropriate version for you OS and architecture +- copy the desired version to the clipboard (e.g. + ``http://download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-linux-x64.tar.gz``) +- open a terminal windows and paste the following command (replace + ``link`` with the previously copied link): + ``wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" 'link'`` + e.g. + ``wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" http://download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-linux-x64.tar.gz`` +- untar the archive with the following command (replace 'jdk' to match + the name of the downloaded archive): + ``tar zxvf 'jdk'.tar.gz`` + e.g. + ``tar zxvf jdk-7u-linux-x64.tar.gz`` +- delete the ``.tar.gz`` file if you want to save disk space +- install and configure Apache Tomcat 8 with the following commands: +- go to `this URL `__ + and see what the latest available version is +- download the archive (substitute every occurrence of '8.0.32' in the + following command with the latest available version): + ``wget http://it.apache.contactlab.it/tomcat/tomcat-8/v8.0.32/bin/apache-tomcat-8.0.32.tar.gz`` +- extract downloaded archive: + ``tar xvf apache-tomcat-8.0.32.tar.gz`` +- edit configuration: + ``nano ./apache-tomcat-8.0.32/conf/tomcat-users.xml`` +- under the ``tomcat-users`` tag place the following content + ``xml `` +- set a few environment variables: + ``sudo nano ~/.bashrc`` +- paste the following content at the end of the file + ``export CATALINA_HOME='/path/to/apache/tomcat/folder'`` + e.g. + ``export CATALINA_HOME=/home/mininet/apache-tomcat-8.0.33`` + ``export JRE_HOME='/path/to/jdk/folder'`` + e.g. + ``export JRE_HOME=/home/mininet/jdk1.8.0_92/jre`` + ``export JDK_HOME='/path/to/jdk/folder'`` + e.g. + ``export JDK_HOME=/home/mininet/jdk1.8.0_92`` + ``export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CATALINA_HOME/shared`` + ``export JAVA_OPTS="-Djava.library.path=$CATALINA_HOME/shared"`` +- ``exec bash`` +- install a pre-compiled distribution of Z3 from + `here `__ + and save [z3_root]/bin content under ``[verigraph]/service/build`` +- download the qjutils library + `here `__ + and create a jar file (i.e. 
qjutils.jar) in ``[verigrap]/service/build`` +- create the ``mcnet.jar`` of the ``mcnet.*`` packages and put into the ``[verigraph]/service/build`` directory +- copy ``[verigraph]/service/build/mcnet.jar``, ``[verigraph]/service/build/com.microsoft.z3.jar`` + and ``[verigraph]/service/build/qjutils.jar`` to ``$CATALINA_HOME/shared`` +- customize Tomcat classpath + ``nano $CATALINA_HOME/bin/setenv.sh`` +- paste the following content and save file: + ``bash #!/bin/sh export CLASSPATH=$CLASSPATH:$CATALINA_HOME/shared/qjutils.jar:$CATALINA_HOME/shared/mcnet.jar:$CATALINA_HOME/shared/com.microsoft.z3.jar:.:$CATALINA_HOME/webapps/verify/WEB-INF/classes/tests`` +- save and close the file (``CTRL+O``, ``CTRL+X``) + ``sudo chmod +x $CATALINA_HOME/bin/setenv.sh`` +- download ``neo4jmanager.war`` from + `here `__ + and copy into into ``%CATALINA_HOME%\webapps`` +- export the ``verify.war`` file from the project and copy into ``%CATALINA_HOME%\webapps`` +- launch Tomcat 8 with the startup script + ``$CATALINA_HOME/bin/startup.sh`` +- open a browser and navigate to `this + link `__ and login using + ``tomcat/tomcat`` as username/password +- you can deploy/undeploy/redeploy the downloaded WARs through the web + interface + +**Eclipse** + +- clone project onto your hard drive with this command: + ``git clone git@github.com:netgroup-polito/verigraph.git`` +- Download Apache Tomcat 8 (see instructions above for Windows and + Unix) +- Download JDK (see instructions above for Windows and Unix) +- Configure runtime environment in Eclipse with `the following + incstructions `__ +- Add new Tomcat server on port ``8080`` +- Configure Tomcat server: + + - double-click on the newly created server in the ``Servers`` tab + - make sure under ``Server Locations`` ``Use Tomcat installation`` + is selected + - Open ``Launch Configuration``->``Classpath`` + - add the required JARS (``mcnet.jar``, ``com.microsoft.z3.jar`` and + ``qjutils.jar``) under ``User Entries`` + - Hit ``Apply`` and ``Ok`` + +- Run the server + +**How to add you own function ````** + +#. under the the ``mcnet.netobjs`` package (i.e. under + ``/verify/service/src/mcnet/netobjs``) create a new class + ``.java``, where ```` is the desired function name (i.e. + ```` will be added to the supported node functional types) + which extends ``NetworkObject`` and implement the desired logic + +#. regenerate ``mcnet.jar`` selecting the packages ``mcnet.components`` + and ``mcnet.netobjs`` and exporting the generated JAR to + ``/verify/service/build`` (overwrite the existing file) + +#. under ``/verify/src/main/webapp/json/`` create a file + ``.json``. This file represents a JSON schema (see + `here `__ the official documentation). For + compatibility with the other functions it is mandatory to support an + array as the root of the configuration, but feel free to specify all + the other constraints as needed. A sample of ``.json`` to + describe an empty configuration could be the following: + +``json { "$schema": "http://json-schema.org/draft-04/schema#", "title": "Type", "description": "This is a generic type", "type": "array", "items": { "type": "object" }, "minItems": 0, "maxItems": 0, "uniqueItems": true }`` + +#. in the package ``it.polito.escape.verify.validation`` (i.e. 
under + ``src/main/java/it/polito/escape/verify/validation``) create a new + class file named ``Validator.java`` (please pay attention to + the naming convention here: ```` is the function type used in + the previous step capitalized, followed by the suffix ``Validator``) + which implements ``ValidationInterface``. This class represents a + custom validator for the newly introduced type and allows for more + complex constraints, which is not possible to express through a JSON + schema file. The validate method that has to be implemented is given + the following objects: + +- ``Graph graph`` represents the nffg that the object node belongs to; +- ``Node node`` represents the node that the object configuration + belongs to; +- ``Configuration configuration`` represents the parsed configuration. + It is sufficient to call the method ``getConfiguration`` on the + ``configuration`` object to get a ``JsonNode`` (Jackson's class) and + iterate over the various fields. + In case a configuration is not valid please throw a new + ``ValidationException`` passing a descriptive failure message. + Adding a custom validator is not strictly necessary whenever a JSON + schema is thought to be sufficient. Note though that, other than the + mandatory validation against a schema, whenever a custom validator is + not found a default validation is triggered, i.e. the value of every + JSON property must refer to the name of an existing node in the + working graph/nffg. If this is not the desired behavior it is + suggested to write a custom validator with looser constraints. + +#. customize the class generator and add the support for the newly + introduced type: + +- open the file + ``/verify/service/src/tests/j-verigraph-generator/config.py`` and + edit the following dictionaries: + + - ``devices_to_classes`` --> add the following entry: + ``"" : ""`` + if you followed these instructions carefully the name of the class + implementing the function ```` should be ``.java`` + under the package ``mcnet.netobjs``. + - ``devices_to_configuration_methods`` --> add the following entry: + ``"" : "configurationMethod"`` + if ```` is a middlebox it should have a configuration method + contained in the implementation ``.java`` under the package + ``mcnet.netobjs``. + - ``devices_initialization``: add the following entry: + ``"" : ["param1", "param2"]`` + if ```` requires any parameter when it gets instanciated + please enter them in the form of a list. Make sure that these + parameters refer to existing keys contained in the configuration + schema file (see step 3). For instance the type ``webclient`` + requires the name of a webserver it wants to communicate with. + This parameter is passed in the configuration of a weblient by + setting a property ``webserver`` to the name of the desired + webserver. The value of this property gets extracted and used by + the test generator automatically. + - ``convert_configuration_property_to_ip`` --> add the following + entry: ``"" : ["key", "value"]`` + Note that both ``key`` and ``value`` are optional and it is + critical to set them only if needed. 
Since the Z3 provider used + for testing works with IP addresses in this dictionary you have to + indicate whether it is needed an automatic convertion from names + to IP addresses: + - in case the keyword ``key`` is used every key of the JSON + configuration parsed will be prepended with the string ``ip_``; + - in case the keyword ``value`` is used every value of the JSON + configuration parsed will be prepended with the string ``ip_``; + - in case the list does not contain neither ``key`` nor ``value`` + the original configuration won't be touched. + +- open the file + ``/verify/service/src/tests/j-verigraph-generator/test_class_generator.py`` + and under the "switch" case in the form of a series of ifs used to + configure middle-boxes that starts at line #239 add a branch like the + following with the logic to generate the Java code for --> + ``elif nodes_types[i] == "":`` + You can take inspiration from the other branches to see how to + serialize Java code. Note that this addition to the "switch" + statement is not needed if ```` is not a middlebox or it does + not need to be configured. + +#. Restart the web service. diff --git a/verigraph/README.rst b/verigraph/README.rst deleted file mode 100644 index 947e893..0000000 --- a/verigraph/README.rst +++ /dev/null @@ -1,258 +0,0 @@ -.. This work is licensed under a Creative Commons Attribution 4.0 International License. -.. http://creativecommons.org/licenses/by/4.0 - -| Let’s look at how to deploy **VeriGraph** on Apache Tomcat. If you’re - only interested in creating gRPC API and ``neo4jmanager`` is already - deployed, you can skip this section and go straight to the - `documentation `__ -| (though you might find it useful if Tomcat is not yet installed!). - -**Windows** - -- install ``jdk1.8.X_YY`` - `here `__ -- set ambient variable ``JAVA_HOME`` to where you installed the jdk - (e.g. ``C:\Program Files\Java\jdk1.8.X_YY``) -- install Apache Tomcat 8 - `here `__ -- install a pre-compiled distribution of Z3 from - `here `__ - and save the ``[z3_root]/bin`` content under ``[verigraph]/service/build`` -- create the ``mcnet.jar`` of the ``mcnet.*`` packages and put into the ``[verigraph]/service/build`` directory -- download the qjutils library - `here `__ - and create a jar file (i.e. qjutils.jat) in ``[verigrap]/service/build`` -- set ambient variable ``CATALINA_HOME`` to the directory where you - installed Apache (e.g. - ``C:\Program Files\Java\apache-tomcat-8.0.30``) -- create ``shared`` folder under ``%CATALINA_HOME%`` -- add previously created folder to the Windows ``Path`` system variable - (i.e. append the following string at the end: - ``;%CATALINA_HOME%\shared``) -- copy ``[verigraph]/lib/mcnet.jar``, ``[verigraph]/service/build/com.microsoft.z3.jar`` and ``[verigraph]/service/build/qjutils.jar`` - to ``%CATALINA_HOME%\shared`` -- to correctly compile the code you have to put the path of ``com.microsoft.z3.jar`` - and the libraries it refers to as environment variable. i.e. is enough - to add the project subfolder ``build`` to the PATH environment variable (i.e., ``[verigraph]/build``) -- create custom file setenv.bat under ``%CATALINA_HOME%\bin`` with the - following content: - - .. 
code:: bat - - set CLASSPATH=%CLASSPATH%;%CATALINA_HOME%\shared\qjutils.jar;%CATALINA_HOME%\shared\mcnet.jar;%CATALINA_HOME%\shared\com.microsoft.z3.jar;.;%CATALINA_HOME%\webapps\verify\WEB-INF\classes\tests - -- download ``neo4jmanager.war`` from - `here `__ - and copy into into ``%CATALINA_HOME%\webapps`` -- export the ``verify.war`` file from the project and copy into ``%CATALINA_HOME%\webapps`` -- (optional) configure Tomcat Manager: -- open the file ``%CATALINA_HOME%\conf\tomcat-users.xml`` -- under the ``tomcat-users`` tag place the following content: - ``xml `` -- launch Tomcat 8 with the startup script - ``%CATALINA_HOME%\bin\startup.bat`` -- (optional) if you previously configured Tomcat Manager you can open a - browser and navigate to `this link `__ - and login using ``tomcat/tomcat`` as username/password -- (optional) you can deploy/undeploy/redeploy the downloaded WARs - through the web interface - -**Unix** - -- install ``jdk1.8.X_YY`` from the command line: -- go to `this - link `__ - to check the appropriate version for you OS and architecture -- copy the desired version to the clipboard (e.g. - ``http://download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-linux-x64.tar.gz``) -- open a terminal windows and paste the following command (replace - ``link`` with the previously copied link): - ``wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" 'link'`` - e.g. - ``wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" http://download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-linux-x64.tar.gz`` -- untar the archive with the following command (replace 'jdk' to match - the name of the downloaded archive): - ``tar zxvf 'jdk'.tar.gz`` - e.g. - ``tar zxvf jdk-7u-linux-x64.tar.gz`` -- delete the ``.tar.gz`` file if you want to save disk space -- install and configure Apache Tomcat 8 with the following commands: -- go to `this URL `__ - and see what the latest available version is -- download the archive (substitute every occurrence of '8.0.32' in the - following command with the latest available version): - ``wget http://it.apache.contactlab.it/tomcat/tomcat-8/v8.0.32/bin/apache-tomcat-8.0.32.tar.gz`` -- extract downloaded archive: - ``tar xvf apache-tomcat-8.0.32.tar.gz`` -- edit configuration: - ``nano ./apache-tomcat-8.0.32/conf/tomcat-users.xml`` -- under the ``tomcat-users`` tag place the following content - ``xml `` -- set a few environment variables: - ``sudo nano ~/.bashrc`` -- paste the following content at the end of the file - ``export CATALINA_HOME='/path/to/apache/tomcat/folder'`` - e.g. - ``export CATALINA_HOME=/home/mininet/apache-tomcat-8.0.33`` - ``export JRE_HOME='/path/to/jdk/folder'`` - e.g. - ``export JRE_HOME=/home/mininet/jdk1.8.0_92/jre`` - ``export JDK_HOME='/path/to/jdk/folder'`` - e.g. - ``export JDK_HOME=/home/mininet/jdk1.8.0_92`` - ``export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CATALINA_HOME/shared`` - ``export JAVA_OPTS="-Djava.library.path=$CATALINA_HOME/shared"`` -- ``exec bash`` -- install a pre-compiled distribution of Z3 from - `here `__ - and save [z3_root]/bin content under ``[verigraph]/service/build`` -- download the qjutils library - `here `__ - and create a jar file (i.e. 
qjutils.jar) in ``[verigrap]/service/build`` -- create the ``mcnet.jar`` of the ``mcnet.*`` packages and put into the ``[verigraph]/service/build`` directory -- copy ``[verigraph]/service/build/mcnet.jar``, ``[verigraph]/service/build/com.microsoft.z3.jar`` - and ``[verigraph]/service/build/qjutils.jar`` to ``$CATALINA_HOME/shared`` -- customize Tomcat classpath - ``nano $CATALINA_HOME/bin/setenv.sh`` -- paste the following content and save file: - ``bash #!/bin/sh export CLASSPATH=$CLASSPATH:$CATALINA_HOME/shared/qjutils.jar:$CATALINA_HOME/shared/mcnet.jar:$CATALINA_HOME/shared/com.microsoft.z3.jar:.:$CATALINA_HOME/webapps/verify/WEB-INF/classes/tests`` -- save and close the file (``CTRL+O``, ``CTRL+X``) - ``sudo chmod +x $CATALINA_HOME/bin/setenv.sh`` -- download ``neo4jmanager.war`` from - `here `__ - and copy into into ``%CATALINA_HOME%\webapps`` -- export the ``verify.war`` file from the project and copy into ``%CATALINA_HOME%\webapps`` -- launch Tomcat 8 with the startup script - ``$CATALINA_HOME/bin/startup.sh`` -- open a browser and navigate to `this - link `__ and login using - ``tomcat/tomcat`` as username/password -- you can deploy/undeploy/redeploy the downloaded WARs through the web - interface - -**Eclipse** - -- clone project onto your hard drive with this command: - ``git clone git@github.com:netgroup-polito/verigraph.git`` -- Download Apache Tomcat 8 (see instructions above for Windows and - Unix) -- Download JDK (see instructions above for Windows and Unix) -- Configure runtime environment in Eclipse with `the following - incstructions `__ -- Add new Tomcat server on port ``8080`` -- Configure Tomcat server: - - - double-click on the newly created server in the ``Servers`` tab - - make sure under ``Server Locations`` ``Use Tomcat installation`` - is selected - - Open ``Launch Configuration``->``Classpath`` - - add the required JARS (``mcnet.jar``, ``com.microsoft.z3.jar`` and - ``qjutils.jar``) under ``User Entries`` - - Hit ``Apply`` and ``Ok`` - -- Run the server - -**How to add you own function ````** - -#. under the the ``mcnet.netobjs`` package (i.e. under - ``/verify/service/src/mcnet/netobjs``) create a new class - ``.java``, where ```` is the desired function name (i.e. - ```` will be added to the supported node functional types) - which extends ``NetworkObject`` and implement the desired logic - -#. regenerate ``mcnet.jar`` selecting the packages ``mcnet.components`` - and ``mcnet.netobjs`` and exporting the generated JAR to - ``/verify/service/build`` (overwrite the existing file) - -#. under ``/verify/src/main/webapp/json/`` create a file - ``.json``. This file represents a JSON schema (see - `here `__ the official documentation). For - compatibility with the other functions it is mandatory to support an - array as the root of the configuration, but feel free to specify all - the other constraints as needed. A sample of ``.json`` to - describe an empty configuration could be the following: - -``json { "$schema": "http://json-schema.org/draft-04/schema#", "title": "Type", "description": "This is a generic type", "type": "array", "items": { "type": "object" }, "minItems": 0, "maxItems": 0, "uniqueItems": true }`` - -#. in the package ``it.polito.escape.verify.validation`` (i.e. 
under - ``src/main/java/it/polito/escape/verify/validation``) create a new - class file named ``Validator.java`` (please pay attention to - the naming convention here: ```` is the function type used in - the previous step capitalized, followed by the suffix ``Validator``) - which implements ``ValidationInterface``. This class represents a - custom validator for the newly introduced type and allows for more - complex constraints, which is not possible to express through a JSON - schema file. The validate method that has to be implemented is given - the following objects: - -- ``Graph graph`` represents the nffg that the object node belongs to; -- ``Node node`` represents the node that the object configuration - belongs to; -- ``Configuration configuration`` represents the parsed configuration. - It is sufficient to call the method ``getConfiguration`` on the - ``configuration`` object to get a ``JsonNode`` (Jackson's class) and - iterate over the various fields. - In case a configuration is not valid please throw a new - ``ValidationException`` passing a descriptive failure message. - Adding a custom validator is not strictly necessary whenever a JSON - schema is thought to be sufficient. Note though that, other than the - mandatory validation against a schema, whenever a custom validator is - not found a default validation is triggered, i.e. the value of every - JSON property must refer to the name of an existing node in the - working graph/nffg. If this is not the desired behavior it is - suggested to write a custom validator with looser constraints. - -#. customize the class generator and add the support for the newly - introduced type: - -- open the file - ``/verify/service/src/tests/j-verigraph-generator/config.py`` and - edit the following dictionaries: - - - ``devices_to_classes`` --> add the following entry: - ``"" : ""`` - if you followed these instructions carefully the name of the class - implementing the function ```` should be ``.java`` - under the package ``mcnet.netobjs``. - - ``devices_to_configuration_methods`` --> add the following entry: - ``"" : "configurationMethod"`` - if ```` is a middlebox it should have a configuration method - contained in the implementation ``.java`` under the package - ``mcnet.netobjs``. - - ``devices_initialization``: add the following entry: - ``"" : ["param1", "param2"]`` - if ```` requires any parameter when it gets instanciated - please enter them in the form of a list. Make sure that these - parameters refer to existing keys contained in the configuration - schema file (see step 3). For instance the type ``webclient`` - requires the name of a webserver it wants to communicate with. - This parameter is passed in the configuration of a weblient by - setting a property ``webserver`` to the name of the desired - webserver. The value of this property gets extracted and used by - the test generator automatically. - - ``convert_configuration_property_to_ip`` --> add the following - entry: ``"" : ["key", "value"]`` - Note that both ``key`` and ``value`` are optional and it is - critical to set them only if needed. 
Since the Z3 provider used - for testing works with IP addresses in this dictionary you have to - indicate whether it is needed an automatic convertion from names - to IP addresses: - - in case the keyword ``key`` is used every key of the JSON - configuration parsed will be prepended with the string ``ip_``; - - in case the keyword ``value`` is used every value of the JSON - configuration parsed will be prepended with the string ``ip_``; - - in case the list does not contain neither ``key`` nor ``value`` - the original configuration won't be touched. - -- open the file - ``/verify/service/src/tests/j-verigraph-generator/test_class_generator.py`` - and under the "switch" case in the form of a series of ifs used to - configure middle-boxes that starts at line #239 add a branch like the - following with the logic to generate the Java code for --> - ``elif nodes_types[i] == "":`` - You can take inspiration from the other branches to see how to - serialize Java code. Note that this addition to the "switch" - statement is not needed if ```` is not a middlebox or it does - not need to be configured. - -#. Restart the web service. diff --git a/verigraph/service/src/tests/j-verigraph-generator/README.md b/verigraph/service/src/tests/j-verigraph-generator/README.md new file mode 100644 index 0000000..c796af7 --- /dev/null +++ b/verigraph/service/src/tests/j-verigraph-generator/README.md @@ -0,0 +1,54 @@ +.. This work is licensed under a Creative Commons Attribution 4.0 International License. +.. http://creativecommons.org/licenses/by/4.0 + +CODE\_GENERATOR Java serializer and formatter + +UTILITY Contains utility methods used by other modules + +JSON\_GENERATOR Interactive module to generate the configuration files +(default names are "chains.json" and "config.json") "chains.json" +describes all the chains of nodes belonging to a certain scenario + +TEST\_CLASS\_GENERATOR Generates one or multiple test scenarios given +the two configuration files above (default names are "chains.json" and +"config.json") All the test scenarios have to be placed in the examples +folder (i. e. under "j-verigraph/service/src/tests/examples"). Here is +the script help: + +test\_class\_generator.py -c -f -o + +Supposing the module gets executed from the project root directory (i.e. +"j-verigraph"), a sample command is the following: + +service/src/tests/j-verigraph-generator/test\_class\_generator.py -c +"service/src/tests/j-verigraph-generator/examples/budapest/chains.json" +-f +"service/src/tests/j-verigraph-generator/examples/budapest/config.json" +-o "service/src/tests/examples/Scenario" + +Keep in mind that in the previous command "Scenario" represents a prefix +which will be followed by an underscore and an incremental number +starting from 1, which represents the n-th scenario starting from the +previously mentioned "chains.json" file (this file can indeed contain +multiple chains). + +TEST\_GENERATOR Generates a file which performs the verification test +through Z3 (theorem prover from Microsoft Research) given a certain +scenario generated with the above snippet. All the test modules have to +be placed under the "tests" directory (i.e. under +"j-verigraph/service/src/tests"). Here is the module help: + +test\_generator.py -i -o -s -d + +Supposing the module gets executed from the project root directory (i.e. 
+"j-verigraph") a sample command given the previously generated scenario +is the following: + +service/src/tests/j-verigraph-generator/test\_generator.py -i +service/src/tests/examples/Scenario\_1.java -o +service/src/tests/Test.java -s user1 -d webserver + +The aforementioned "Test.java" file can be compiled and executed +normally. Its output will be either "SAT" or "UNSAT". For possible +statistics the test is repeated 10 times and the average execution time +in seconds is printed to the console. diff --git a/verigraph/service/src/tests/j-verigraph-generator/README.rst b/verigraph/service/src/tests/j-verigraph-generator/README.rst deleted file mode 100644 index c796af7..0000000 --- a/verigraph/service/src/tests/j-verigraph-generator/README.rst +++ /dev/null @@ -1,54 +0,0 @@ -.. This work is licensed under a Creative Commons Attribution 4.0 International License. -.. http://creativecommons.org/licenses/by/4.0 - -CODE\_GENERATOR Java serializer and formatter - -UTILITY Contains utility methods used by other modules - -JSON\_GENERATOR Interactive module to generate the configuration files -(default names are "chains.json" and "config.json") "chains.json" -describes all the chains of nodes belonging to a certain scenario - -TEST\_CLASS\_GENERATOR Generates one or multiple test scenarios given -the two configuration files above (default names are "chains.json" and -"config.json") All the test scenarios have to be placed in the examples -folder (i. e. under "j-verigraph/service/src/tests/examples"). Here is -the script help: - -test\_class\_generator.py -c -f -o - -Supposing the module gets executed from the project root directory (i.e. -"j-verigraph"), a sample command is the following: - -service/src/tests/j-verigraph-generator/test\_class\_generator.py -c -"service/src/tests/j-verigraph-generator/examples/budapest/chains.json" --f -"service/src/tests/j-verigraph-generator/examples/budapest/config.json" --o "service/src/tests/examples/Scenario" - -Keep in mind that in the previous command "Scenario" represents a prefix -which will be followed by an underscore and an incremental number -starting from 1, which represents the n-th scenario starting from the -previously mentioned "chains.json" file (this file can indeed contain -multiple chains). - -TEST\_GENERATOR Generates a file which performs the verification test -through Z3 (theorem prover from Microsoft Research) given a certain -scenario generated with the above snippet. All the test modules have to -be placed under the "tests" directory (i.e. under -"j-verigraph/service/src/tests"). Here is the module help: - -test\_generator.py -i -o -s -d - -Supposing the module gets executed from the project root directory (i.e. -"j-verigraph") a sample command given the previously generated scenario -is the following: - -service/src/tests/j-verigraph-generator/test\_generator.py -i -service/src/tests/examples/Scenario\_1.java -o -service/src/tests/Test.java -s user1 -d webserver - -The aforementioned "Test.java" file can be compiled and executed -normally. Its output will be either "SAT" or "UNSAT". For possible -statistics the test is repeated 10 times and the average execution time -in seconds is printed to the console. diff --git a/verigraph/src/main/java/it/polito/grpc/README.md b/verigraph/src/main/java/it/polito/grpc/README.md new file mode 100644 index 0000000..d089908 --- /dev/null +++ b/verigraph/src/main/java/it/polito/grpc/README.md @@ -0,0 +1,442 @@ +.. This work is licensed under a Creative Commons Attribution 4.0 International License. 
+.. http://creativecommons.org/licenses/by/4.0 + +gRPC Project +============ + +This project contains the interfaces for a web service based on gRPC. + +How to install: +--------------- + +For gRPC interface, add to your ``pom.xml`` (in the project this part is +already present): + +:: + + + io.grpc + grpc-netty + ${grpc.version} + + + io.grpc + grpc-protobuf + ${grpc.version} + + + io.grpc + grpc-stub + ${grpc.version} + + +For protobuf-based codegen integrated with the Maven build system, you +can use protobuf-maven-plugin : + +:: + + + + + kr.motd.maven + os-maven-plugin + 1.4.1.Final + + + + + org.xolstice.maven.plugins + protobuf-maven-plugin + 0.5.0 + + com.google.protobuf:protoc:3.1.0:exe:${os.detected.classifier} + grpc-java + io.grpc:protoc-gen-grpc-java:${grpc.version}:exe:${os.detected.classifier} + + + + + compile + compile-custom + + + + + + + +| In order to run the gRPC server and the junit test, you need to download the Manven Ant Task library + from `here `__ + and copy into ``[verigraph]/lib/`` + +| Due to the fact that the project is intended for Eclipse, you need to + install an additional Eclipse plugin because + `m2e `__ does not evaluate the extension + specified in a ``pom.xml``. `Download + ``os-maven-plugin-1.5.0.Final.jar`` `__ + and put it into the ``/plugins`` directory. +| (As you might have noticed, ``os-maven-plugin`` is a Maven extension, + a Maven plugin, and an Eclipse plugin.) + +If you are using IntelliJ IDEA, you should not have any problem. + +If you are using other IDEs such as NetBeans, you need to set the system +properties ``os-maven-plugin`` sets manually when your IDE is launched. +You usually use JVM's ``-D`` flags like the following: + + | -Dos.detected.name=linux + | -Dos.detected.arch=x86\_64 + | -Dos.detected.classifier=linux-x86\_64 + +Included files: +--------------- + +Here you can find a brief description about useful files for the gRPC +interface: + +**src/main/java:** + +- *it.polito.grpc:* + +This package includes 2 classes that represent the client and server. + + **Client.java:** + + | Client of gRPC application. It implements all possible methods + necessary for communicate with server. + | It prints out the received response. + | Moreover it provides some static methods that are used for + creating the instances of requests. + + **Service.java:** + + | Server of gRPC application. It implements all possible methods + necessary for communicate with client. + | It saves the received request on log. + | This server could be accessed by multiple clients, because + synchronizes concurrent accesses. + | Each method that is possible to call is has the equivalent + operation in REST-interface. + + **GrpcUtils.java:** + + | This class provides some static methods that are used by + ``Service.java`` in order to translate a request into a class that + is accepted by Verigraph. + | Other methods are used to translate the class of Verigraph in the + proper gRPC response. + | These functionalities are exploited by test classes. + | Furthermore this set of methods is public, so in your application + you could call them, even if this should not be useful because + ``Client.java`` provides other high-level functions. + +- *it.polito.grpc.test:* + + This package includes classes for testing the gRPC application. + + **GrpcServerTest.java:** + + | For each possible method we test if works correctly. + | We create a fake client (so this test doesn't use the method that + are present in client class) and test if it receives the expected + response. 
+ | In a nutshell, it tests the methods of Client in case of a fake + server. + | Please notice that the test prints some errors but this is + intentional, because the program tests also error case. + | Indeed, not all methods are tested, because we have another class + (ReachabilityTest.java) that is specialized for testing the + verification method. + + **GrpcTest.java:** + + | This set of tests is intended to control the most common use + cases, in particular all methods that are callable in Client and + Service class, apart from verifyPolicy for the same reason as + before. + | It tries also to raise an exception and verify if the behavior is + as expected. + + **MultiThreadTest.java:** + + | This test creates multiple clients that connect to the server and + verify is the result is correct. These methods test the + synchronization on + | server-side. + + **ReachabilityTest.java:** + + | This file tests the verification method, it exploits the test case + already present in the project and consequently has the certainty + of testing not so simple case. In particular it reads the file in + "src/main/webapp/json" and use this as starting point. + | Some exceptions are thrown in order to verify if they are handled + in a correct way. + +**src/main/proto:** + + **verigraph.proto:** + + | File containing the description of the service. This includes the + definition of all classes used in the application. + | Moreover contains the definition of the methods that is possible + to call. + | Each possible method called by REST API is mapped on a proper gRPC + method. + | In case of error a message containing the reason is returned to + the client. + | More details are available in the section about Proto Buffer. + +**taget/generated-sources/protobuf/java:** + +- *io.grpc.verigraph:* + + This package includes all classes generated from verigraph.proto by + means of protoc. For each object you can find 2 classes : + + **{NameObject}Grpc.java** + + **{NameObject}GrpcOrBuilder.java** + + The first is the real implementation, the second is the + interface. + +**taget/generated-sources/protobuf/grpc-java:** + +- *io.grpc.verigraph:* + + This package includes a single class generated from verigraph.proto + by means of protoc. + + **VerigraphGrpc.java:** + + This is useful in order to create the stubs that are necessary to + communicate both for client and server. + +**lib:** + +This folder includes a jar used for compiling the project with Ant. + + \*\*maven-ant-tasks-2.1.3.\ jar:** + + This file is used by build.xml in order to include the maven + dependencies. + +**pom.xml:** + +| Modified in order to add all necessary dependencies. It contains also + the build tag used for create the generated-sources folders. +| This part is added according to documentation of gRPC for java as + explained above in How To Install section. +| For further clarification go to `this + link `__. + +**build.xml:** + +This ant file permit to run and compile the program in a simple way, it +exploits the maven-ant-tasks-2.1.3.jar already present in project. + +It contains 3 fundamental tasks for gRPC interface: + +- **build:** compile the program + +- **run:** run both client and server + +- **run-client :** run only client + +- **run-server :** run only server + +- **run-test :** launch all tests that are present in the package, + prints out the partial results and global result. + +Note that the execution of these tests may take up to 1-2 minutes when +successful, according to your computer architecture. 
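The project's ``Client.java`` already wraps all of this, but as a rough, hypothetical sketch of how the generated classes described above fit together, a client built directly on the generated blocking stub could look like the following. The names ``VerigraphGrpc.VerigraphBlockingStub`` and ``newBlockingStub`` follow the standard grpc-java code-generation conventions (assuming the service in ``verigraph.proto`` is named ``Verigraph``); the host, port and graph id are arbitrary example values, not values fixed by this project.

.. code:: java

    import java.util.Iterator;

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import io.grpc.verigraph.GraphGrpc;
    import io.grpc.verigraph.NodeGrpc;
    import io.grpc.verigraph.RequestID;
    import io.grpc.verigraph.VerigraphGrpc;

    // Minimal, hypothetical client; Client.java in this project is the real implementation.
    public class MiniClient {
        public static void main(String[] args) {
            // Example endpoint: adjust to wherever Service.java is actually listening.
            ManagedChannel channel = ManagedChannelBuilder.forAddress("localhost", 50051)
                    .usePlaintext()   // older grpc-java versions use usePlaintext(true)
                    .build();
            VerigraphGrpc.VerigraphBlockingStub stub = VerigraphGrpc.newBlockingStub(channel);

            RequestID id = RequestID.newBuilder().setIdGraph(1L).build();   // example graph id

            // Simple (unary) RPC: one request, one response.
            GraphGrpc graph = stub.getGraph(id);
            System.out.println("Graph: " + graph);

            // Server-side streaming RPC: the blocking stub exposes the stream as an iterator.
            Iterator<NodeGrpc> nodes = stub.getNodes(id);
            while (nodes.hasNext()) {
                System.out.println("Node: " + nodes.next());
            }

            channel.shutdown();
        }
    }

The junit classes described above (``GrpcTest.java``, ``MultiThreadTest.java``) exercise roughly this flow against the real ``Service.java``.
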
+ +More Information About Proto Buffer: +------------------------------------ + +Further clarification about verigraph.proto: + +- A ``simple RPC`` where the client sends a request to the server using + the stub and waits for a response to come back, just like a normal + function call. + + .. code:: xml + + // Obtains a graph + rpc GetGraph (RequestID) returns (GraphGrpc) {} + +In this case we send a request that contains the id of the graph and the +response is a Graph. + +- A ``server-side streaming RPC`` where the client sends a request to + the server and gets a stream to read a sequence of messages back. The + client reads from the returned stream until there are no more + messages. As you can see in our example, you specify a server-side + streaming method by placing the stream keyword before the response + type. + + .. code:: xml + + + // Obtains a list of Nodes + rpc GetNodes (RequestID) returns (stream NodeGrpc) {} + +In this case we send a request that contains the id of the graph and the +response is a list of Nodes that are inside graph. + +Further possibilities are available but in this project are not +expolied. If you are curious see +`here `__. + +Our ``.proto`` file also contains protocol buffer message type +definitions for all the request and response types used in our service +methods - for example, here’s the ``RequestID`` message type: + +.. code:: xml + + message RequestID { + int64 idGraph = 1; + int64 idNode = 2; + int64 idNeighbour = 3; + } + +The " = 1", " = 2" markers on each element identify the unique "tag" +that field uses in the binary encoding. Tag numbers 1-15 require one +less byte to encode than higher numbers, so as an optimization you can +decide to use those tags for the commonly used or repeated elements, +leaving tags 16 and higher for less-commonly used optional elements. +Each element in a repeated field requires re-encoding the tag number, so +repeated fields are particularly good candidates for this optimization. + +Protocol buffers are the flexible, efficient, automated solution to +solve exactly the problem of serialization. With protocol buffers, you +write a .proto description of the data structure you wish to store. From +that, the protocol buffer compiler creates a class that implements +automatic encoding and parsing of the protocol buffer data with an +efficient binary format. The generated class provides getters and +setters for the fields that make up a protocol buffer and takes care of +the details of reading and writing the protocol buffer as a unit. +Importantly, the protocol buffer format supports the idea of extending +the format over time in such a way that the code can still read data +encoded with the old format. + +:: + + syntax = "proto3"; + + package verigraph; + + option java_multiple_files = true; + option java_package = "io.grpc.verigraph"; + option java_outer_classname = "VerigraphProto"; + ``` + +This .proto file works for protobuf 3, that is slightly different from +the version 2, so be careful if you have code already installed. + +The .proto file starts with a package declaration, which helps to +prevent naming conflicts between different projects. In Java, the +package name is used as the ``Java package`` unless you have explicitly +specified a java\_package, as we have here. Even if you do provide a +``java_package``, you should still define a normal ``package`` as well +to avoid name collisions in the Protocol Buffers name space as well as +in non-Java languages. 
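To make the mapping concrete before going into the individual options, here is a small, hypothetical snippet showing how they surface in the generated code: because ``java_multiple_files`` is true and ``java_package`` is ``io.grpc.verigraph``, each message such as ``RequestID`` becomes its own class in that package, and its tagged fields are exposed through builder setters and getters following the standard protobuf Java naming rules. The id values used here are arbitrary examples.

.. code:: java

    import io.grpc.verigraph.RequestID;   // java_package + java_multiple_files => one class per message

    public class RequestIdExample {
        public static void main(String[] args) {
            // idGraph and idNode are the int64 fields with tags 1 and 2 in verigraph.proto.
            RequestID id = RequestID.newBuilder()
                    .setIdGraph(1L)   // example value
                    .setIdNode(2L)    // example value
                    .build();
            System.out.println(id.getIdGraph() + " " + id.getIdNode());
        }
    }
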
+ +| After the package declaration, you can see two options that are + Java-specific: ``java_package`` and ``java_outer_classname``. + ``java_package`` specifies in what Java package name your generated + classes should live. If you don't specify this explicitly, it simply + matches the package name given by the package declaration, but these + names usually aren't appropriate Java package names (since they + usually don't start with a domain name). The ``java_outer_classname`` + option defines the class name which should contain all of the classes + in this file. If you don't give a ``java_outer_classname explicitly``, + it will be generated by converting the file name to camel case. For + example, "my\_proto.proto" would, by default, use "MyProto" as the + outer class name. +| In this case this file is not generated, because + ``java_multiple_files`` option is true, so for each message we + generate a different class. + +For further clarifications see +`here `__ + +Notes +----- + +For gRPC interface you need that neo4jmanager service is already +deployed, so if this is not the case, please follow the instructions at +this +`link `__. + +In this version there are some modified files compared to the original +`Verigraph project `__ + +**it.polito.escape.verify.service.NodeService:** + +At line 213 we modified the path, because this service is intended to +run not only in container, as Tomcat, so we added other possibility that +files is placed in src/main/webapp/json/ folder. + +**it.polito.escape.verify.service.VerificationService:** + +In the original case it searches for python files in "webapps" folder, +that is present if the service is deployed in a container, but absent +otherwise. So we added another string that will be used in the case the +service doesn't run in Tomcat. + +**it.polito.escape.verify.databese.DatabaseClass:** + +Like before we added the possibility that files are not in "webapps" +folder, so is modified in order to run in any environment. Modification +in method loadDataBase() and persistDatabase(). + +| Pay attention that Python is needed for the project. If it is not + already present on your computer, please `download + it `__. +| It works fine with Python 2.7.3, or in general Python 2. + +| If you have downloaded a Python version for 64-bit architecture please + copy the files in "service/z3\_64" and paste in "service/build" and + substitute them, +| because this project works with Python for 32-bit architecture. + +Python and Z3 must support the same architetcure. + +Moreover you need the following dependencies installed on your python +distribution: + +- "requests" python package -> +http://docs.python-requests.org/en/master/ + +- "jsonschema" python package -> https://pypi.python.org/pypi/jsonschema + +| HINT - to install a package you can raise the following command (Bash + on Linux or DOS shell on Windows): python -m pip install jsonschema + python -m pip install requests +| Pay attention that it is possible that you have to modify the PATH + environment variable because is necessary to address the python + folder, used for verification phase. + +Remember to read the +`README.rtf `__ +and to follow the instructions in order to deploy the Verigraph service. + +| In the latest version of Maven there is the possibility that the + downloaded files are incompatible with Java Version of the project + (1.8). +| In this case you have to modify the file ``hk2-parent-2.4.0-b31.pom`` + under your local Maven repository (e.g. 
+ 'C:\\Users\\Standard.m2\\repository') +| and in the path ``\org\glassfish\hk2\hk2-parent\2.4.0-b31`` find the + file and modify at line 1098 (in section ``profile``) the ``jdk`` + version to ``[1.8,)`` . + +| Admittedly, the version that is supported by the downloaded files from + Maven Dependencies is incompatible with jdk of the project. +| So modify the file ``gson-2.3.pom`` in Maven repository, under + ``com\google\code\gson\gson\2.3`` directory, in particular line 91, + from ``[1.8,`` to ``[1.8,)``. + +This project was also tested on Linux Ubuntu 15.10. diff --git a/verigraph/src/main/java/it/polito/grpc/README.rst b/verigraph/src/main/java/it/polito/grpc/README.rst deleted file mode 100644 index d089908..0000000 --- a/verigraph/src/main/java/it/polito/grpc/README.rst +++ /dev/null @@ -1,442 +0,0 @@ -.. This work is licensed under a Creative Commons Attribution 4.0 International License. -.. http://creativecommons.org/licenses/by/4.0 - -gRPC Project -============ - -This project contains the interfaces for a web service based on gRPC. - -How to install: ---------------- - -For gRPC interface, add to your ``pom.xml`` (in the project this part is -already present): - -:: - - - io.grpc - grpc-netty - ${grpc.version} - - - io.grpc - grpc-protobuf - ${grpc.version} - - - io.grpc - grpc-stub - ${grpc.version} - - -For protobuf-based codegen integrated with the Maven build system, you -can use protobuf-maven-plugin : - -:: - - - - - kr.motd.maven - os-maven-plugin - 1.4.1.Final - - - - - org.xolstice.maven.plugins - protobuf-maven-plugin - 0.5.0 - - com.google.protobuf:protoc:3.1.0:exe:${os.detected.classifier} - grpc-java - io.grpc:protoc-gen-grpc-java:${grpc.version}:exe:${os.detected.classifier} - - - - - compile - compile-custom - - - - - - - -| In order to run the gRPC server and the junit test, you need to download the Manven Ant Task library - from `here `__ - and copy into ``[verigraph]/lib/`` - -| Due to the fact that the project is intended for Eclipse, you need to - install an additional Eclipse plugin because - `m2e `__ does not evaluate the extension - specified in a ``pom.xml``. `Download - ``os-maven-plugin-1.5.0.Final.jar`` `__ - and put it into the ``/plugins`` directory. -| (As you might have noticed, ``os-maven-plugin`` is a Maven extension, - a Maven plugin, and an Eclipse plugin.) - -If you are using IntelliJ IDEA, you should not have any problem. - -If you are using other IDEs such as NetBeans, you need to set the system -properties ``os-maven-plugin`` sets manually when your IDE is launched. -You usually use JVM's ``-D`` flags like the following: - - | -Dos.detected.name=linux - | -Dos.detected.arch=x86\_64 - | -Dos.detected.classifier=linux-x86\_64 - -Included files: ---------------- - -Here you can find a brief description about useful files for the gRPC -interface: - -**src/main/java:** - -- *it.polito.grpc:* - -This package includes 2 classes that represent the client and server. - - **Client.java:** - - | Client of gRPC application. It implements all possible methods - necessary for communicate with server. - | It prints out the received response. - | Moreover it provides some static methods that are used for - creating the instances of requests. - - **Service.java:** - - | Server of gRPC application. It implements all possible methods - necessary for communicate with client. - | It saves the received request on log. - | This server could be accessed by multiple clients, because - synchronizes concurrent accesses. 
- | Each method that is possible to call is has the equivalent - operation in REST-interface. - - **GrpcUtils.java:** - - | This class provides some static methods that are used by - ``Service.java`` in order to translate a request into a class that - is accepted by Verigraph. - | Other methods are used to translate the class of Verigraph in the - proper gRPC response. - | These functionalities are exploited by test classes. - | Furthermore this set of methods is public, so in your application - you could call them, even if this should not be useful because - ``Client.java`` provides other high-level functions. - -- *it.polito.grpc.test:* - - This package includes classes for testing the gRPC application. - - **GrpcServerTest.java:** - - | For each possible method we test if works correctly. - | We create a fake client (so this test doesn't use the method that - are present in client class) and test if it receives the expected - response. - | In a nutshell, it tests the methods of Client in case of a fake - server. - | Please notice that the test prints some errors but this is - intentional, because the program tests also error case. - | Indeed, not all methods are tested, because we have another class - (ReachabilityTest.java) that is specialized for testing the - verification method. - - **GrpcTest.java:** - - | This set of tests is intended to control the most common use - cases, in particular all methods that are callable in Client and - Service class, apart from verifyPolicy for the same reason as - before. - | It tries also to raise an exception and verify if the behavior is - as expected. - - **MultiThreadTest.java:** - - | This test creates multiple clients that connect to the server and - verify is the result is correct. These methods test the - synchronization on - | server-side. - - **ReachabilityTest.java:** - - | This file tests the verification method, it exploits the test case - already present in the project and consequently has the certainty - of testing not so simple case. In particular it reads the file in - "src/main/webapp/json" and use this as starting point. - | Some exceptions are thrown in order to verify if they are handled - in a correct way. - -**src/main/proto:** - - **verigraph.proto:** - - | File containing the description of the service. This includes the - definition of all classes used in the application. - | Moreover contains the definition of the methods that is possible - to call. - | Each possible method called by REST API is mapped on a proper gRPC - method. - | In case of error a message containing the reason is returned to - the client. - | More details are available in the section about Proto Buffer. - -**taget/generated-sources/protobuf/java:** - -- *io.grpc.verigraph:* - - This package includes all classes generated from verigraph.proto by - means of protoc. For each object you can find 2 classes : - - **{NameObject}Grpc.java** - - **{NameObject}GrpcOrBuilder.java** - - The first is the real implementation, the second is the - interface. - -**taget/generated-sources/protobuf/grpc-java:** - -- *io.grpc.verigraph:* - - This package includes a single class generated from verigraph.proto - by means of protoc. - - **VerigraphGrpc.java:** - - This is useful in order to create the stubs that are necessary to - communicate both for client and server. - -**lib:** - -This folder includes a jar used for compiling the project with Ant. - - \*\*maven-ant-tasks-2.1.3.\ jar:** - - This file is used by build.xml in order to include the maven - dependencies. 
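As an aside before the remaining build files are listed: the ``Service.java`` class described above is, in essence, a gRPC server built on the generated classes. The following is a hypothetical minimal sketch, not the project's actual code; in particular the base-class name ``VerigraphGrpc.VerigraphImplBase`` and the port are assumptions, and all real lookup and error handling is omitted.

.. code:: java

    // Hedged sketch: VerigraphGrpc.VerigraphImplBase is the base class grpc-java
    // typically generates for a service named "Verigraph"; the project's real
    // logic lives in Service.java.
    import io.grpc.Server;
    import io.grpc.ServerBuilder;
    import io.grpc.stub.StreamObserver;
    import io.grpc.verigraph.GraphGrpc;
    import io.grpc.verigraph.RequestID;
    import io.grpc.verigraph.VerigraphGrpc;

    public class ServerSketch {

        static class VerigraphService extends VerigraphGrpc.VerigraphImplBase {
            @Override
            public void getGraph(RequestID request,
                                 StreamObserver<GraphGrpc> responseObserver) {
                // A real implementation would look the graph up by request.getIdGraph().
                GraphGrpc graph = GraphGrpc.newBuilder().build();
                responseObserver.onNext(graph);
                responseObserver.onCompleted();
            }
        }

        public static void main(String[] args) throws Exception {
            Server server = ServerBuilder.forPort(50051) // port is illustrative
                    .addService(new VerigraphService())
                    .build()
                    .start();
            server.awaitTermination();
        }
    }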
- -**pom.xml:** - -| Modified in order to add all necessary dependencies. It contains also - the build tag used for create the generated-sources folders. -| This part is added according to documentation of gRPC for java as - explained above in How To Install section. -| For further clarification go to `this - link `__. - -**build.xml:** - -This ant file permit to run and compile the program in a simple way, it -exploits the maven-ant-tasks-2.1.3.jar already present in project. - -It contains 3 fundamental tasks for gRPC interface: - -- **build:** compile the program - -- **run:** run both client and server - -- **run-client :** run only client - -- **run-server :** run only server - -- **run-test :** launch all tests that are present in the package, - prints out the partial results and global result. - -Note that the execution of these tests may take up to 1-2 minutes when -successful, according to your computer architecture. - -More Information About Proto Buffer: ------------------------------------- - -Further clarification about verigraph.proto: - -- A ``simple RPC`` where the client sends a request to the server using - the stub and waits for a response to come back, just like a normal - function call. - - .. code:: xml - - // Obtains a graph - rpc GetGraph (RequestID) returns (GraphGrpc) {} - -In this case we send a request that contains the id of the graph and the -response is a Graph. - -- A ``server-side streaming RPC`` where the client sends a request to - the server and gets a stream to read a sequence of messages back. The - client reads from the returned stream until there are no more - messages. As you can see in our example, you specify a server-side - streaming method by placing the stream keyword before the response - type. - - .. code:: xml - - - // Obtains a list of Nodes - rpc GetNodes (RequestID) returns (stream NodeGrpc) {} - -In this case we send a request that contains the id of the graph and the -response is a list of Nodes that are inside graph. - -Further possibilities are available but in this project are not -expolied. If you are curious see -`here `__. - -Our ``.proto`` file also contains protocol buffer message type -definitions for all the request and response types used in our service -methods - for example, here’s the ``RequestID`` message type: - -.. code:: xml - - message RequestID { - int64 idGraph = 1; - int64 idNode = 2; - int64 idNeighbour = 3; - } - -The " = 1", " = 2" markers on each element identify the unique "tag" -that field uses in the binary encoding. Tag numbers 1-15 require one -less byte to encode than higher numbers, so as an optimization you can -decide to use those tags for the commonly used or repeated elements, -leaving tags 16 and higher for less-commonly used optional elements. -Each element in a repeated field requires re-encoding the tag number, so -repeated fields are particularly good candidates for this optimization. - -Protocol buffers are the flexible, efficient, automated solution to -solve exactly the problem of serialization. With protocol buffers, you -write a .proto description of the data structure you wish to store. From -that, the protocol buffer compiler creates a class that implements -automatic encoding and parsing of the protocol buffer data with an -efficient binary format. The generated class provides getters and -setters for the fields that make up a protocol buffer and takes care of -the details of reading and writing the protocol buffer as a unit. 
-Importantly, the protocol buffer format supports the idea of extending -the format over time in such a way that the code can still read data -encoded with the old format. - -:: - - syntax = "proto3"; - - package verigraph; - - option java_multiple_files = true; - option java_package = "io.grpc.verigraph"; - option java_outer_classname = "VerigraphProto"; - ``` - -This .proto file works for protobuf 3, that is slightly different from -the version 2, so be careful if you have code already installed. - -The .proto file starts with a package declaration, which helps to -prevent naming conflicts between different projects. In Java, the -package name is used as the ``Java package`` unless you have explicitly -specified a java\_package, as we have here. Even if you do provide a -``java_package``, you should still define a normal ``package`` as well -to avoid name collisions in the Protocol Buffers name space as well as -in non-Java languages. - -| After the package declaration, you can see two options that are - Java-specific: ``java_package`` and ``java_outer_classname``. - ``java_package`` specifies in what Java package name your generated - classes should live. If you don't specify this explicitly, it simply - matches the package name given by the package declaration, but these - names usually aren't appropriate Java package names (since they - usually don't start with a domain name). The ``java_outer_classname`` - option defines the class name which should contain all of the classes - in this file. If you don't give a ``java_outer_classname explicitly``, - it will be generated by converting the file name to camel case. For - example, "my\_proto.proto" would, by default, use "MyProto" as the - outer class name. -| In this case this file is not generated, because - ``java_multiple_files`` option is true, so for each message we - generate a different class. - -For further clarifications see -`here `__ - -Notes ------ - -For gRPC interface you need that neo4jmanager service is already -deployed, so if this is not the case, please follow the instructions at -this -`link `__. - -In this version there are some modified files compared to the original -`Verigraph project `__ - -**it.polito.escape.verify.service.NodeService:** - -At line 213 we modified the path, because this service is intended to -run not only in container, as Tomcat, so we added other possibility that -files is placed in src/main/webapp/json/ folder. - -**it.polito.escape.verify.service.VerificationService:** - -In the original case it searches for python files in "webapps" folder, -that is present if the service is deployed in a container, but absent -otherwise. So we added another string that will be used in the case the -service doesn't run in Tomcat. - -**it.polito.escape.verify.databese.DatabaseClass:** - -Like before we added the possibility that files are not in "webapps" -folder, so is modified in order to run in any environment. Modification -in method loadDataBase() and persistDatabase(). - -| Pay attention that Python is needed for the project. If it is not - already present on your computer, please `download - it `__. -| It works fine with Python 2.7.3, or in general Python 2. - -| If you have downloaded a Python version for 64-bit architecture please - copy the files in "service/z3\_64" and paste in "service/build" and - substitute them, -| because this project works with Python for 32-bit architecture. - -Python and Z3 must support the same architetcure. 
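The "webapps" versus ``src/main/webapp`` fallback described above for NodeService, VerificationService and DatabaseClass can be pictured with the following hypothetical sketch; the concrete paths and the helper name are assumptions for illustration, not the project's actual code.

.. code:: java

    // Hedged sketch of the described fallback: prefer the location that exists
    // when the service is deployed in a container (e.g. Tomcat), otherwise fall
    // back to the source tree so the service also runs outside a container.
    import java.io.File;

    public class ResourceLocator {

        public static File locate(String fileName) {
            // Assumed container-relative location (present when deployed under Tomcat).
            File deployed = new File("webapps" + File.separator + fileName);
            if (deployed.exists()) {
                return deployed;
            }
            // Assumed fallback used when running outside a container.
            return new File("src/main/webapp/json/" + fileName);
        }
    }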
- -Moreover you need the following dependencies installed on your python -distribution: - -- "requests" python package -> -http://docs.python-requests.org/en/master/ - -- "jsonschema" python package -> https://pypi.python.org/pypi/jsonschema - -| HINT - to install a package you can raise the following command (Bash - on Linux or DOS shell on Windows): python -m pip install jsonschema - python -m pip install requests -| Pay attention that it is possible that you have to modify the PATH - environment variable because is necessary to address the python - folder, used for verification phase. - -Remember to read the -`README.rtf `__ -and to follow the instructions in order to deploy the Verigraph service. - -| In the latest version of Maven there is the possibility that the - downloaded files are incompatible with Java Version of the project - (1.8). -| In this case you have to modify the file ``hk2-parent-2.4.0-b31.pom`` - under your local Maven repository (e.g. - 'C:\\Users\\Standard.m2\\repository') -| and in the path ``\org\glassfish\hk2\hk2-parent\2.4.0-b31`` find the - file and modify at line 1098 (in section ``profile``) the ``jdk`` - version to ``[1.8,)`` . - -| Admittedly, the version that is supported by the downloaded files from - Maven Dependencies is incompatible with jdk of the project. -| So modify the file ``gson-2.3.pom`` in Maven repository, under - ``com\google\code\gson\gson\2.3`` directory, in particular line 91, - from ``[1.8,`` to ``[1.8,)``. - -This project was also tested on Linux Ubuntu 15.10. diff --git a/verigraph/tester/README.md b/verigraph/tester/README.md new file mode 100644 index 0000000..35396a3 --- /dev/null +++ b/verigraph/tester/README.md @@ -0,0 +1,38 @@ +.. This work is licensed under a Creative Commons Attribution 4.0 International License. +.. http://creativecommons.org/licenses/by/4.0 + +In order to run the automatic testing script test.py, you need the +following dependencies installed on your python distribution: + +- "requests" python package -> + http://docs.python-requests.org/en/master/ +- "jsonschema" python package -> + https://pypi.python.org/pypi/jsonschema + +IMPORTANT - If you have multiple versions of Python installed on your +machine, check carefully that the version you are actually using when +running the script, has the required packages installed. Requested +version is Python 3+ + +| HINT - to install a package you can raise the following command (Bash + on Linux or DOS shell on Windows): +| python -m pip install jsonschema +| python -m pip install requests + +Tested on PYTHON 3.4.3 + +To add a new test, just put a new .json file inside the testcases +folder. The corresponding JSON schema is in the testcase\_schema.json +file and some examples are already available. Each json file should +specify: + +- id, an integer for the testcase; +- name, the name for the testcase; +- description, an optional description; +- policy\_url\_parameters, the parameters to be appended after the + verification URL (including the '?' character); +- result, the expected verification result; +- graph, the graph to be tested (the same object that you usually POST + to VeriGraph to create a new graph). + The test.py script will test each .json file contained into the + testcases folder and will provide a complete output. diff --git a/verigraph/tester/README.rst b/verigraph/tester/README.rst deleted file mode 100644 index 35396a3..0000000 --- a/verigraph/tester/README.rst +++ /dev/null @@ -1,38 +0,0 @@ -.. 
This work is licensed under a Creative Commons Attribution 4.0 International License. -.. http://creativecommons.org/licenses/by/4.0 - -In order to run the automatic testing script test.py, you need the -following dependencies installed on your python distribution: - -- "requests" python package -> - http://docs.python-requests.org/en/master/ -- "jsonschema" python package -> - https://pypi.python.org/pypi/jsonschema - -IMPORTANT - If you have multiple versions of Python installed on your -machine, check carefully that the version you are actually using when -running the script, has the required packages installed. Requested -version is Python 3+ - -| HINT - to install a package you can raise the following command (Bash - on Linux or DOS shell on Windows): -| python -m pip install jsonschema -| python -m pip install requests - -Tested on PYTHON 3.4.3 - -To add a new test, just put a new .json file inside the testcases -folder. The corresponding JSON schema is in the testcase\_schema.json -file and some examples are already available. Each json file should -specify: - -- id, an integer for the testcase; -- name, the name for the testcase; -- description, an optional description; -- policy\_url\_parameters, the parameters to be appended after the - verification URL (including the '?' character); -- result, the expected verification result; -- graph, the graph to be tested (the same object that you usually POST - to VeriGraph to create a new graph). - The test.py script will test each .json file contained into the - testcases folder and will provide a complete output. -- cgit 1.2.3-korg