13.2. framework package

13.2.1. framework.basic_primitives module

framework.basic_primitives.calc_parity_bit(x)

Return 0 if the number of bits set to 1 is even, otherwise return 1

framework.basic_primitives.corrupt_bits(s, p=0.01, n=None, ascii=False)

Flip a given percentage or number of bits from a string

framework.basic_primitives.corrupt_bytes(s, p=0.01, n=None, ctrl_char=False)

Corrupt a given percentage or number of bytes from a string

framework.basic_primitives.rand_string(size=None, min=1, max=10, str_set='0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ!"#$%&\'()*+, -./:;<=>?@[\\]^_`{|}~ \t\n\r\x0b\x0c')
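
A minimal usage sketch of these primitives (the argument types and the exact corruption behavior are assumptions based on the signatures above; the framework historically operates on byte strings):

    from framework.basic_primitives import (calc_parity_bit, corrupt_bits,
                                            corrupt_bytes, rand_string)

    parity = calc_parity_bit(0b1011)        # three bits set (odd) -> 1, assuming set-bit parity
    fuzzed = corrupt_bits(b'Hello', n=3)    # flip 3 bits somewhere in the buffer
    noisy = corrupt_bytes(b'Hello', p=0.2)  # corrupt roughly 20% of the bytes
    token = rand_string(min=4, max=8)       # random string with a length between 4 and 8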

13.2.2. framework.data module

class framework.data.CallBackOps(remove_cb=False, stop_process_cb=False)

Bases: object

Add_PeriodicData = 10
Del_PeriodicData = 11
RemoveCB = 1
Replace_Data = 30
Set_FbkTimeout = 21
StopProcessingCB = 2
__init__(remove_cb=False, stop_process_cb=False)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.data'
add_operation(instr_type, id=None, param=None, period=None)
get_operations()
is_flag_set(name)
set_flag(name)
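
A sketch of how these operation identifiers can be combined with add_operation() (the parameter semantics, e.g. that Set_FbkTimeout takes a timeout value through param, are assumptions):

    from framework.data import CallBackOps

    ops = CallBackOps(remove_cb=True)   # assumed to set the RemoveCB flag
    ops.add_operation(CallBackOps.Set_FbkTimeout, param=2)  # hypothetical: 2-second feedback timeout
    if ops.is_flag_set(CallBackOps.RemoveCB):
        pass  # the framework would unregister the callback after processing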
class framework.data.Data(data=None, altered=False, tg_ids=None)

Bases: object

__copy__()
__init__(data=None, altered=False, tg_ids=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.data'
__repr__() <==> repr(x)
__str__() <==> str(x)
_empty_data_backend = <framework.data.EmptyBackend object>
add_info(info_str)
bind_info(dmaker_type, data_maker_name)
cleanup_all_callbacks()
cleanup_callbacks(hook=<HOOK.after_fbk: 5>)
content
copy_callback_from(data)
copy_info_from(data)
generate_info_from_content(original_data=None, origin=None, additional_info=None)
get_content(do_copy=False)
get_data_id()
get_data_model()
get_history()
get_initial_dmaker()
get_length()
has_info()
info_exists(dmaker_type, data_maker_name, info)
is_blocked()
is_empty()
is_recordable()
is_unusable()
make_blocked()
make_free()
make_recordable()
make_unusable()
origin
pending_callback_ops(hook=<HOOK.after_fbk: 5>)
pretty_print(raw_limit=200, log_func=<built-in method write of file object>)
read_info(dmaker_type, data_maker_name)
register_callback(callback, hook=<HOOK.after_fbk: 5>)
remove_info_from(dmaker_type, data_maker_name)
run_callbacks(feedback=None, hook=<HOOK.after_fbk: 5>)
set_data_id(data_id)
set_data_model(dm)
set_history(hist)
set_initial_dmaker(t)
show(raw_limit=200, log_func=<built-in method write of file object>)
tg_ids
to_bytes()
to_str()
update_from(obj)
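
A minimal sketch of typical Data usage, wrapping raw bytes and materializing them back:

    from framework.data import Data

    d = Data(b'\x01\x02PAYLOAD')        # a Node can be wrapped the same way
    d.add_info('hand-crafted sample')   # attach free-form information to the data
    raw = d.to_bytes()                  # materialize the content as bytes
    d.show()                            # pretty-print the content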
class framework.data.DataBackend(content=None)

Bases: object

__init__(content=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.data'
content
data_maker_name
data_maker_type
data_model
get_content(do_copy=False, materialize=True)
get_length()
show(raw_limit=200, log_func=<built-in method write of file object>)
to_bytes()
to_str()
update_from(obj)
class framework.data.EmptyBackend(content=None)

Bases: framework.data.DataBackend

__module__ = 'framework.data'
content
data_maker_name
data_maker_type
data_model
get_content(do_copy=False, materialize=True)
get_length()
to_bytes()
to_str()
class framework.data.NodeBackend(content=None)

Bases: framework.data.DataBackend

__copy__()
__module__ = 'framework.data'
content
data_maker_name
data_maker_type
get_content(do_copy=False, materialize=True)
get_length()
show(raw_limit=200, log_func=<built-in method write of file object>)
to_bytes()
to_str()
update_from(obj)
class framework.data.RawBackend(content=None)

Bases: framework.data.DataBackend

__module__ = 'framework.data'
content
data_maker_name
data_maker_type
get_content(do_copy=False, materialize=True)
get_length()
show(raw_limit=200, log_func=<built-in method write of file object>)
to_bytes()
to_str()
update_from(obj)

13.2.3. framework.data_model module

class framework.data_model.DataModel

Bases: object

Data Model Abstraction

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.data_model'
__str__() <==> str(x)
_atom_absorption_additional_actions(atom)

Called by .create_atom_from_raw_data(). Should be overloaded if specific actions need to be performed on the atoms created from imported raw data

Parameters:atom – Atom that is going to be registered after being absorbed from raw data
Returns:An atom and a short description of the actions
_backend(atom)
atom_identifiers()
build_data_model()

This method is called when a data model is loaded. It is called only the first time the data model is loaded. To be implemented by the user.

cleanup()
create_atom_from_raw_data(data, idx, filename)

This function is called for each file (with the right extension) present in imported_data/<data_model_name>. It absorbs the file content by leveraging the atoms of the data model registered for absorption, or, if none are registered, wraps the content in a framework.node.Node.

Parameters:
  • filename (str) – name of the imported file
  • data (bytes) – file content
  • idx (int) – index of the imported file
Returns:

An atom or None

decode(data, atom_name=None, requested_abs_csts=None, colorized=True)
file_extension = 'bin'
get_atom(hash_key, name=None)
get_atom_for_absorption(hash_key)
get_external_atom(dm_name, data_id, name=None)
get_import_directory_path(subdir=None)
import_file_contents(extension=None, absorber=None, subdir=None, path=None, filename=None)
knowledge_source = None
load_data_model(dm_db)
merge_with(data_model)
name = None
pre_build()

This method is called when a data model is loaded. It is executed before build_data_model(). To be implemented by the user.

register(*atom_list)
register_atom_for_absorption(atom, absorb_constraints=AbsFullCsts(), decoding_scope=None)

Register an atom that will be used by the DataModel when an operation requiring data absorption is performed, like self.decode().

Parameters:
  • atom – Atom to register for absorption
  • absorb_constraints – Constraints to be used for the absorption
  • decoding_scope – Should be either an atom name or a list of atom names that can be absorbed by the registered atom. If set to None, the atom will be the default one used for absorption operations if no other atom exists with a specific scope.
show()
update_atom(atom)
validation_tests()

Optional test cases to validate the correct behavior of the data model

Returns:True if the validation succeeds. False otherwise
Return type:bool
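
The usual way to leverage this class is to subclass it and implement build_data_model() (and optionally pre_build()). A minimal sketch, where the model name, file extension and node description are illustrative and the String value type is assumed from framework.value_types:

    from framework.data_model import DataModel
    from framework.node_builder import NodeBuilder
    from framework.value_types import String

    class MyDF_DataModel(DataModel):
        name = 'mydf'            # illustrative model name
        file_extension = 'df'    # raw files with this extension can be imported

        def build_data_model(self):
            # JSON-like description handed over to a NodeBuilder (cf. 13.2.5)
            desc = {'name': 'msg',
                    'contents': [
                        {'name': 'magic', 'contents': String(values=['MYDF'])},
                        {'name': 'body', 'contents': String(values=['hello', 'world'])}]}
            msg = NodeBuilder(dm=self).create_graph_from_desc(desc)
            self.register(msg)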
class framework.data_model.NodeBackend(data_model)

Bases: object

__init__(data_model)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.data_model'
atom_copy(orig_atom, new_name=None)
get_all_confs()
merge_with(node_backend)
prepare_atom(atom)
update_atom(atom, existing_env=False)

13.2.4. framework.node module

class framework.node.BitFieldCondition(sf, val=None, neg_val=None, gt_val=None, lt_val=None)

Bases: framework.node.NodeCondition

__init__(sf, val=None, neg_val=None, gt_val=None, lt_val=None)
Parameters:
  • sf (int/list of int) – subfield(s) of the BitField() on which the condition applies
  • val (int/list of int/list of list of int) – integer(s) that satisfy the condition(s)
  • neg_val (int/list of int/list of list of int) – integer(s) that do NOT satisfy the condition(s) (AND clause)
  • gt_val (int/list of int/list of list of int) – condition met if the subfield(s) are greater than or equal to the values in this field (AND clause)
  • lt_val (int/list of int/list of list of int) – condition met if the subfield(s) are less than or equal to the values in this field (AND clause)
__module__ = 'framework.node'
check(node)
class framework.node.DJobGroup(node_list)

Bases: object

__id__()
__init__(node_list)

x.__init__(…) initializes x; see help(type(x)) for signature

__iter__()
__module__ = 'framework.node'
__repr__() <==> repr(x)
__reversed__()
class framework.node.DynNode_Helpers

Bases: object

__copy__()
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
_get_graph_info()
_update_dyn_helper(env)
clear_graph_info_since(node)
graph_info
make_private(env=None)
reset_graph_info()
set_graph_info(node, info)
class framework.node.Env

Bases: object

__copy__()
__getattr__(name)
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
add_node_to_corrupt(node, corrupt_type=None, corrupt_op=<function <lambda>>)
cleanup_basic_djobs(prio)
cleanup_remaining_djobs(prio)
clear_all_exhausted_nodes()
clear_exhausted_node(node)
delayed_jobs_pending
djobs_exists(prio)
execute_basic_djobs(prio)
exhausted_node_exists()
exhausted_nodes_amount()
get_all_djob_groups(prio)
get_basic_djobs(prio)
get_data_model()
get_djobs_by_gid(group_id, prio)
get_exhausted_nodes()
is_djob_registered(key, prio)
is_empty()
is_node_exhausted(node)
knowledge_source = None
notify_exhausted_node(node)
register_basic_djob(func, args, prio=1)
register_djob(func, group, key, cleanup=None, args=None, prio=1)
remove_djob(group, key, prio)
remove_node_to_corrupt(node)
set_data_model(dm)
update_node_refs(node_dico, ignore_frozen_state)
class framework.node.Env4NT

Bases: object

Define methods for non-terminal nodes

__copy__()
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
clear_drawn_node_attrs(node_id)
get_drawn_node_qty(node_id)
get_drawn_node_sz(node_id)
is_empty()
node_exists(node_id)
reset()
set_drawn_node_attrs(node_id, nb, sz)
update_node_ids(id_list)
class framework.node.FuncCusto(items_to_set=None, items_to_clear=None, transform_func=None)

Bases: framework.node.NodeCustomization

Function node behavior customization. To be provided to NodeInternals.customize()

CloneExtNodeArgs = 2
FrozenArgs = 1
__module__ = 'framework.node'
_custo_items = {1: True, 2: False}
clone_ext_node_args_mode
frozen_args_mode
class framework.node.GenFuncCusto(items_to_set=None, items_to_clear=None, transform_func=None)

Bases: framework.node.NodeCustomization

Generator node behavior customization. To be provided to NodeInternals.customize()

CloneExtNodeArgs = 2
ForwardConfChange = 1
ResetOnUnfreeze = 3
TriggerLast = 4
__module__ = 'framework.node'
_custo_items = {1: True, 2: False, 3: True, 4: False}
clone_ext_node_args_mode
forward_conf_change_mode
reset_on_unfreeze_mode
trigger_last_mode
class framework.node.IntCondition(val=None, neg_val=None, gt_val=None, lt_val=None)

Bases: framework.node.NodeCondition

__init__(val=None, neg_val=None, gt_val=None, lt_val=None)
Parameters:
  • val (int/list of int) – integer(s) that satisfy the condition
  • neg_val (int/list of int) – integer(s) that do NOT satisfy the condition (AND clause)
  • gt_val (int) – condition met if greater than or equal to this value (AND clause)
  • lt_val (int) – condition met if less than or equal to this value (AND clause)
__module__ = 'framework.node'
check(node)
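
Conditions like IntCondition are typically referenced from a node description to make one node's existence depend on another node's value (e.g., through the 'exists_if' key handled by NodeBuilder, cf. 13.2.5). A hypothetical sketch, assuming the UINT8 and String value types from framework.value_types:

    from framework.node import IntCondition
    from framework.value_types import String, UINT8

    desc = {'name': 'pkt',
            'contents': [
                {'name': 'type', 'contents': UINT8(values=[1, 2, 3])},
                # 'payload' exists only if the node named 'type' holds the value 1 or 2
                {'name': 'payload',
                 'contents': String(values=['DATA']),
                 'exists_if': (IntCondition(val=[1, 2]), 'type')}]}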
class framework.node.Node(name, base_node=None, copy_dico=None, ignore_frozen_state=False, accept_external_entanglement=False, acceptance_set=None, subnodes=None, values=None, value_type=None, vt=None, new_env=False)

Bases: object

A Node is the basic building-block used within a graph-based data model.

internals

Contains all the configurations of a node. A configuration is associated with the internals/contents of a node, which can live independently of the other configurations.

Type:dict of str –> NodeInternals
current_conf

Identifier of a configuration. Every usable node uses at least one main configuration, namely 'MAIN'.

Type:str
name

Identifier of a node. Defined at instantiation. Shall be unique from its parent perspective.

Type:str
env

One environment object is added to all the nodes of a node graph when the latter is registered within a data model (cf. DataModel.register()). It is used for sharing global resources between nodes.

Type:Env
entangled_nodes

Collection of all the nodes entangled with this one. All the entangled nodes will react the same way as one of their peers (to some extent) if this peer is subjected to a stimulus. The node's properties related to entanglement are only the ones that directly define a node. For instance, changing a node's NodeInternals will propagate to its entangled peers but changing the state of a node's NodeInternals won't propagate. It is used for dealing with multiple instances of the same node (within the scope of a NonTerm node—cf. NodeInternals_NonTerm.get_subnodes_with_csts()). But this mechanism can also be used for your own specific purposes.

Type:set(Node)
semantics

(optional) Used to associate semantics with a node. Can be used during graph traversal in order to perform actions related to semantics.

Type:NodeSemantics
fuzz_weight

The fuzz weight is an optional attribute of Node() which expresses the Data Model designer's hints for prioritizing the nodes to fuzz. If set, this attribute is used by some generic disruptors (the ones that rely on a ModelWalker object—refer to fuzzing_primitives.py)

Type:int
depth

Depth of the node within the graph from a specific given root. Will be computed lazily (only when requested).

Type:int
tmp_ref_count

(internal use) Temporarily used during the creation of multiple instances of the same node, especially in order to generate unique names.

Type:int
_post_freeze_handler

Is executed just after a node is frozen (which is the result of requesting its value when it is not frozen—e.g., at its creation).

Type:function
CORRUPT_EXIST_COND = 5
CORRUPT_NODE_QTY = 7
CORRUPT_QTY_SYNC = 6
CORRUPT_SIZE_SYNC = 8
DEFAULT_DISABLED_NODEINT = <framework.node.NodeInternals_Empty object>
DEFAULT_DISABLED_VALUE = ''
DJOBS_PRIO_dynhelpers = 200
DJOBS_PRIO_genfunc = 300
DJOBS_PRIO_nterm_existence = 100
_Node__check_conf(conf)
_Node__get_confs()
_Node__get_current_internals()
_Node__get_internals()
_Node__set_current_internals(internal)
__copy__()
__getattr__(name)
__getitem__(key)
__hash__() <==> hash(x)
__init__(name, base_node=None, copy_dico=None, ignore_frozen_state=False, accept_external_entanglement=False, acceptance_set=None, subnodes=None, values=None, value_type=None, vt=None, new_env=False)
Parameters:
  • name (str) – Name of the node. Every child node of a node shall have a unique name. Useful to look for specific nodes within a graph.
  • subnodes (list) – (Optional) List of subnodes. If provided the Node will be created as a non-terminal node.
  • values (list) – (Optional) List of strings. If provided the instantiated node will be a String-typed leaf node (taking its possible values from the parameter).
  • value_type (VT) – (Optional) The value type that characterize the node. Defined within value_types.py and inherits from either VT or VT_Alt. If provided the instantiated node will be a value_type-typed leaf node.
  • vt (VT) – alias to value_type.
  • base_node (Node) – (Optional) If provided, it will be used as a template to create the new node.
  • ignore_frozen_state (bool) – [If base_node provided] If True, the clone process of base_node will ignore its current state.
  • accept_external_entanglement (bool) – [If base_node provided] If True, during the cloning process of base_node, every entangled node outside the current graph will be referenced within the new node without being copied. Otherwise, a Warning message will be raised.
  • acceptance_set (set) – [If base_node provided] If provided, will be used as a set of entangled nodes that could be referenced within the new node during the cloning process.
  • copy_dico (dict) – [If base_node provided] It is used internally during the cloning process, and should not be used for any functional purpose.
  • new_env (bool) – [If base_node provided] If True, the Env() attached to base_node will be copied. Otherwise, the same one will be used. If ignore_frozen_state is True, a new Env() will be used.
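
A construction sketch based on the parameters above (UINT16_be is assumed from framework.value_types; the explicit Env() attachment follows set_env() below):

    from framework.node import Env, Node
    from framework.value_types import UINT16_be

    status = Node('status', values=['OK', 'KO'])      # String-typed leaf node (values=)
    length = Node('length', value_type=UINT16_be())   # value_type-typed leaf node
    root = Node('root', subnodes=[length, status])    # non-terminal node

    root.set_env(Env())   # a node graph shares a single Env() (cf. set_env()/DataModel.register())
    print(root.to_bytes())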
__lt__(other)

x.__lt__(y) <==> x<y

__module__ = 'framework.node'
__setitem__(key, val)
__str__() <==> str(x)
_compute_confs(conf, recursive)
_finalize_nonterm_node(conf, depth=None)
_get_all_paths_rec(pname, htable, conf, recursive, first=True, clone_idx=0)
_get_value(conf=None, recursive=True, return_node_internals=False)
_post_freeze(node_internals, wrapping_node)
static _print(msg, rgb, style='', nl=True, log_func=<built-in method write of file object>, pretty_print=True)
static _print_contents(msg, style='', nl=True, log_func=<built-in method write of file object>, pretty_print=True)
static _print_name(msg, style='', nl=True, log_func=<built-in method write of file object>, pretty_print=True)
static _print_nonterm(msg, style='\x1b[1m', nl=True, log_func=<built-in method write of file object>, pretty_print=True)
static _print_raw(msg, style='', nl=True, hlight=False, log_func=<built-in method write of file object>, pretty_print=True)
static _print_type(msg, style='\x1b[1m', nl=True, log_func=<built-in method write of file object>, pretty_print=True)
_reset_depth(parent_depth)
_set_clone_info(info, node)

Used to propagate random draw results when a NonTerm node is frozen to the dynamic nodes of its attached subgraphs, namely GenFunc/Func nodes which are the only ones which can act dynamically.

_set_subtrees_current_conf(node, conf, reverse, ignore_entanglement=False)
_tobytes(conf=None, recursive=True)
absorb(blob, constraints=AbsCsts(), conf=None, pending_postpone_desc=None)
add_conf(conf)
c

Property linked to self.internals (read only)

cc

Property linked to the current node’s internals (read / write)

clear_attr(name, conf=None, all_conf=False, recursive=False)
compliant_with(internals_criteria=None, semantics_criteria=None, conf=None)
conf(conf=None)
confs

Property giving all node’s configurations (read only)

enforce_absorb_constraints(csts, conf=None)
entangle_with(node)
static filter_out_entangled_nodes(node_list)
fix_synchronized_nodes(conf=None)
freeze(conf=None, recursive=True, return_node_internals=False)
gather_alt_confs()
get_all_paths(conf=None, recursive=True, depth_min=None, depth_max=None)
Returns:
the keys are either a 'path' or a tuple ('path', int) when the path already exists (case of the same node used more than once within the same non-terminal)
Return type:dict
get_all_paths_from(node, conf=None)
get_clone(name=None, ignore_frozen_state=False, new_env=True)

Create a new node. To be used within a graph-based data model.

Parameters:
  • name (str) – name of the new Node instance. If None the current name will be used.
  • ignore_frozen_state (bool) – if set to False, the clone function will produce a Node with the same state as the duplicated Node. Otherwise, only the state won't be kept.
  • new_env (bool) – If True, the current Env() will be copied. Otherwise, the same will be used.
Returns:

duplicated Node object

Return type:

Node

get_current_conf()
get_env()
get_fuzz_weight()

Return the fuzzing weight of the node.

Returns:the fuzzing weight
Return type:int
get_internals_backup()
get_node_by_path(path_regexp=None, path=None, conf=None)

The set of nodes that is used to perform the search includes the node itself and all the subnodes behind it.

get_nodes_names(conf=None, verbose=False, terminal_only=False)
get_path_from(node, conf=None)
get_private(conf=None)
get_reachable_nodes(internals_criteria=None, semantics_criteria=None, owned_conf=None, conf=None, path_regexp=None, exclude_self=False, respect_order=False, relative_depth=-1, top_node=None, ignore_fstate=False)
get_semantics()
get_value(conf=None, recursive=True, return_node_internals=False)
is_attr_set(name, conf=None)
is_conf_existing(conf)
is_empty(conf=None)
is_exhausted(conf=None)
is_frozen(conf=None)
is_func(conf=None)
is_genfunc(conf=None)
is_nonterm(conf=None)
is_term(conf=None)
is_typed_value(conf=None, subkind=None)
iter_paths(conf=None, recursive=True, depth_min=None, depth_max=None, only_paths=False)
make_determinist(conf=None, all_conf=False, recursive=False)
make_empty(conf=None)
make_finite(conf=None, all_conf=False, recursive=False)
make_infinite(conf=None, all_conf=False, recursive=False)
make_random(conf=None, all_conf=False, recursive=False)
make_synchronized_with(scope, node=None, param=None, sync_obj=None, conf=None)
pretty_print(max_size=None, conf=None)
register_post_freeze_handler(func)
remove_conf(conf)
reset_fuzz_weight(recursive=False)

Reset the fuzzing weight associated with this node to its standard value (1), and do the same for all its subnodes if the recursive parameter is set to True.

Parameters:recursive (bool) – if set to True, also reset every subnode (all nodes reachable from this one).
Returns:None
reset_state(recursive=False, exclude_self=False, conf=None, ignore_entanglement=False)
set_absorb_helper(helper, conf=None)
set_attr(name, conf=None, all_conf=False, recursive=False)
set_contents(base_node, copy_dico=None, ignore_frozen_state=False, accept_external_entanglement=False, acceptance_set=None, preserve_node=True)

Set the contents of the node based on the one provided within base_node. This method performs a deep copy of base_node, but some parameters can change the behavior of the copy.

Note

Python deepcopy() is not used for performance reasons (it is 10 to 20 times slower).

Parameters:
  • base_node (Node) – (Optional) Used as a template to create the new node.
  • ignore_frozen_state (bool) – If True, the clone process of base_node will ignore its current state.
  • preserve_node (bool) – preserve the NodeInternals attributes (those that make sense to preserve) of the NodeInternals that may be overwritten.
  • accept_external_entanglement (bool) – If True, during the cloning process of base_node, every entangled node outside the current graph will be referenced within the new node without being copied. Otherwise, a Warning message will be raised.
  • acceptance_set (set) – If provided, will be used as a set of entangled nodes that could be referenced within the new node during the cloning process.
  • copy_dico (dict) – It is used internally during the cloning process, and should not be used for any functional purpose.
Returns:

For each subnode of base_node (keys), a reference to the corresponding subnode within the new node.

Return type:

dict

set_current_conf(conf, recursive=True, reverse=False, root_regexp=None, ignore_entanglement=False)
set_env(env)
set_frozen_value(value, conf=None)
set_func(func, func_node_arg=None, func_arg=None, conf=None, ignore_entanglement=False, provide_helpers=False, preserve_node=True)
set_fuzz_weight(w)

Set the fuzzing weight of the node to w.

The fuzz weight is an optional attribute of Node() which expresses the Data Model designer's hints for prioritizing the nodes to fuzz. If set, this attribute is used by some generic disruptors (the ones that rely on a ModelWalker object—refer to fuzzing_primitives.py)

Parameters:w (int) – Value of the weight (by default every node has a weight of 1)
Returns:None
set_generator_func(gen_func, func_node_arg=None, func_arg=None, conf=None, ignore_entanglement=False, provide_helpers=False, preserve_node=True)
set_internals(backup)
set_private(val, conf=None)
set_semantics(sem)
set_size_from_constraints(size=None, encoded_size=None, conf=None)
set_subnodes_basic(node_list, conf=None, ignore_entanglement=False, separator=None, preserve_node=True)
set_subnodes_full_format(subnodes_order, subnodes_attrs, conf=None, separator=None, preserve_node=True)
set_subnodes_with_csts(wlnode_list, conf=None, ignore_entanglement=False, separator=None, preserve_node=True)
set_values(values=None, value_type=None, conf=None, ignore_entanglement=False, preserve_node=True)
show(conf=None, verbose=True, print_name_func=None, print_contents_func=None, print_raw_func=None, print_nonterm_func=None, print_type_func=None, alpha_order=False, raw_limit=None, log_func=<built-in method write of file object>, pretty_print=True, display_title=True, display_gen_node=True)
synchronized_with(scope, conf=None)
to_ascii(conf=None, recursive=True)
to_bytes(conf=None, recursive=True)
to_str(conf=None, recursive=True)
unfreeze(conf=None, recursive=True, dont_change_state=False, ignore_entanglement=False, only_generators=False, reevaluate_constraints=False)
unfreeze_all(recursive=True, ignore_entanglement=False)
class framework.node.NodeAbstraction

Bases: object

This class can be used in place of a node_arg for Func and GenFunc Nodes. It enables you to define higher-level classes on top of Nodes in your data model, to facilitate Node manipulation within Func and GenFunc Nodes, with regard to your data model paradigm.

__module__ = 'framework.node'
get_concrete_nodes()

Shall return a Node or a list of Nodes

make_private()

This method is called during Node copy process. It aims to make all your metadata private (if needed). Note that you don’t have to deal with your Nodes.

set_concrete_nodes(nodes_args)

Shall save a Node or a list of Nodes (depending on what get_concrete_nodes() returns)

class framework.node.NodeCondition

Bases: object

Base class for every node-related condition. (Note that a NodeCondition may be copied many times. If some attributes need to be fully copied, handle this by overriding __copy__().)

__module__ = 'framework.node'
_check_inclusion(curr_val, val=None, neg_val=None)
_check_int(val, gt_val=None, lt_val=None)
check(node)
class framework.node.NodeCustomization(items_to_set=None, items_to_clear=None, transform_func=None)

Bases: object

Base class for node customization

__copy__()
__getitem__(key)
__init__(items_to_set=None, items_to_clear=None, transform_func=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
_custo_items = {}
transform_func
class framework.node.NodeInternals(arg=None)

Bases: object

Base class for implementing the contents of a node.

Abs_Postpone = 6
DEBUG = 30
DISABLED = 100
Determinist = 3
Finite = 4
Freezable = 1
LOCKED = 50
Mutable = 2
Separator = 15
__init__(arg=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
_clear_attr_direct(name)
_get_value(conf=None, recursive=True, return_node_internals=False)
_init_specific(arg)
_make_private_specific(ignore_frozen_state, accept_external_entanglement)
_make_specific(name)
_match_mandatory_attrs(criteria)
_match_mandatory_custo(criteria)
_match_negative_attrs(criteria)
_match_negative_custo(criteria)
_match_negative_node_kinds(criteria)
_match_negative_node_subkinds(criteria)
_match_node_constraints(criteria)
_match_node_kinds(criteria)
_match_node_subkinds(criteria)
_set_attr_direct(name)
_unmake_specific(name)
_update_node_refs(node_dico, debug)
absorb(blob, constraints, conf, pending_postpone_desc=None)
clear_attr(name)
clear_clone_info_since(node)

Clean up obsolete graph internals information registered prior to the node given as parameter.

customize(custo)
default_custo = None
enforce_absorb_constraints(csts)
env
get_attrs_copy()
get_current_subkind()
get_node_sync(scope)
get_private()
get_raw_value(**kwargs)
has_subkinds()
is_attr_set(name)
is_exhausted()
is_frozen()
make_private(ignore_frozen_state, accept_external_entanglement, delayed_node_internals, forget_original_sync_objs=False)
match(internals_criteria)
pretty_print(max_size=None)
reset_depth_specific(depth)
set_absorb_helper(helper)
set_attr(name)
set_attrs_from(all_attrs)
set_clone_info(info, node)

Report to Node._set_clone_info() some information about graph internals

set_contents_from(node_internals)
set_node_sync(scope, node=None, param=None, sync_obj=None)
set_private(val)
set_size_from_constraints(size, encoded_size)
synchronize_nodes(src_node)
class framework.node.NodeInternalsCriteria(mandatory_attrs=None, negative_attrs=None, node_kinds=None, negative_node_kinds=None, node_subkinds=None, negative_node_subkinds=None, mandatory_custo=None, negative_custo=None, required_csts=None, negative_csts=None)

Bases: object

__init__(mandatory_attrs=None, negative_attrs=None, node_kinds=None, negative_node_kinds=None, node_subkinds=None, negative_node_subkinds=None, mandatory_custo=None, negative_custo=None, required_csts=None, negative_csts=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
clear_node_constraint(cst)
extend(ic)
get_all_node_constraints()
get_node_constraint(cst)
has_node_constraints()
set_node_constraint(cst, required)
class framework.node.NodeInternals_Empty(arg=None)

Bases: framework.node.NodeInternals

__module__ = 'framework.node'
_get_value(conf=None, recursive=True, return_node_internals=False)
get_raw_value(**kwargs)
set_child_env(env)
class framework.node.NodeInternals_Func(arg=None)

Bases: framework.node.NodeInternals_Term

_NodeInternals_Func__get_value_specific_mode1(conf, recursive)

In mode1, we freeze ‘node_arg’ attribute and give the value to the function

_NodeInternals_Func__get_value_specific_mode2(conf, recursive)

In mode2, we give the ‘node_arg’ to the function and let it do whatever it wants

__module__ = 'framework.node'
_get_value_specific(conf, recursive)
_init_specific(arg)
_make_private_term_specific(ignore_frozen_state, accept_external_entanglement)
_reset_state_specific(recursive, exclude_self, conf, ignore_entanglement)
_unfreeze_reevaluate_constraints(current_val)
_unfreeze_without_state_change(current_val)
absorb(blob, constraints, conf, pending_postpone_desc=None)
cancel_absorb()
clear_clone_info_since(node)

Clean up obsolete graph internals information registered prior to the node given as parameter.

confirm_absorb()
customize(custo)
default_custo = <framework.node.FuncCusto object>
get_node_args()
import_func(fct, fct_node_arg=None, fct_arg=None, provide_helpers=False)
make_args_private(node_dico, entangled_set, ignore_frozen_state, accept_external_entanglement)
set_clone_info(info, node)

Report to Node._set_clone_info() some information about graph internals

set_func_arg(node=None, fct_arg=None)
set_size_from_constraints(size, encoded_size)
class framework.node.NodeInternals_GenFunc(arg=None)

Bases: framework.node.NodeInternals

__getattr__(name)
__module__ = 'framework.node'
_get_delayed_value(conf=None, recursive=True)
_get_value(conf=None, recursive=True, return_node_internals=False)
_init_specific(arg)
_make_private_specific(ignore_frozen_state, accept_external_entanglement)
_make_specific(name)
_unmake_specific(name)
absorb(blob, constraints, conf, pending_postpone_desc=None)
cancel_absorb()
clear_child_attr(name, conf=None, all_conf=False, recursive=False)
clear_clone_info_since(node)

Clean up obsolete graph internals information registered prior to the node given as parameter.

confirm_absorb()
default_custo = <framework.node.GenFuncCusto object>
env
generated_node
get_child_all_path(name, htable, conf, recursive)
get_child_nodes_by_attr(internals_criteria, semantics_criteria, owned_conf, conf, path_regexp, exclude_self, respect_order, relative_depth, top_node, ignore_fstate)
get_node_args()
get_raw_value(**kwargs)
import_generator_func(generator_func, generator_node_arg=None, generator_arg=None, provide_helpers=False)
is_exhausted()
is_frozen()
make_args_private(node_dico, entangled_set, ignore_frozen_state, accept_external_entanglement)
reset_depth_specific(depth)
reset_fuzz_weight(recursive)
reset_generator()
reset_state(recursive=False, exclude_self=False, conf=None, ignore_entanglement=False)
set_child_attr(name, conf=None, all_conf=False, recursive=False)
set_child_current_conf(node, conf, reverse, ignore_entanglement)
set_child_env(env)
set_clone_info(info, node)

Report to Node._set_clone_info() some information about graph internals

set_generator_func_arg(generator_node_arg=None, generator_arg=None)
set_size_from_constraints(size, encoded_size)
unfreeze(conf=None, recursive=True, dont_change_state=False, ignore_entanglement=False, only_generators=False, reevaluate_constraints=False)
unfreeze_all(recursive=True, ignore_entanglement=False)
class framework.node.NodeInternals_NonTerm(arg=None)

Bases: framework.node.NodeInternals

It is a kind of node internals that enables structuring the graph through a specific grammar…

INFINITY_LIMIT = 30
_NodeInternals_NonTerm__iter_csts(node_list)
_NodeInternals_NonTerm__iter_csts_verbose(node_list)
__module__ = 'framework.node'
static _cleanup_delayed_nodes(node, node_list, idx, conf, rec)
_cleanup_entangled_nodes()
_cleanup_entangled_nodes_from(node)
_clear_drawn_node_attrs(node)
_clone_node(base_node, node_no, force_clone=False, ignore_frozen_state=True)
_clone_node_cleanup()
_clone_separator(sep_node, unique, force_clone=False, ignore_frozen_state=True)
_clone_separator_cleanup()
_construct_subnodes(node_desc, subnode_list, mode, ignore_sep_fstate, ignore_separator=False, lazy_mode=True)
_copy_nodelist(node_list)
static _existence_from_node(node)
static _expand_delayed_nodes(node, node_list, idx, conf, rec)
_generate_expanded_nodelist(node_list, determinist=True)
_get_heavier_component(comp_list, check_existence=False)
_get_info_from_subnode_description(node_desc)
static _get_next_heavier_component(comp_list, excluded_idx=[])
static _get_next_random_component(comp_list, excluded_idx=[], seed=None)
_get_node_and_attrs_from(node_desc)
_get_node_from(node_desc)
_get_random_component(comp_list, total_weight, check_existence=False)
_get_value(conf=None, recursive=True, after_encoding=True, return_node_internals=False)

The parameter return_node_internals is not used for non-terminal nodes, only for terminal nodes. However, keeping it also for non-terminal nodes avoids additional checks in the code.

_init_specific(arg)
_make_private_specific(ignore_frozen_state, accept_external_entanglement)
_make_specific(name)
_parse_node_desc(node_desc)
_precondition_subnode_ops()
static _qty_from_node(node)
_reset_state_info(new_info=None, nodes_drawn_qty=None)
_set_drawn_node_attrs(node, nb, sz)
static _size_from_node(node, for_encoded_size=False)
_unmake_specific(name)
absorb(blob, constraints, conf, pending_postpone_desc=None)
TOFIX: Checking an existence condition independently from the data description order is not supported; it is only supported within the same non-terminal node. Use the delayed job infrastructure to cover all cases (TBC).
cancel_absorb()
change_subnodes_csts(csts_ch)
clear_child_attr(name, conf=None, all_conf=False, recursive=False)
clear_clone_info_since(node)

Clean up obsolete graph internals information registered prior to the node given as parameter.

confirm_absorb()
default_custo = <framework.node.NonTermCusto object>
static existence_corrupt_hook(node, exist)
get_child_all_path(name, htable, conf, recursive)
get_child_nodes_by_attr(internals_criteria, semantics_criteria, owned_conf, conf, path_regexp, exclude_self, respect_order, relative_depth, top_node, ignore_fstate)
get_drawn_node_qty(node_ref)
get_raw_value(**kwargs)
get_separator_node()
get_subnode(num)
get_subnode_idx(node)
get_subnode_minmax(node)
get_subnode_off(num)
get_subnode_qty()
get_subnodes_collection()
get_subnodes_csts_copy(node_dico=None)
get_subnodes_with_csts()

Generate the structure of the non-terminal node.

import_subnodes_basic(node_list, separator=None, preserve_node=False)
import_subnodes_full_format(subnodes_order=None, subnodes_attrs=None, frozen_node_list=None, internals=None, nodes_drawn_qty=None, custo=None, exhaust_info=None, separator=None)
import_subnodes_with_csts(wlnode_list, separator=None, preserve_node=False)
is_exhausted()
is_frozen()
make_private_subnodes(node_dico, func_nodes, env, ignore_frozen_state, accept_external_entanglement, entangled_set, delayed_node_internals)
static nodeqty_corrupt_hook(node, mini, maxi)
static qtysync_corrupt_hook(node, qty)
replace_subnode(old, new)
reset(nodes_drawn_qty=None, custo=None, exhaust_info=None, preserve_node=False)
reset_depth_specific(depth)
reset_fuzz_weight(recursive)
reset_state(recursive=False, exclude_self=False, conf=None, ignore_entanglement=False)
set_child_attr(name, conf=None, all_conf=False, recursive=False)
set_child_current_conf(node, conf, reverse, ignore_entanglement)
set_child_env(env)
set_clone_info(info, node)

Report to Node._set_clone_info() some information about graph internals

set_encoder(encoder)
set_separator_node(sep_node, prefix=True, suffix=True, unique=False)
set_size_from_constraints(size, encoded_size)
set_subnode_minmax(node, min=None, max=None)
static sizesync_corrupt_hook(node, length)
structure_will_change()

To be used only in Finite mode. Return True if the structure will change the next time _get_value() is called.

Returns: bool

unfreeze(conf=None, recursive=True, dont_change_state=False, ignore_entanglement=False, only_generators=False, reevaluate_constraints=False)
unfreeze_all(recursive=True, ignore_entanglement=False)
class framework.node.NodeInternals_Term(arg=None)

Bases: framework.node.NodeInternals

__module__ = 'framework.node'
static _convert_to_internal_repr(val)
_get_value(conf=None, recursive=True, return_node_internals=False)
_get_value_specific(conf, recursive)
_init_specific(arg)
_make_private_specific(ignore_frozen_state, accept_external_entanglement)
_make_private_term_specific(ignore_frozen_state, accept_external_entanglement)
_reset_state_specific(recursive, exclude_self, conf, ignore_entanglement)
_set_frozen_value(val)
_unfreeze_reevaluate_constraints(current_val)
_unfreeze_without_state_change(current_val)
absorb(blob, constraints, conf, pending_postpone_desc=None)
absorb_auto_helper(blob, constraints)
cancel_absorb()
clear_child_attr(name, conf=None, all_conf=False, recursive=False)
confirm_absorb()
do_absorb(blob, constraints, off, size)
do_cleanup_absorb()
do_revert_absorb()
get_child_all_path(name, htable, conf, recursive)
get_child_nodes_by_attr(internals_criteria, semantics_criteria, owned_conf, conf, path_regexp, exclude_self, respect_order, relative_depth, top_node, ignore_fstate)
get_raw_value(**kwargs)
is_exhausted()
is_frozen()
reset_depth_specific(depth)
reset_fuzz_weight(recursive)
reset_state(recursive=False, exclude_self=False, conf=None, ignore_entanglement=False)
set_child_attr(name, conf=None, all_conf=False, recursive=False)
set_child_current_conf(node, conf, reverse, ignore_entanglement)
set_child_env(env)
unfreeze(conf=None, recursive=True, dont_change_state=False, ignore_entanglement=False, only_generators=False, reevaluate_constraints=False)
unfreeze_all(recursive=True, ignore_entanglement=False)
class framework.node.NodeInternals_TypedValue(arg=None)

Bases: framework.node.NodeInternals_Term

__getattr__(name)
__module__ = 'framework.node'
_get_value_specific(conf=None, recursive=True)
_init_specific(arg)
_make_private_term_specific(ignore_frozen_state, accept_external_entanglement)
_make_specific(name)
_reset_state_specific(recursive, exclude_self, conf, ignore_entanglement)
_unfreeze_reevaluate_constraints(current_val)
_unfreeze_without_state_change(current_val)
_unmake_specific(name)
absorb_auto_helper(blob, constraints)
do_absorb(blob, constraints, off, size)
do_cleanup_absorb()
do_revert_absorb()
get_current_subkind()
get_raw_value(**kwargs)
get_specific_fuzzy_values()
get_value_type()
has_subkinds()
import_value_type(value_type)
is_exhausted()
pretty_print(max_size=None)
set_size_from_constraints(size, encoded_size)
set_specific_fuzzy_values(vals)
class framework.node.NodeSemantics(attrs=[])

Bases: object

To be used while defining a data model as a means to associate semantics with a Node.

__init__(attrs=[])

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
_match_exclusive_criteria(criteria)
_match_mandatory_criteria(criteria)
_match_negative_criteria(criteria)
_match_optionalbut1_criteria(criteria)
add_attributes(attrs)
make_private()

This method is called during Node copy process. It aims to make all your metadata private (if needed).

match(semantics_criteria)

This method is called within get_reachable_nodes() (when the 'semantics' parameter is provided) to select Nodes that match the given semantics.

what_match_from(raw_criteria_list)
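
A sketch of attaching semantics to a node and selecting it back through criteria (cf. Node.get_reachable_nodes()):

    from framework.node import Env, Node, NodeSemantics, NodeSemanticsCriteria

    crc = Node('crc', values=['\x00\x00\x00\x00'])
    crc.set_semantics(NodeSemantics(['integrity', 'crc32']))  # attach semantic attributes

    root = Node('root', subnodes=[crc, Node('data', values=['payload'])])
    root.set_env(Env())

    crit = NodeSemanticsCriteria(mandatory_criteria=['crc32'])
    matching = root.get_reachable_nodes(semantics_criteria=crit)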
class framework.node.NodeSemanticsCriteria(optionalbut1_criteria=None, mandatory_criteria=None, exclusive_criteria=None, negative_criteria=None)

Bases: object

__init__(optionalbut1_criteria=None, mandatory_criteria=None, exclusive_criteria=None, negative_criteria=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
extend(sc)
get_exclusive_criteria()
get_mandatory_criteria()
get_negative_criteria()
get_optionalbut1_criteria()
set_exclusive_criteria(criteria)
set_mandatory_criteria(criteria)
set_negative_criteria(criteria)
set_optionalbut1_criteria(criteria)
class framework.node.NodeSeparator(node, prefix=True, suffix=True, unique=False)

Bases: object

A node separator is used (optionally) by a non-terminal node as a separator between each subnode.

make_private

used for full copy

Type:function
__init__(node, prefix=True, suffix=True, unique=False)
Parameters:
  • node (Node) – node to be used for separation.
  • prefix (bool) – if True, a separator will also be placed at the beginning.
  • suffix (bool) – if True, a separator will also be placed at the end.
  • unique (bool) – if False, the same node will be used for each separation, otherwise a new node will be generated.
__module__ = 'framework.node'
make_private(node_dico, ignore_frozen_state)
class framework.node.NonTermCusto(items_to_set=None, items_to_clear=None, transform_func=None)

Bases: framework.node.NodeCustomization

Non-terminal node behavior customization. To be provided to NodeInternals.customize()

CollapsePadding = 3
FrozenCopy = 2
MutableClone = 1
__module__ = 'framework.node'
_custo_items = {1: True, 2: True, 3: False}
collapse_padding_mode
frozen_copy_mode
mutable_clone_mode
class framework.node.RawCondition(val=None, neg_val=None, cond_func=None)

Bases: framework.node.NodeCondition

__init__(val=None, neg_val=None, cond_func=None)
Parameters:
  • val (bytes/list of bytes) – value(s) that satisfy the condition
  • neg_val (bytes/list of bytes) – value(s) that do NOT satisfy the condition (AND clause)
  • cond_func – function that takes the node value and returns a boolean
__module__ = 'framework.node'
_handle_cond(val)
check(node)
class framework.node.SyncExistenceObj(sync_list, and_junction=True)

Bases: framework.node.SyncObj

__init__(sync_list, and_junction=True)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
_condition_satisfied(node, condition)
check()
get_node_containers()

Shall return either a Node, a list of Nodes, or a list of (Node, param) where param should provide a __copy__ method if needed.

put_node_containers(new_containers)

This method will be called to provide updated containers that should replace the old ones.

Parameters:new_containers – the updated containers
class framework.node.SyncObj

Bases: object

__module__ = 'framework.node'
_sync_nodes_specific(src_node)
get_node_containers()

Shall return either a Node, a list of Nodes, or a list of (Node, param) where param should provide a __copy__ method if needed.

make_private(node_dico)
put_node_containers(new_containers)

This method will be called to provide updated containers that should replace the old ones.

Parameters:new_containers – the updated containers
synchronize_nodes(src_node)
class framework.node.SyncQtyFromObj(node, base_qty=0)

Bases: framework.node.SyncObj

__init__(node, base_qty=0)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
get_node_containers()

Shall return either a Node, a list of Nodes, or a list of (Node, param) where param should provide a __copy__ method if needed.

put_node_containers(new_containers)

This method will be called to provide updated containers that should replace the old ones.

Parameters:new_containers – the updated containers
qty
class framework.node.SyncScope

Bases: enum.Enum

Existence = 10
Inexistence = 11
Qty = 1
QtyFrom = 2
Size = 20
__module__ = 'framework.node'
class framework.node.SyncSizeObj(node, base_size=0, apply_to_enc_size=False)

Bases: framework.node.SyncObj

__init__(node, base_size=0, apply_to_enc_size=False)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.node'
_sync_nodes_specific(src_node)
get_node_containers()

Shall return either a Node, a list of Nodes, or a list of (Node, param) where param should provide a __copy__ method if needed.

put_node_containers(new_containers)

This method will be called to provide updated containers that should replace the old ones.

Parameters:new_containers – the updated containers
set_size_on_source_node(size)
size_for_absorption
framework.node.flatten(nested)
framework.node.make_entangled_nodes(node_list)
framework.node.make_wrapped_node(name, vals=None, node=None, prefix=None, suffix=None, key_node_name='KEY_ELT')
framework.node.split_verbose_with(predicate, iterable)
framework.node.split_with(predicate, iterable)

13.2.5. framework.node_builder module

class framework.node_builder.NodeBuilder(dm=None, delayed_jobs=True, add_env=True)

Bases: object

HIGH_PRIO = 1
LOW_PRIO = 3
MEDIUM_PRIO = 2
VERYLOW_PRIO = 4
_NodeBuilder__get_node_from_db(name_desc)
_NodeBuilder__handle_clone(desc, parent_node)
_NodeBuilder__post_handling(desc, node)
_NodeBuilder__pre_handling(desc, node)
__init__(dm=None, delayed_jobs=True, add_env=True)

Help the process of data description. This class is able to construct a framework.data_model.Node object from a JSON-like description.

Parameters:
  • dm (DataModel) – a DataModel object, only required if the ‘import_from’ statement is used with create_graph_from_desc().
  • delayed_jobs (bool) – Enable or disable the delayed jobs feature. Used for instance to delay constraints that cannot be solved immediately.
  • add_env (bool) – If True, a framework.data_model.Env object will be assigned to the framework.data_model.Node generated by create_graph_from_desc(). Should be set to False if you consider using the generated Node within another description or if you will copy it to build a new node type. Keeping an Env() object can be dangerous if you make clones of it and do not pay attention to setting a new Env() for each copy. A node graph SHALL have only one Env() shared between all its nodes, and an Env() shall not be shared between independent graphs (otherwise it could lead to unexpected results).
__module__ = 'framework.node_builder'
_clone_from_dict(node, ref, desc)
_complete_func(node, args, conf)
_complete_generator(node, args, conf)
_complete_generator_from_desc(node, args, conf)
_create_generator_node(desc, node=None)
_create_graph_from_desc(desc, parent_node)
_create_leaf_node(desc, node=None)
_create_nodes_from_shape(shapes, parent_node, shape_type='>', dup_mode='u')
_create_non_terminal_node(desc, node=None)
_create_non_terminal_node_from_regex(desc, node=None)
_create_todo_list()
_get_from_dict(node, ref, parent_node)
_handle_common_attr(node, desc, conf)
_handle_custo(node, desc, conf)
_handle_name(name_desc)
_register_todo(node, func, args=None, unpack_args=True, prio=2)
_set_env(node, args)
_set_sync_node(node, comp, scope, conf, private)
_update_provided_node(desc, node=None)
_verify_keys_conformity(desc)
create_graph_from_desc(desc)
valid_keys = ['name', 'contents', 'qty', 'clone', 'type', 'alt', 'conf', 'custo_set', 'custo_clear', 'evolution_func', 'weight', 'shape_type', 'section_type', 'duplicate_mode', 'weights', 'separator', 'prefix', 'suffix', 'unique', 'encoder', 'node_args', 'other_args', 'provide_helpers', 'trigger_last', 'specific_fuzzy_vals', 'import_from', 'data_id', 'determinist', 'random', 'finite', 'infinite', 'mutable', 'clear_attrs', 'set_attrs', 'absorb_csts', 'absorb_helper', 'semantics', 'fuzz_weight', 'sync_qty_with', 'qty_from', 'exists_if', 'exists_if_not', 'exists_if/and', 'exists_if/or', 'sync_size_with', 'sync_enc_size_with', 'post_freeze', 'charset', 'debug']
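
A minimal sketch of building a graph from a JSON-like description using a few of the keys listed above (the String and UINT8 value types and the exact 'qty' format are assumptions):

    from framework.node_builder import NodeBuilder
    from framework.value_types import String, UINT8

    desc = {'name': 'ex',
            'contents': [
                {'name': 'cmd', 'contents': UINT8(values=[0x10, 0x20])},
                {'name': 'sep', 'contents': String(values=[':'])},
                {'name': 'arg', 'contents': String(values=['on', 'off']), 'qty': (1, 3)}]}

    nb = NodeBuilder(add_env=True)
    ex_node = nb.create_graph_from_desc(desc)
    print(ex_node.to_bytes())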
class framework.node_builder.RegexParser(machine=None)

Bases: framework.node_builder.StateMachine

class Brackets(machine=None)

Bases: framework.node_builder.StateMachine, framework.node_builder.QtyState

class Comma(machine)

Bases: framework.node_builder.Max

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
class Final(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(context)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
INITIAL = False
class Initial(machine)

Bases: framework.node_builder.State

INITIAL = True
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Max(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(context)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Min(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(context)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
__module__ = 'framework.node_builder'
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Choice(machine)

Bases: framework.node_builder.Initial

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
class Dot(machine)

Bases: framework.node_builder.Group

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
class Escape(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class EscapeMetaSequence(machine)

Bases: framework.node_builder.Group

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
class Final(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Group(machine)

Bases: framework.node_builder.State

__module__ = 'framework.node_builder'
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Initial(machine)

Bases: framework.node_builder.State

INITIAL = True
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Main(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Parenthesis(machine=None)

Bases: framework.node_builder.StateMachine, framework.node_builder.Group

class Choice(machine)

Bases: framework.node_builder.Initial

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Escape(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Final(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(context)
advance(context)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
INITIAL = False
class Initial(machine)

Bases: framework.node_builder.State

INITIAL = True
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Main(machine)

Bases: framework.node_builder.Initial

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
__module__ = 'framework.node_builder'
class QtyState(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class SquareBrackets(machine=None)

Bases: framework.node_builder.StateMachine, framework.node_builder.Group

class AfterRange(machine)

Bases: framework.node_builder.Initial

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class BeforeRange(machine)

Bases: framework.node_builder.Initial

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class EscapeAfterRange(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class EscapeBeforeRange(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class EscapeMetaSequence(machine)

Bases: framework.node_builder.BeforeRange

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
class Final(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
INITIAL = False
class Initial(machine)

Bases: framework.node_builder.State

INITIAL = True
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
class Range(machine)

Bases: framework.node_builder.State

INITIAL = False
__module__ = 'framework.node_builder'
_run(ctx)
advance(ctx)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
__module__ = 'framework.node_builder'
__module__ = 'framework.node_builder'
_create_non_terminal_node()
_create_terminal_node(name, type, values=None, alphabet=None, qty=None)
append_to_alphabet(alphabet)
append_to_buffer(str)
append_to_contents(content)
buffer
flush()
init_specific()

Can be overridden to express additional initializations

parse(inputs, name, charset=2)
reset()
start_new_shape()
start_new_shape_from_buffer()
class framework.node_builder.State(machine)

Bases: object

Represent states at the lower level

__init__(machine)
Parameters:machine (StateMachine) – state machine where it lives (local context)
__module__ = 'framework.node_builder'
_run(context)
advance(context)

Check transitions using the first non-run character.

Parameters:context (StateMachine) – root state machine (global context)
Returns:Class of the next state to run (None if we are in a final state)
init_specific()

Can be overridden to express additional initializations

run(context)

Do some actions on the current character.

Parameters:context (StateMachine) – root state machine (global context)

class framework.node_builder.StateMachine(machine=None)

Bases: framework.node_builder.State

Represent states that contain other states.

__init__(machine=None)

Parameters:machine (StateMachine) – state machine where it lives (local context)

__module__ = 'framework.node_builder'
_run(context)
input
run(context)

Do some actions on the current character.

Parameters:context (StateMachine) – root state machine (global context)

framework.node_builder.initial(cls)
framework.node_builder.register(cls)

13.2.6. framework.value_types module

class framework.value_types.BitField(subfield_limits=None, subfield_sizes=None, subfield_values=None, subfield_val_extremums=None, padding=0, lsb_padding=True, endian=1, determinist=True, subfield_descs=None, defaults=None)

Bases: framework.value_types.VT_Alt

Provide:
  • either @subfield_limits or @subfield_sizes
  • either @subfield_values or @subfield_val_extremums
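
A minimal usage sketch (not from the reference), assuming the constructor behaves as documented above: one byte split into two 4-bit subfields, each restricted to an explicit list of values.

    from framework.value_types import BitField

    # hypothetical example: subfield 0 (least significant) may be 0, 1 or 2,
    # while subfield 1 may only be 0xF; endian=1 corresponds to VT.BigEndian
    bf = BitField(subfield_sizes=[4, 4],
                  subfield_values=[[0, 1, 2], [0xF]],
                  padding=0, lsb_padding=True, endian=1)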

_BitField__compute_total_possible_values()

The returned number corresponds to the total number of values that can be returned by the BitField in determinist mode. This number does not cover all the values such a BitField is able to generate. Refer to get_value() comments for more information.

__init__(subfield_limits=None, subfield_sizes=None, subfield_values=None, subfield_val_extremums=None, padding=0, lsb_padding=True, endian=1, determinist=True, subfield_descs=None, defaults=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.value_types'
_enable_fuzz_mode(fuzz_magnitude=1.0)
_enable_normal_mode()
_encode_bitfield(val)
_read_value_from(blob, size, endian, constraints)

Used by .do_absorb(). side effect: may change self.padding_one dictionary.

_reset_idx(reset_idx_inuse=True)
absorb_auto_helper(blob, constraints)
after_enabling_mode()
bit_length
byte_length
count_of_possible_values

The returned number corresponds to the total number of values that can be returned by the BitField in determinist mode. This number does not cover all the values such a BitField is able to generate. Refer to get_value() comments for more information.

do_absorb(blob, constraints, off=0, size=None)
do_cleanup_absorb()

To be called after self.do_absorb() or self.do_revert_absorb()

do_revert_absorb()

If needed should be called just after self.do_absorb().

extend(bitfield, rightside=True)
extend_left(bitfield)
extend_right(bitfield)
get_current_raw_val()
get_current_value()

Provide the current value of the object. Should not change the state of the object, except if there is no current value.

Returns: bytes

get_subfield(idx)
get_value()

In determinist mode, not all the values such a BitField is able to generate are covered, but only a subset of them (i.e., not all combinations are computed). The chosen approach is to keep only the values produced by the following algorithm: “exhaust each subfield one at a time”.

Rationale: In most cases, computing all combinations does not make sense for fuzzing purposes.

is_compatible(integer, size)
is_exhausted()
make_determinist()
make_private(forget_current_state)
make_random()
padding_one = [0, 1, 3, 7, 15, 31, 63, 127]
pretty_print(max_size=None)
reset_state()
rewind()
set_bitfield(sf_values=None, sf_val_extremums=None, sf_limits=None, sf_sizes=None, sf_descs=None, sf_defaults=None)
set_size_from_constraints(size=None, encoded_size=None)
set_subfield(idx, val)
Parameters:
  • idx (int) – subfield index, from 0 (least significant subfield) to nb_subfields-1 (the specific index -1 selects the last subfield).
  • val (int) – new value for the subfield
class framework.value_types.Filename(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Bases: framework.value_types.String

__module__ = 'framework.value_types'
subclass_fuzzing_list
class framework.value_types.GSM7bitPacking(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Bases: framework.value_types.String

__module__ = 'framework.value_types'
decode(msg)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(msg)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg)

To be optionally overloaded by a subclass that deals with encoding, if the encoding needs to be initialized in some way. (Called at init and in String.reset().)

Parameters:arg – provided through the encoding_arg parameter of the String constructor
class framework.value_types.GSMPhoneNum(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Bases: framework.value_types.String

__module__ = 'framework.value_types'
decode(msg)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(msg)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg)

To be optionally overloaded by a subclass that deals with encoding, if the encoding needs to be initialized in some way. (Called at init and in String.reset().)

Parameters:arg – provided through the encoding_arg parameter of the String constructor
class framework.value_types.GZIP(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Bases: framework.value_types.String

__module__ = 'framework.value_types'
decode(val)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(val)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg=None)

To be optionally overloaded by a subclass that deals with encoding, if the encoding needs to be initialized in some way. (Called at init and in String.reset().)

Parameters:arg – provided through the encoding_arg parameter of the String constructor
class framework.value_types.INT(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.VT

Base class to be inherited and not used directly
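
Since INT is not meant to be instantiated directly, here is a minimal sketch (not from the reference) using one of the concrete subclasses documented below; the values are hypothetical:

    from framework.value_types import UINT16_be

    # an enumerated 16-bit big-endian value, and a ranged one with a default
    vt1 = UINT16_be(values=[0x0100, 0x0200], determinist=True)
    vt2 = UINT16_be(min=0, max=1024, default=512)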

GEN_MAX_INT = 4294967296
GEN_MIN_INT = -4294967296
__init__(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.value_types'
_convert_value(val)
_read_value_from(blob, size)
_unconvert_value(val)
absorb_auto_helper(blob, constraints)
add_specific_fuzzy_vals(vals)
alt_cformat = None
cformat = None
copy_attrs_from(vt)
determinist = True
do_absorb(blob, constraints, off=0, size=None)
do_cleanup_absorb()
do_revert_absorb()

If needed should be called just after self.do_absorb().

endian = None
extend_value_list(new_list)
fuzzy_values = None
get_current_raw_val()
get_current_value()

Provide the current value of the object. Should not change the state of the object, except if there is no current value.

Returns: bytes

get_fuzzed_vt_list()
get_specific_fuzzy_vals()
get_value()

Walk over the values of the object on a per-call basis.

Returns: bytes

get_value_list()
is_compatible(integer)
is_exhausted()
is_size_compatible(integer)
make_determinist()
make_private(forget_current_state)
make_random()
maxi = None
maxi_gen = None
mini = None
mini_gen = None
pretty_print(max_size=None)
remove_value_list(value_list)
reset_state()
rewind()
set_size_from_constraints(size=None, encoded_size=None)
set_value_list(new_list)
size = None
update_raw_value(val)
usable = False
value_space_size = None
class framework.value_types.INT16(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT

__module__ = 'framework.value_types'
fuzzy_values = [65535, 0, 32768, 32767]
size = 16
usable = False
value_space_size = 65535
class framework.value_types.INT32(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT

__module__ = 'framework.value_types'
fuzzy_values = [4294967295, 0, 2147483648, 2147483647]
size = 32
usable = False
value_space_size = 4294967295
class framework.value_types.INT64(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT

__module__ = 'framework.value_types'
fuzzy_values = [18446744073709551615L, 0, 9223372036854775808L, 9223372036854775807, 1229782938247303441]
size = 64
usable = False
value_space_size = 18446744073709551615L
class framework.value_types.INT8(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT

__module__ = 'framework.value_types'
fuzzy_values = [255, 0, 1, 128, 127]
size = 8
usable = False
value_space_size = 255
class framework.value_types.INT_str(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, base=10, letter_case='upper', min_size=None, reverse=False)

Bases: framework.value_types.INT

__init__(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, base=10, letter_case='upper', min_size=None, reverse=False)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.value_types'
_convert_value(val)
_prepare_format_str(min_size, base, letter_case)
_read_value_from(blob, size)
_unconvert_value(val)
copy_attrs_from(vt)
endian = 3
fuzzy_values = [0, -1, -4294967296, 4294967295, 4294967296]
get_fuzzed_vt_list()
is_compatible(integer)
pretty_print(max_size=None)
regex_bin = '-?[01]+'
regex_decimal = '-?\\d+'
regex_lower_hex = '-?[0123456789abcdef]+'
regex_octal = '-?[01234567]+'
regex_upper_hex = '-?[0123456789ABCDEF]+'
usable = True
value_space_size = -1
class framework.value_types.SINT16_be(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT16

__module__ = 'framework.value_types'
alt_cformat = '>H'
cformat = '>h'
endian = 1
maxi = 32767
mini = -32768
usable = True
class framework.value_types.SINT16_le(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT16

__module__ = 'framework.value_types'
alt_cformat = '<H'
cformat = '<h'
endian = 2
maxi = 32767
mini = -32768
usable = True
class framework.value_types.SINT32_be(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT32

__module__ = 'framework.value_types'
alt_cformat = '>L'
cformat = '>l'
endian = 1
maxi = 2147483647
mini = -2147483648
usable = True
class framework.value_types.SINT32_le(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT32

__module__ = 'framework.value_types'
alt_cformat = '<L'
cformat = '<l'
endian = 2
maxi = 2147483647
mini = -2147483648
usable = True
class framework.value_types.SINT64_be(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT64

__module__ = 'framework.value_types'
alt_cformat = '>Q'
cformat = '>q'
endian = 1
maxi = 9223372036854775807L
mini = -9223372036854775808L
usable = True
class framework.value_types.SINT64_le(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT64

__module__ = 'framework.value_types'
alt_cformat = '<Q'
cformat = '<q'
endian = 2
maxi = 9223372036854775807L
mini = -9223372036854775808L
usable = True
class framework.value_types.SINT8(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT8

__module__ = 'framework.value_types'
alt_cformat = 'B'
cformat = 'b'
endian = 3
maxi = 127
mini = -128
usable = True
class framework.value_types.String(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Bases: framework.value_types.VT_Alt

Value type that represents a character string.

encoded_string

shall be set to True by any subclass that deals with encoding

Type:bool
subclass_fuzzing_list

attribute to be added by subclasses that provide specific test cases.

Type:list
ASCII = 'ascii'
DEFAULT_MAX_SZ = 10000
LATIN_1 = 'iso8859-1'
UTF16BE = 'utf-16-be'
UTF16LE = 'utf-16-le'
__init__(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Initialize the String

Parameters:
  • values – List of the character strings that are considered valid for the node backed by this String object.
  • size – Valid character string size for the node backed by this String object.
  • min_sz – Minimum valid size for the character strings for the node backed by this String object. If not set, this parameter will be automatically inferred from the parameter values, if the latter is provided.
  • max_sz – Maximum valid size for the character strings for the node backed by this String object. If not set, this parameter will be automatically inferred from the parameter values, if the latter is provided.
  • determinist – If set to True generated values will be in a deterministic order, otherwise in a random order.
  • codec – codec to use for encoding the string (e.g., ‘latin-1’, ‘utf8’)
  • extra_fuzzy_list – During data generation, if this parameter is specified with some specific values, they will be part of the test cases generated by the generic disruptor tTYPE.
  • absorb_regexp (str) – You can specify a regular expression in this parameter as a supplementary constraint for data absorption operation.
  • alphabet – The alphabet to use for generating data, in case no values are provided. Also used during absorption to validate the contents. It is only checked if there are no values.
  • min_encoded_sz – Only relevant for subclasses that leverage the encoding infrastructure. Allows providing the minimum legitimate size for an encoded string.
  • max_encoded_sz – Only relevant for subclasses that leverage the encoding infrastructure. Allows providing the maximum legitimate size for an encoded string.
  • encoding_arg – Only relevant for subclasses that leverage the encoding infrastructure and that allow their encoding scheme to be configured. This parameter is directly provided to String.init_encoding_scheme(). Any object that goes through this parameter should support the __copy__ method.
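
A minimal sketch (not from the reference) of two String descriptions built from the parameters above; the specific values and alphabet are hypothetical:

    from framework.value_types import String

    # explicit valid values, encoded with the default 'latin-1' codec
    vt_cmd = String(values=['GET', 'PUT'], max_sz=10)
    # no values: a size range plus an alphabet constrain generation and absorption
    vt_id = String(min_sz=3, max_sz=8, alphabet='0123456789abcdef')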
__module__ = 'framework.value_types'
__repr__() <==> repr(x)
_bytes2str(val)
_check_alphabet(val, constraints)
_check_compliance(value, force_max_enc_sz, force_min_enc_sz, update_list=True)
_check_sizes(values)
_enable_fuzz_mode(fuzz_magnitude=1.0)
_enable_normal_mode()
_ensure_enc_sizes_consistency()
_populate_values(force_max_enc_sz=False, force_min_enc_sz=False)
_read_value_from(blob, constraints)
_str2bytes(val)
absorb_auto_helper(blob, constraints)
ctrl_char_set = '\x00\x01\x02\x03\x04\x05\x06\x07\x08\t\n\x0b\x0c\r\x0e\x0f\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f\x7f'
decode(val)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
do_absorb(blob, constraints, off=0, size=None)

Core function for absorption.

Parameters:
  • blob – binary string on which to perform absorption
  • constraints – constraints to comply with
  • off – absorption should start at offset off from blob
  • size – if provided, size relates to the string to be absorbed (which can be encoded)
Returns:

value, off, size

do_cleanup_absorb()

To be called after self.do_absorb() or self.do_revert_absorb()

do_revert_absorb()

If needed, should be called just after self.do_absorb(). (It is safe to call it more than once.)

encode(val)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
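
A minimal sketch (not from the reference) of a hypothetical subclass that deals with encoding by overloading the stateless encode()/decode() hooks and setting encoded_string, as described above:

    import binascii

    from framework.value_types import String

    class HexString(String):
        encoded_string = True

        def encode(self, val):
            # val is the raw (decoded) bytes value
            return binascii.hexlify(val)

        def decode(self, val):
            # val is the encoded bytes value
            return binascii.unhexlify(val)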
encoded_string = False
encoding_test_cases(current_val, max_sz, min_sz, min_encoded_sz, max_encoded_sz)

To be optionally overloaded by a subclass that deals with encoding in order to provide specific test cases on encoding scheme.

Parameters:
  • current_val – the current value (not encoded)
  • max_sz – maximum size for a not encoded string
  • min_sz – minimum size for a not encoded string
  • min_encoded_sz – minimum encoded size for a string
  • max_encoded_sz – maximum encoded size for a string
Returns:

the list of encoded test cases

Return type:

list

extended_char_set = u'\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f\xa0\xa1\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9\xaa\xab\xac\xad\xae\xaf\xb0\xb1\xb2\xb3\xb4\xb5\xb6\xb7\xb8\xb9\xba\xbb\xbc\xbd\xbe\xbf\xc0\xc1\xc2\xc3\xc4\xc5\xc6\xc7\xc8\xc9\xca\xcb\xcc\xcd\xce\xcf\xd0\xd1\xd2\xd3\xd4\xd5\xd6\xd7\xd8\xd9\xda\xdb\xdc\xdd\xde\xdf\xe0\xe1\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xeb\xec\xed\xee\xef\xf0\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xfb\xfc\xfd\xfe\xff'
static fuzz_cases_c_strings(knowledge, orig_val, sz, fuzz_magnitude)
static fuzz_cases_ctrl_chars(knowledge, orig_val, sz, max_sz, codec)
get_current_raw_val(str_form=False)
get_current_value()

Provide the current value of the object. Should not change the state of the object, except if there is no current value.

Returns: bytes

get_value()

Walk over the values of the object on a per-call basis.

Returns: bytes

i = 255
init_encoding_scheme(arg)

To be optionally overloaded by a subclass that deals with encoding, if the encoding needs to be initialized in some way. (Called at init and in String.reset().)

Parameters:arg – provided through the encoding_arg parameter of the String constructor
is_exhausted()
make_determinist()
make_private(forget_current_state)
make_random()
non_ctrl_char = u' !"#$%&\'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f\xa0\xa1\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9\xaa\xab\xac\xad\xae\xaf\xb0\xb1\xb2\xb3\xb4\xb5\xb6\xb7\xb8\xb9\xba\xbb\xbc\xbd\xbe\xbf\xc0\xc1\xc2\xc3\xc4\xc5\xc6\xc7\xc8\xc9\xca\xcb\xcc\xcd\xce\xcf\xd0\xd1\xd2\xd3\xd4\xd5\xd6\xd7\xd8\xd9\xda\xdb\xdc\xdd\xde\xdf\xe0\xe1\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xeb\xec\xed\xee\xef\xf0\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xfb\xfc\xfd\xfe\xff'
pretty_print(max_size=None)
printable_char_set = ' !"#$%&\'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~'
reset_state()
rewind()
set_description(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None)

@size takes precedence over @min_sz and @max_sz

set_size_from_constraints(size=None, encoded_size=None)
class framework.value_types.UINT16_be(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT16

__module__ = 'framework.value_types'
alt_cformat = '>h'
cformat = '>H'
endian = 1
maxi = 65535
mini = 0
usable = True
class framework.value_types.UINT16_le(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT16

__module__ = 'framework.value_types'
alt_cformat = '<h'
cformat = '<H'
endian = 2
maxi = 65535
mini = 0
usable = True
class framework.value_types.UINT32_be(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT32

__module__ = 'framework.value_types'
alt_cformat = '>l'
cformat = '>L'
endian = 1
maxi = 4294967295
mini = 0
usable = True
class framework.value_types.UINT32_le(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT32

__module__ = 'framework.value_types'
alt_cformat = '<l'
cformat = '<L'
endian = 2
maxi = 4294967295
mini = 0
usable = True
class framework.value_types.UINT64_be(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT64

__module__ = 'framework.value_types'
alt_cformat = '>q'
cformat = '>Q'
endian = 1
maxi = 18446744073709551615L
mini = 0
usable = True
class framework.value_types.UINT64_le(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT64

__module__ = 'framework.value_types'
alt_cformat = '<q'
cformat = '<Q'
endian = 2
maxi = 18446744073709551615L
mini = 0
usable = True
class framework.value_types.UINT8(values=None, min=None, max=None, default=None, determinist=True, force_mode=False, fuzz_mode=False, values_desc=None)

Bases: framework.value_types.INT8

__module__ = 'framework.value_types'
alt_cformat = 'b'
cformat = 'B'
endian = 3
maxi = 255
mini = 0
usable = True
class framework.value_types.VT

Bases: object

Base class for value type classes accepted by value Elts

BigEndian = 1
LittleEndian = 2
Native = 3
__module__ = 'framework.value_types'
add_specific_fuzzy_vals(vals)
copy_attrs_from(vt)
enc2struct = {1: '>', 2: '<', 3: '='}
endian = None
get_current_raw_val()
get_current_value()

Provide the current value of the object. Should not change the state of the object, except if there is no current value.

Returns: bytes

get_fuzzed_vt_list()
get_specific_fuzzy_vals()
get_value()

Walk over the values of the object on a per-call basis.

Returns: bytes

is_exhausted()
knowledge_source = None
make_determinist()
make_private(forget_current_state)
make_random()
maxi = None
mini = None
pretty_print(max_size=None)
reset_state()
rewind()
set_size_from_constraints(size=None, encoded_size=None)
class framework.value_types.VT_Alt

Bases: framework.value_types.VT

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.value_types'
_enable_fuzz_mode(fuzz_magnitude=1.0)
_enable_normal_mode()
add_specific_fuzzy_vals(vals)
after_enabling_mode()
enable_fuzz_mode(fuzz_magnitude=1.0)
enable_normal_mode()
get_specific_fuzzy_vals()
switch_mode()
class framework.value_types.Wrapper(values=None, size=None, min_sz=None, max_sz=None, determinist=True, codec='latin-1', extra_fuzzy_list=None, absorb_regexp=None, alphabet=None, min_encoded_sz=None, max_encoded_sz=None, encoding_arg=None)

Bases: framework.value_types.String

__module__ = 'framework.value_types'
decode(val)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(val)

To be overloaded by a subclass that deals with encoding. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg)

Take a list parameter specifying the prefix and the suffix to add to the value to encode, or to remove from an encoded value.

Parameters:arg (list) – Prefix and suffix character strings. Can be individually set to None
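
A minimal sketch (not from the reference): the prefix/suffix pair is hypothetical and is handed to init_encoding_scheme() through the encoding_arg parameter inherited from String.

    from framework.value_types import Wrapper

    # wrap each value between a prefix and a suffix (either can be None)
    vt = Wrapper(values=['admin', 'guest'], encoding_arg=['<user>', '</user>'])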
framework.value_types.from_encoder(encoder_cls, encoding_arg=None)

13.2.7. framework.generic_data_makers module

class framework.generic_data_makers.SwapperDisruptor

Bases: framework.tactics_helpers.StatefulDisruptor

Merge two nodes to produce two children

__module__ = 'framework.generic_data_makers'
_swap_nodes(node_1, node_2)
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

set_seed(prev_data)
class framework.generic_data_makers.d_call_external_program

Bases: framework.tactics_helpers.Disruptor

Call an external program to deal with the data.

__module__ = 'framework.generic_data_makers'
_args_desc = {'cmd': ('The external command to execute.', None, (<type 'list'>, <type 'tuple'>, <type 'str'>)), 'file_mode': ('If True the data will be provided through a file to the external program, otherwise it will be provided on the command line directly.', True, <type 'bool'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>)}
_get_cmd()
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_call_function

Bases: framework.tactics_helpers.Disruptor

Call the provided function with the Data() object received as input of this disruptor as its first parameter, and optionally with additional parameters if @params is set. The function should return a Data() object.

The signature of the function should be compatible with:

func(data, *args) –> Data()
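
A minimal sketch (not from the reference) of a function compatible with this contract; the marker logic is purely illustrative:

    from framework.data import Data

    def append_marker(data, marker=b'!'):
        # build a new Data object from the previous payload plus a marker
        return Data(data.to_bytes() + marker)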

__module__ = 'framework.generic_data_makers'
_args_desc = {'func': ('The function that will be called with a node as its first parameter, and provided optionally with additional parameters if @params is set.', <function <lambda>>, (<type 'instancemethod'>, <type 'function'>)), 'params': ('Tuple of parameters that will be provided to the function.', None, <type 'tuple'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
class framework.generic_data_makers.d_corrupt_bits_by_position

Bases: framework.tactics_helpers.Disruptor

Corrupt a bit within a specific byte.

__module__ = 'framework.generic_data_makers'
_args_desc = {'ascii': ('Enforce all outputs to be ascii 7bits.', False, <type 'bool'>), 'idx': ('Byte index to be corrupted (from 1 to data length).', 1, <type 'int'>), 'new_val': ('If provided change the selected byte with the new one.', None, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_corrupt_node_bits

Bases: framework.tactics_helpers.Disruptor

Corrupt bits on some nodes of the data model.

__module__ = 'framework.generic_data_makers'
_args_desc = {'ascii': ('Enforce all outputs to be ascii 7bits.', False, <type 'bool'>), 'nb': ('Apply corruption on @nb Nodes fetched randomly within the data model.', 2, <type 'int'>), 'new_val': ('If provided change the selected byte with the new one.', None, <type 'str'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_fix_constraints

Bases: framework.tactics_helpers.Disruptor

Fix data constraints.

Release constraints from the input data or from only a piece of it (if the parameter path is provided), then recompute them. By constraints we mean every generator (or function) node that may embed constraints between nodes, and every node existence condition.

__module__ = 'framework.generic_data_makers'
_args_desc = {'clone_node': ('If True the dmaker will always return a copy of the node. (For stateless disruptors dealing with big data it can be useful to set it to False.)', False, <type 'bool'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_fuzz_model_structure

Bases: framework.tactics_helpers.Disruptor

Disrupt the data model structure (replace ordered sections by unordered ones).

__module__ = 'framework.generic_data_makers'
_args_desc = {'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_max_size

Bases: framework.tactics_helpers.Disruptor

Truncate the data (or part of the data) to the provided size.

__module__ = 'framework.generic_data_makers'
_args_desc = {'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'sz': ('Truncate the data (or part of the data) to the provided size.', 10, <type 'int'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_modify_nodes

Bases: framework.tactics_helpers.Disruptor

Perform modifications on the provided data. Two ways are possible:

  • Either the change is performed on the content of the nodes specified by the path parameter, with the new value provided and the optional constraints for the absorption (uses the node absorption infrastructure);
  • Or the change is performed based on a dictionary provided through the parameter multi_mod
__module__ = 'framework.generic_data_makers'
_add_info(prev_data, n, status, size)
_args_desc = {'clone_node': ('If True the dmaker will always return a copy of the node. (For stateless disruptors dealing with big data it can be useful to set it to False.)', False, <type 'bool'>), 'constraints': ('Constraints for the absorption of the new value.', AbsNoCsts(), <class 'framework.global_resources.AbsCsts'>), 'multi_mod': ('Dictionary of <path>:<item> pairs to change multiple nodes with different values. <item> can be either only the new <value> or a tuple (<value>,<abscsts>) if new constraint for absorption is needed', None, <type 'dict'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'value': ('The new value to inject within the data.', '', <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_next_node_content

Bases: framework.tactics_helpers.Disruptor

Move to the next content of the nodes from input data or from only a piece of it (if the parameter path is provided). Basically, unfreeze the nodes then freeze them again, which will consequently produce a new data.

__module__ = 'framework.generic_data_makers'
_args_desc = {'clone_node': ('If True the dmaker will always return a copy of the node. (for stateless disruptors dealing with big data it can be useful to set it to False).', False, <type 'bool'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'recursive': ('Apply the disruptor recursively.', True, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_operate_on_nodes

Bases: framework.tactics_helpers.Disruptor

Perform an operation on the nodes specified by the regexp path. @op is an operation that applies to a node and @params is a tuple containing the parameters that will be provided to @op. If no path is provided, the root node will be used.

__module__ = 'framework.generic_data_makers'
_add_info(prev_data, n)
_args_desc = {'clone_node': ('If True the dmaker will always return a copy of the node. (For stateless disruptors dealing with big data it can be useful to set it to False.)', False, <type 'bool'>), 'op': ('The operation to perform on the selected nodes.', <unbound method Node.clear_attr>, (<type 'instancemethod'>, <type 'function'>)), 'params': ('Tuple of parameters that will be provided to the operation. (default: MH.Attr.Mutable)', (2,), <type 'tuple'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_shallow_copy

Bases: framework.tactics_helpers.Disruptor

Shallow copy of the input data, which means: ignore its frozen state during the copy.

__module__ = 'framework.generic_data_makers'
_args_desc = {}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.d_switch_to_alternate_conf

Bases: framework.tactics_helpers.Disruptor

Switch to an alternate configuration.

__module__ = 'framework.generic_data_makers'
_args_desc = {'conf': ('Change the configuration, with the one provided (by name), of all subnodes fetched by @path, one-by-one. [default value is set dynamically with the first-found existing alternate configuration]', None, <type 'str'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'recursive': ('Does the reachable nodes from the selected ones need also to be changed?', True, <type 'bool'>)}
_modelwalker_user = False
disrupt_data(dm, target, prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.g_population

Bases: framework.tactics_helpers.Generator

Walk through the given population

__module__ = 'framework.generic_data_makers'
_args_desc = {'population': ('The population to iterate over.', None, <class 'framework.evolutionary_helpers.Population'>)}
_modelwalker_user = False
generate_data(dm, monitor, target)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_combine

Bases: framework.generic_data_makers.SwapperDisruptor

Merge two nodes by swapping some roots’ children

__module__ = 'framework.generic_data_makers'
_args_desc = {'node': ('Node to combine with.', None, <class 'framework.node.Node'>)}
_modelwalker_user = False
get_nodes(node)
set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_crossover

Bases: framework.generic_data_makers.SwapperDisruptor

Make two graphs share a certain percentage of their leaf nodes in order to produce two children

class Operand(node)

Bases: object

__init__(node)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.generic_data_makers'
_count_brothers(index, pattern)
_merge_brothers(index, pattern, length)
compute_sub_graphs(percentage)
__module__ = 'framework.generic_data_makers'
_args_desc = {'node': ('Node to crossover with.', None, <class 'framework.node.Node'>), 'percentage_to_share': ('Percentage of the base node to share.', 0.5, <type 'float'>)}
_modelwalker_user = False
set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_fuzz_separator_nodes

Bases: framework.tactics_helpers.StatefulDisruptor

Perform alterations on separators (one at a time). Each time a separator is encountered in the provided data, it will be replaced by another separator picked from the ones existing within the provided data.

__module__ = 'framework.generic_data_makers'
_args_desc = {'clone_node': ('if True the dmaker will always return a copy of the node. (for stateless disruptors dealing with big data it can be useful to set it to False)', True, <type 'bool'>), 'deep': ('When set to True, if a node structure has changed, the modelwalker will reset its walk through the children nodes.', True, <type 'bool'>), 'init': ('make the model walker ignore all the steps until the provided one', 1, <type 'int'>), 'max_steps': ('maximum number of steps (-1 means until the end)', -1, <type 'int'>), 'order': ('When set to True, the fuzzing order is strictly guided by the data structure. Otherwise, fuzz weight (if specified in the data model) is used for ordering.', True, <type 'bool'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'runs_per_node': ('maximum number of test cases for a single node (-1 means until the end)', -1, <type 'int'>)}
_modelwalker_user = True
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_fuzz_typed_nodes

Bases: framework.tactics_helpers.StatefulDisruptor

Perform alterations on typed nodes (one at a time) according to:
  • their type (e.g., INT, Strings, …)
  • their attributes (e.g., allowed values, minimum size, …)
  • knowledge retrieved from the data (e.g., if the input data uses separators, their symbols are leveraged in the fuzzing)
  • knowledge on the target retrieved from the project file or dynamically from feedback inspection (e.g., C language, GNU/Linux OS, …)

If the input has different shapes (described in non-terminal nodes), this will be taken into account by fuzzing every shape combinations.

Note: this disruptor includes what tSEP does and goes beyond with respect to separators.

__module__ = 'framework.generic_data_makers'
_args_desc = {'clone_node': ('if True the dmaker will always return a copy of the node. (for stateless disruptors dealing with big data it can be useful to set it to False)', True, <type 'bool'>), 'deep': ('When set to True, if a node structure has changed, the modelwalker will reset its walk through the children nodes.', True, <type 'bool'>), 'determinism': ("If set to 'True', the whole model will be fuzzed in a deterministic way. Otherwise it will be guided by the data model determinism.", True, <type 'bool'>), 'fix': ("Limit constraints fixing to the nodes related to the currently fuzzed one (only implemented for 'sync_size_with' and 'sync_enc_size_with').", True, <type 'bool'>), 'fix_all': ('For each produced data, reevaluate the constraints on the whole graph.', False, <type 'bool'>), 'fuzz_mag': ('Order of magnitude for maximum size of some fuzzing test cases.', 1.0, <type 'float'>), 'ign_sep': ('When set to True, separators will be ignored if any are defined.', False, <type 'bool'>), 'init': ('make the model walker ignore all the steps until the provided one', 1, <type 'int'>), 'leaf_determinism': ("If set to 'True', each typed node will be fuzzed in a deterministic way. Otherwise it will be guided by the data model determinism. Note: this option is complementary to 'determinism' as it acts on the typed node substitutions that occur through this disruptor", True, <type 'bool'>), 'max_steps': ('maximum number of steps (-1 means until the end)', -1, <type 'int'>), 'order': ('When set to True, the fuzzing order is strictly guided by the data structure. Otherwise, fuzz weight (if specified in the data model) is used for ordering.', True, <type 'bool'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'runs_per_node': ('maximum number of test cases for a single node (-1 means until the end)', -1, <type 'int'>)}
_modelwalker_user = True
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_iter_over_data

Bases: framework.tactics_helpers.StatefulDisruptor

Walk through the provided data and, for each visited node, iterate over the allowed values (with respect to the data model). Note: no alteration is performed by this disruptor.

__module__ = 'framework.generic_data_makers'
_args_desc = {'clone_node': ('if True the dmaker will always return a copy of the node. (for stateless disruptors dealing with big data it can be useful to set it to False)', True, <type 'bool'>), 'fix_all': ('For each produced data, reevaluate the constraints on the whole graph.', True, <type 'bool'>), 'init': ('make the model walker ignore all the steps until the provided one', 1, <type 'int'>), 'max_steps': ('maximum number of steps (-1 means until the end)', -1, <type 'int'>), 'nt_only': ('Walk through non-terminal nodes only.', False, <type 'bool'>), 'order': ('When set to True, the walking order is strictly guided by the data structure. Otherwise, fuzz weight (if specified in the data model) is used for ordering.', True, <type 'bool'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>), 'runs_per_node': ('maximum number of test cases for a single node (-1 means until the end)', -1, <type 'int'>)}
_modelwalker_user = True
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_struct_constraints

Bases: framework.tactics_helpers.StatefulDisruptor

Perform constraints alteration (one at a time) on each node that depends on another one regarding its existence, its quantity, its size, …

If deep is set, more corruption cases are enabled on the data structure, based on the internals of each non-terminal node:
  • the minimum and maximum amount of subnodes of each non-terminal node
  • …

__module__ = 'framework.generic_data_makers'
_args_desc = {'deep': ('If True, enable corruption of non-terminal node internals', False, <type 'bool'>), 'init': ('Make the model walker ignore all the steps until the provided one.', 1, <type 'int'>), 'max_steps': ('Maximum number of steps (-1 means until the end).', -1, <type 'int'>), 'path': ('Graph path regexp to select nodes on which the disruptor should apply.', None, <type 'str'>)}
_modelwalker_user = False
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

class framework.generic_data_makers.sd_switch_to_alternate_conf

Bases: framework.tactics_helpers.StatefulDisruptor

Switch the configuration of each node, one by one, with the provided alternate configuration.

__module__ = 'framework.generic_data_makers'
_args_desc = {'clone_node': ('if True the dmaker will always return a copy of the node. (for stateless disruptors dealing with big data it can be useful to set it to False)', True, <type 'bool'>), 'conf': ('Change the configuration, with the one provided (by name), of all nodes reachable from the root, one-by-one. [default value is set dynamically with the first-found existing alternate configuration]', None, (<type 'str'>, <type 'list'>, <type 'tuple'>)), 'init': ('make the model walker ignore all the steps until the provided one', 1, <type 'int'>), 'max_steps': ('maximum number of steps (-1 means until the end)', -1, <type 'int'>), 'runs_per_node': ('maximum number of test cases for a single node (-1 means until the end)', -1, <type 'int'>)}
_modelwalker_user = True
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

set_seed(prev_data)
setup(dm, user_input)

Specific setup code: return True if the setup has succeeded, otherwise return False.

framework.generic_data_makers.truncate_info(info, max_size=60)

13.2.8. framework.target_helpers module

class framework.target_helpers.EmptyTarget(enable_feedback=True)

Bases: framework.target_helpers.Target

__init__(enable_feedback=True)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.target_helpers'
_feedback_mode = 1
is_target_ready_for_new_data()

The FMK busy waits on this method before sending new data. This method should take the feedback timeout into account (that is, the maximum time duration for gathering feedback from the target).

send_data(data, from_fmk=False)

To be overloaded.

Note: use data.to_bytes() to get binary data.

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a probe or an operator)
  • data (Data) – data container that embeds generally a modeled data accessible through data.content. However if the latter is None, it only embeds the raw data.
send_multiple_data(data_list, from_fmk=False)

Used to send multiple data to the target, or to stimulate several of the target’s inputs in one shot.

Note: Use data.to_bytes() to get binary data

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a Probe or an Operator)
  • data_list (list) – list of data to be sent
supported_feedback_mode = [1, 2]
class framework.target_helpers.Target

Bases: object

Class abstracting the target we interact with.

FBK_WAIT_FULL_TIME = 1
FBK_WAIT_UNTIL_RECV = 2
__module__ = 'framework.target_helpers'
__str__() <==> str(x)
_altered_data_queued = None
_feedback_mode = None
_logger = None
_pending_data = None
_probes = None
_send_data_lock = <thread.lock object>
_set_feedback_timeout_specific(fbk_timeout)

Overload this function to handle feedback specifics

Parameters:fbk_timeout (float) – time duration for collecting the feedback
_start(target_desc, tg_id)
_stop(target_desc, tg_id)
add_pending_data(data)
add_probe(probe)
cleanup()

To be overloaded if something needs to be performed after each data emission. It is called after any feedback has been retrieved.

collect_pending_feedback(timeout=0)

If overloaded, it can be used by the framework to retrieve additional feedback from the target without sending any new data.

Parameters:timeout – Maximum delay before returning from feedback collecting
Returns:False if it is not possible, otherwise it should be True
Return type:bool
fbk_wait_full_time_slot_mode
fbk_wait_full_time_slot_msg = 'Wait for the full time slot allocated for feedback retrieval'
fbk_wait_until_recv_msg = 'Wait until the target has sent something back to us'
feedback_timeout = None
get_description()
static get_fbk_mode_desc(fbk_mode, short=False)
get_feedback()

If overloaded, should return a FeedbackCollector object.

get_last_target_ack_date()

If different from None, the return value is used by the FMK to log the date of the target acknowledgment after a message has been sent to it.

[Note: If this method is overloaded, is_target_ready_for_new_data() should also be overloaded.]

is_processed_data_altered()
is_target_ready_for_new_data()

The FMK busy waits on this method before sending new data. This method should take the feedback timeout into account (that is, the maximum time duration for gathering feedback from the target).

probes
record_info(info)

Can be used by the target to record some information during initialization or anytime it makes sense for your purpose.

Parameters:info (str) – info to be recorded
Returns:None
recover_target()

Implementation of target recovering operations, when a target problem has been detected (i.e. a negative feedback from a probe, an operator or the Target() itself)

Returns:True if the target has been recovered. False otherwise.
Return type:bool
remove_probes()
send_data(data, from_fmk=False)

To be overloaded.

Note: use data.to_bytes() to get binary data.

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a probe or an operator)
  • data (Data) – data container that embeds generally a modeled data accessible through data.content. However if the latter is None, it only embeds the raw data.
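
A minimal sketch (not from the reference) of a custom Target overloading send_data(); the output file path is hypothetical:

    from framework.target_helpers import Target

    class FileTarget(Target):
        def send_data(self, data, from_fmk=False):
            # data.to_bytes() provides the raw binary form of the data
            with open('/tmp/fuzz_output.bin', 'ab') as f:
                f.write(data.to_bytes())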
send_data_sync(data, from_fmk=False)

Can be used in user-code to send data to the target without interfering with the framework.

Use case example: The user needs to send some message to the target on a regular basis in background. For that purpose, it can quickly define a framework.monitor.Probe that just emits the message by itself.

send_multiple_data(data_list, from_fmk=False)

Used to send multiple data to the target, or to stimulate several of the target’s inputs in one shot.

Note: Use data.to_bytes() to get binary data

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a Probe or an Operator)
  • data_list (list) – list of data to be sent
send_multiple_data_sync(data_list, from_fmk=False)

Can be used in user-code to send data to the target without interfering with the framework.

send_pending_data(from_fmk=False)
sending_delay = 0
set_data_model(dm)
set_feedback_mode(mode)
set_feedback_timeout(fbk_timeout)

To set dynamically the feedback timeout.

Parameters:fbk_timeout (float) – maximum time duration for collecting the feedback
set_logger(logger)
set_sending_delay(sending_delay)

Set the sending delay.

Parameters:sending_delay (float) – maximum time (in seconds) taken to send data once the method send_(multiple_)data() has been called.
start()

To be overloaded if needed

stop()

To be overloaded if needed

supported_feedback_mode = []
exception framework.target_helpers.TargetStuck

Bases: exceptions.Exception

__module__ = 'framework.target_helpers'

13.2.9. framework.targets.network module

class framework.targets.network.NetworkTarget(host='localhost', port=12345, socket_type=(2, 1), data_semantics='Unknown Semantic', server_mode=False, target_address=None, wait_for_client=True, hold_connection=False, keep_first_client=True, mac_src=None, mac_dst=None, add_eth_header=False, fbk_timeout=2, sending_delay=1, recover_timeout=0.5)

Bases: framework.target_helpers.Target

Generic target class for interacting with a network resource. Can be used directly, but some methods may need to be overloaded to fit your needs.

CHUNK_SZ = 2048
General_Info_ID = 'General Information'
UNKNOWN_SEMANTIC = 'Unknown Semantic'
_INTERNALS_ID = 'NetworkTarget()'
__init__(host='localhost', port=12345, socket_type=(2, 1), data_semantics='Unknown Semantic', server_mode=False, target_address=None, wait_for_client=True, hold_connection=False, keep_first_client=True, mac_src=None, mac_dst=None, add_eth_header=False, fbk_timeout=2, sending_delay=1, recover_timeout=0.5)
Parameters:
  • host (str) – IP address of the target to connect to, or the IP address on which we will wait for target connecting to us (if server_mode is True). For raw socket type, it should contain the name of the interface.
  • port (int) – Port for communicating with the target, or the port to listen to. For raw socket type, it should contain the protocol ID.
  • socket_type (tuple) – Tuple composed of the socket address family and socket type
  • data_semantics (str) – String of characters that will be used for data routing decisions. Useful only when more than one interface is defined. In such a case, the data semantics will be checked in order to find a matching interface to which the data will be sent. If the data have no semantics, they will be routed to the default (first declared) interface.
  • server_mode (bool) – If True, the interface will be set in server mode, which means we will wait for the real target to connect to us for sending it data.
  • target_address (tuple) – Used only if server_mode is True and socket type is SOCK_DGRAM. To be used if data has to be sent to a specific address (which is not necessarily the client). It is especially useful if you need to send data before receiving anything. What should be provided is a tuple (host(str), port(int)) associated to the target.
  • wait_for_client (bool) – Used only in server mode (server_mode is True) when the socket type is SOCK_DGRAM and a target_address is provided, or when the socket_type is SOCK_RAW. If set to True, before sending any data, the NetworkTarget will wait for the reception of data (from any client); otherwise it will send data as soon as provided.
  • hold_connection (bool) – If True, we will maintain the connection while sending data to the real target. Otherwise, after each data emission, we close the related socket.
  • keep_first_client (bool) – Used only in server mode (server_mode is True) with SOCK_STREAM socket type. If set to True, the first client that connects to the server will remain the one used for data sending until the target is reloaded. Otherwise, the last client’s information is used. This is not supported for SOCK_DGRAM where the first client will always be the one used for data sending.
  • mac_src (bytes) – Only in conjunction with raw socket. For each data sent through this interface, and if this data contain nodes with the semantic 'mac_src', these nodes will be overwritten (through absorption) with this parameter. If nothing is provided, the MAC address will be retrieved from the interface specified in ‘host’. (works accurately for Linux system).
  • mac_dst (bytes) – Only in conjunction with raw socket. For each data sent through this interface, and if this data contain nodes with the semantic 'mac_dst', these nodes will be overwritten (through absorption) with this parameter.
  • add_eth_header (bool) – Add an ethernet header to the data to send. Only possible in combination with a SOCK_RAW socket type.
  • fbk_timeout (float) – maximum time duration for collecting the feedback
  • sending_delay (float) – maximum time (in seconds) taken to send data once the method send_(multiple_)data() has been called.
  • recover_timeout (int) – Allowed delay for recovering the target. (The recovery can be triggered by the framework if the feedback threads did not terminate before the target health check.) Impacts the behavior of self.recover_target().
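
A minimal sketch (not from the reference): a TCP target on a hypothetical host/port, keeping the connection open between emissions and registering an extra feedback socket.

    import socket

    from framework.targets.network import NetworkTarget

    tg = NetworkTarget(host='192.168.0.10', port=8080,
                       socket_type=(socket.AF_INET, socket.SOCK_STREAM),
                       hold_connection=True, fbk_timeout=2, sending_delay=1)
    tg.add_additional_feedback_interface('192.168.0.10', 12346,
                                         fbk_id='side channel')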
__module__ = 'framework.targets.network'
_before_sending_data(data_list, from_fmk)
_collect_feedback_from(thread_id, fbk_sockets, fbk_ids, fbk_lengths, epobj, fileno2fd, fbk_timeout, flush_received_fbk, pre_fbk)
_connect_to_additional_feedback_sockets()

Connection to additional feedback sockets, if any.

_connect_to_target(host, port, socket_type)
_custom_data_handling_before_emission(data_list)

To be overloaded if you want to perform some operation before sending data_list to the target.

Parameters:data_list (list) – list of Data objects that will be sent to the target.
Returns:the data list to send
Return type:list
_feedback_collect(fbk, ref, error=0)
_feedback_complete()
_feedback_handling(fbk, ref)

To be overloaded if feedback from the target needs to be filtered before being logged and/or collected in some way, or for any other reason.

Parameters:
  • fbk (bytes) – feedback received by the target through a socket referenced by ref.
  • ref (string) – user-defined reference of the socket used to retrieve the feedback.
Returns:a tuple (new_fbk, status), where new_fbk is the feedback you want to log and status enables you to notify the framework of a problem (positive if everything is fine, negative otherwise).
Return type:tuple
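
For illustration, a minimal override could look like the following sketch (the subclass name and the 'ERROR' marker convention are assumptions for the example):

    class MyNetTarget(NetworkTarget):

        def _feedback_handling(self, fbk, ref):
            # Flag feedback containing an 'ERROR' marker (illustrative convention)
            status = -1 if b'ERROR' in fbk else 0
            return fbk, status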

_feedback_mode = 1
_get_additional_feedback_sockets()

Used if any additional sockets to get feedback from have been added through NetworkTarget.add_additional_feedback_interface(); they are related to the emitted data if needed.

Parameters:data (Data) – the data that will be sent.
Returns:list of sockets, dict of associated ids/names, dict of associated lengths (a length can be None)
Return type:tuple
_get_data_semantic_key(data)
_get_net_info_from(data)
_get_socket_type(host, port)
_handle_connection_to_fbk_server(clientsocket, address, args, pre_fbk=None)
_handle_target_connection(clientsocket, address, args, pre_fbk=None)
_is_valid_socket_type(socket_type)
_listen_to_target(host, port, socket_type, func, args=None)
_raw_connect_to(host, port, ref_id, socket_type=(2, 1), chk_size=2048, hold_connection=True)
_raw_listen_to(host, port, ref_id, socket_type=(2, 1), chk_size=2048, wait_time=None)
_raw_server_main(serversocket, host, port, sock_type, func, sending_event, notif_host_event)
_register_last_ack_date(ack_date)
_send_data(sockets, data_refs, fbk_timeout, from_fmk, pre_fbk=None)
_server_main(serversocket, host, port, func)
_start_fbk_collector(fbk_sockets, fbk_ids, fbk_lengths, epobj, fileno2fd, pre_fbk=None, timeout=None, flush_received_fbk=False)
add_additional_feedback_interface(host, port, socket_type=(2, 1), fbk_id=None, fbk_length=None, server_mode=False)

Allows registering an additional socket to get feedback from. Connection is attempted when the target starts, that is when NetworkTarget.start() is called.
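
For instance, to also collect feedback from a TCP log channel exposed by the system under test (assuming tg is a NetworkTarget instance; address, port and identifier are illustrative):

    import socket

    tg.add_additional_feedback_interface('192.168.0.5', 12347,
                                         socket_type=(socket.AF_INET, socket.SOCK_STREAM),
                                         fbk_id='dbg_log')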

cleanup()

To be overloaded if something needs to be performed after each data emission. It is called after any feedback has been retrieved.

collect_pending_feedback(timeout=0)

If overloaded, it can be used by the framework to retrieve additional feedback from the target without sending any new data.

Parameters:timeout – Maximum delay before returning from feedback collecting
Returns:False if it is not possible, otherwise it should be True
Return type:bool
connect_to(host, port, ref_id, socket_type=(2, 1), chk_size=2048, hold_connection=True)

Used for collecting feedback from the target while it is already started.

get_description()
get_feedback()

If overloaded, should return a FeedbackCollector object.

get_last_target_ack_date()

If different from None, the return value is used by the FMK to log the date of the target acknowledgment after a message has been sent to it.

Note: if this method is overloaded, is_target_ready_for_new_data() should also be overloaded.

initialize()

To be overloaded if some initial setup for the target is necessary.

is_target_ready_for_new_data()

The FMK busy-waits on this method before sending a new data. This method should take the feedback timeout into account (that is, the maximum time duration for gathering feedback from the target).

listen_to(host, port, ref_id, socket_type=(2, 1), chk_size=2048, wait_time=None, hold_connection=True)

Used for collecting feedback from the target while it is already started.

recover_target()

Implementation of target recovery operations, used when a target problem has been detected (e.g., negative feedback from a probe, an operator, or the Target() itself).

Returns:True if the target has been recovered. False otherwise.
Return type:bool
register_new_interface(host, port, socket_type, data_semantics, server_mode=False, target_address=None, wait_for_client=True, hold_connection=False, keep_first_client=True, mac_src=None, mac_dst=None, add_eth_header=False)
remove_all_dynamic_interfaces()
remove_dynamic_interface(host, port)
send_data(data, from_fmk=False)

To be overloaded.

Note: use data.to_bytes() to get binary data.

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a probe or an operator)
  • data (Data) – data container that generally embeds a modeled data accessible through data.content. However, if the latter is None, it only embeds the raw data.
send_multiple_data(data_list, from_fmk=False)

Used to send multiple data to the target, or to stimulate several of the target’s inputs in one shot.

Note: Use data.to_bytes() to get binary data

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a Probe or an Operator)
  • data_list (list) – list of data to be sent
set_timeout(fbk_timeout, sending_delay)

Set the time duration for feedback gathering and the sending delay above which we give up on:
  • sending data to the target (client mode)
  • waiting for client connections before sending data to them (server mode)

Parameters:
  • fbk_timeout – time duration for feedback gathering (in seconds)
  • sending_delay – sending delay (in seconds)
start()

To be overloaded if needed

stop()

To be overloaded if needed

supported_feedback_mode = [1, 2]
terminate()

To be overloaded if some cleanup is necessary for stopping the target.

13.2.10. framework.targets.local module

class framework.targets.local.LocalTarget(target_path=None, pre_args='', post_args='', tmpfile_ext='.bin', send_via_stdin=False, send_via_cmdline=False)

Bases: framework.target_helpers.Target

__init__(target_path=None, pre_args='', post_args='', tmpfile_ext='.bin', send_via_stdin=False, send_via_cmdline=False)

x.__init__(…) initializes x; see help(type(x)) for signature
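
A minimal instantiation sketch (the program paths are examples):

    from framework.targets.local import LocalTarget

    # Feed the fuzzed file to a program through its command line
    tg = LocalTarget(target_path='/usr/bin/display', tmpfile_ext='.png')

    # Or feed the data through stdin instead of a temporary file
    tg_stdin = LocalTarget(target_path='/usr/bin/myparser', send_via_stdin=True)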

__module__ = 'framework.targets.local'
_before_sending_data()
_feedback_mode = 2
cleanup()

To be overloaded if something needs to be performed after each data emission. It is called after any feedback has been retrieved.

get_description()
get_feedback(timeout=0.2)

If overloaded, should return a FeedbackCollector object.

get_post_args()
get_pre_args()
get_target_path()
initialize()

To be overloaded if some initial setup for the target is necessary.

send_data(data, from_fmk=False)

To be overloaded.

Note: use data.to_bytes() to get binary data.

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a probe or an operator)
  • data (Data) – data container that generally embeds a modeled data accessible through data.content. However, if the latter is None, it only embeds the raw data.
set_post_args(post_args)
set_pre_args(pre_args)
set_target_path(target_path)
set_tmp_file_extension(tmpfile_ext)
start()

To be overloaded if needed

stop()

To be overloaded if needed

supported_feedback_mode = [2]
terminate()

To be overloaded if some cleanup is necessary for stopping the target.

13.2.11. framework.targets.sim module

13.2.12. framework.targets.printer module

class framework.targets.printer.PrinterTarget(tmpfile_ext)

Bases: framework.target_helpers.Target

__init__(tmpfile_ext)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.targets.printer'
_feedback_mode = None
get_description()
get_printer_name()
get_target_ip()
get_target_port()
send_data(data, from_fmk=False)

To be overloaded.

Note: use data.to_bytes() to get binary data.

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a probe or an operator)
  • data (Data) – data container that generally embeds a modeled data accessible through data.content. However, if the latter is None, it only embeds the raw data.
set_printer_name(printer_name)
set_target_ip(target_ip)
set_target_port(target_port)
set_tmp_file_extension(tmpfile_ext)
start()

To be overloaded if needed

supported_feedback_mode = []
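
A typical setup sketch (printer name and IP are example values):

    from framework.targets.printer import PrinterTarget

    tg = PrinterTarget(tmpfile_ext='.ps')
    tg.set_target_ip('127.0.0.1')      # address of the printing server (example)
    tg.set_printer_name('PDF')         # printer queue to use (example)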

13.2.13. framework.targets.debug module

class framework.targets.debug.TestTarget(recover_ratio=100, fbk_samples=None)

Bases: framework.target_helpers.Target

__init__(recover_ratio=100, fbk_samples=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.targets.debug'
_feedback_mode = None
_last_ack_date = None
get_last_target_ack_date()

If different from None, the return value is used by the FMK to log the date of the target acknowledgment after a message has been sent to it.

Note: if this method is overloaded, is_target_ready_for_new_data() should also be overloaded.

is_target_ready_for_new_data()

The FMK busy-waits on this method before sending a new data. This method should take the feedback timeout into account (that is, the maximum time duration for gathering feedback from the target).

recover_target()

Implementation of target recovery operations, used when a target problem has been detected (e.g., negative feedback from a probe, an operator, or the Target() itself).

Returns:True if the target has been recovered. False otherwise.
Return type:bool
send_data(data, from_fmk=False)

To be overloaded.

Note: use data.to_bytes() to get binary data.

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a probe or an operator)
  • data (Data) – data container that generally embeds a modeled data accessible through data.content. However, if the latter is None, it only embeds the raw data.
send_multiple_data(data_list, from_fmk=False)

Used to send multiple data to the target, or to stimulate several of the target’s inputs in one shot.

Note: Use data.to_bytes() to get binary data

Parameters:
  • from_fmk (bool) – set to True if the call was performed by the framework itself, otherwise the call comes from user-code (e.g., from a Probe or an Operator)
  • data_list (list) – list of data to be sent
start()

To be overloaded if needed

supported_feedback_mode = [2]

13.2.14. framework.project module

class framework.project.Project(enable_fbk_processing=True)

Bases: object

__init__(enable_fbk_processing=True)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.project'
_feedback_processing()

core function of the feedback processing thread

add_knowledge(*info)
default_dm = None
estimate_last_data_impact_uniqueness()
feedback_gate = None
get_operator(name)
get_operators()
get_probes()
knowledge_source
map_targets_to_scenario(scenario, target_mapping)
name = None
notify_data_sending(data_list, timestamp, target)
register_evolutionary_processes(*processes)
register_feedback_handler(fbk_handler)
register_operator(name, obj)
register_probe(probe, blocking=False)
register_scenarios(*scenarios)
reset_knowledge()
reset_target_mappings()
set_data_model(dm)
set_exportable_fmk_ops(fmkops)
set_logger(logger)
set_monitor(monitor)
set_targets(targets)
start()
stop()
trigger_feedback_handlers(source, timestamp, content, status)
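
A minimal project-definition sketch using the attributes and methods listed above (names and values are illustrative; scenarios referenced here are assumed to be defined elsewhere):

    from framework.project import Project

    project = Project()
    project.default_dm = 'mydf'              # data model loaded by default (example name)

    # Scenarios and targets are typically registered alongside, e.g.:
    # project.register_scenarios(sc1, sc2)   # sc1/sc2 assumed to be Scenario objects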

13.2.15. framework.operator_helpers module

class framework.operator_helpers.LastInstruction

Bases: object

RecordData = 1
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.operator_helpers'
get_comments()
get_operator_feedback()
get_operator_status()
get_timestamp()
is_instruction_set(name)
set_comments(comments)
set_instruction(name)
set_operator_feedback(info)
set_operator_status(status_code)
class framework.operator_helpers.Operation

Bases: object

CleanupDMakers = 3
Exportable = 2
Stop = 1
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.operator_helpers'
add_instruction(actions, seed=None, tg_ids=None)
get_instructions()
is_flag_set(name)
set_flag(name)
set_status(status)
class framework.operator_helpers.Operator

Bases: object

__module__ = 'framework.operator_helpers'
__str__() <==> str(x)
_args_desc = None
_start(fmk_ops, dm, monitor, target, logger, user_input)
do_after_all(fmk_ops, dm, monitor, target, logger)

This action is executed after data has been sent to the target and all blocking probes have returned, but just before the data is logged.

Returns:Last-minute instructions you request fuddly to perform.
Return type:LastInstruction
plan_next_operation(fmk_ops, dm, monitor, target, logger, fmk_feedback)

Shall return an Operation object that contains the operations you want fuddly to perform.

Returns:Operation you want fuddly to perform.
Return type:Operation
start(fmk_ops, dm, monitor, target, logger, user_input)

To be overloaded if specific initialization code is needed. Shall return True if setup has succeeded, otherwise shall return False.

stop(fmk_ops, dm, monitor, target, logger)

To be overloaded if specific termination code is needed.

framework.operator_helpers.operator(prj, args=None)
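
A skeleton of an Operator registered through the operator() decorator shown above (the 'MYGEN' data-maker chain and the 10-iteration limit are placeholders):

    from framework.operator_helpers import operator, Operator, Operation, LastInstruction

    @operator(project)   # `project` is the Project instance of the project file
    class MyOperator(Operator):

        def start(self, fmk_ops, dm, monitor, target, logger, user_input):
            self._count = 0
            return True

        def plan_next_operation(self, fmk_ops, dm, monitor, target, logger, fmk_feedback):
            op = Operation()
            if self._count >= 10:
                op.set_flag(Operation.Stop)
                return op
            self._count += 1
            op.add_instruction(['MYGEN'])   # placeholder data-maker chain
            return op

        def do_after_all(self, fmk_ops, dm, monitor, target, logger):
            linst = LastInstruction()
            linst.set_instruction(LastInstruction.RecordData)
            return linst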

13.2.16. framework.logger module

class framework.logger.Logger(name=None, prefix='', record_data=False, explicit_data_recording=False, export_orig=True, export_raw_data=True, console_display_limit=800, enable_file_logging=False)

Bases: object

The Logger is used for keeping the history of the communication with the Target. The methods are used by the framework, but can also be leveraged by an Operator.

__init__(name=None, prefix='', record_data=False, explicit_data_recording=False, export_orig=True, export_raw_data=True, console_display_limit=800, enable_file_logging=False)
Parameters:
  • name (str) – Name to be used in the log filenames. If not specified, the name of the project in which the logger is embedded will be used.
  • record_data (bool) – If True, each emitted data will be stored in a specific file within exported_data/.
  • explicit_data_recording (bool) – Used for logging outcomes further to an Operator instruction. If True, the operator has to state explicitly if it wants the just-emitted data to be recorded. Such notification is possible when the framework calls its method framework.operator_helpers.Operator.do_after_all(), where the Operator can take its decision after observing the target feedback and/or probe outputs.
  • export_orig (bool) – If True, will also log the original data on which disruptors have been called.
  • export_raw_data (bool) – If True, will log the data as it is, without trying to interpret it as human readable text.
  • console_display_limit (int) – maximum number of characters to display on the console at once. If this threshold is exceeded, the message printed on the console will be truncated.
  • prefix (str) – prefix to use for printing on the console.
  • enable_file_logging (bool) – If True, file logging will be enabled.
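
For example, a project file could instantiate its logger as follows (values are illustrative):

    from framework.logger import Logger

    logger = Logger(record_data=False, explicit_data_recording=False,
                    export_raw_data=False, enable_file_logging=True)
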
__module__ = 'framework.logger'
__str__() <==> str(x)
_encode_target_feedback(feedback)
_export_data_func(data, suffix='')
_handle_binary_content(content, raw=False)
_log_feedback(source, content, status_code, timestamp, record=True)
_process_target_feedback(feedback)
collect_feedback(content, status_code=None)

Used within the scope of the Logger feedback-collector infrastructure. If your target implements the interface Target.get_feedback(), there is no need to use this infrastructure.

To be called by the target each time feedback needs to be registered.

Parameters:
  • content – feedback record
  • status_code (int) – should be negative for error
commit_data_table_entry(group_id, prj_name)
fmkDB = None
log_collected_feedback(preamble=None, epilogue=None)

Used within the scope of the Logger feedback-collector feature. If your target implements the interface Target.get_feedback(), there is no need to use this infrastructure.

It allows retrieving the collected feedback that has been populated by the target (through calls to Logger.collect_feedback()).

Parameters:
  • preamble (str) – prefix added to each collected feedback
  • epilogue (str) – suffix added to each collected feedback
Returns:True if target feedback has been collected through the logger infrastructure (Logger.collect_feedback()), False otherwise.
Return type:bool

log_comment(comment)
log_data(data, verbose=False)
log_data_info(data_info, dmaker_type, data_maker_name)
log_disruptor_info(dmaker_type, name, user_input)
log_dmaker_step(num)
log_error(err_msg)
log_fmk_info(info, nl_before=False, nl_after=False, rgb=6750207, data_id=None, do_record=True, delay_recording=False)
log_generator_info(dmaker_type, name, user_input, data_id=None, disabled=False)
log_info(info)
log_operator_feedback(operator, content, status_code, timestamp)
log_orig_data(data)
log_probe_feedback(probe, content, status_code, timestamp, related_tg=None)
log_target_ack_date()
log_target_feedback_from(source, content, status_code, timestamp, preamble=None, epilogue=None)
print_console(msg, nl_before=True, nl_after=False, rgb=None, style=None, raw_limit=None, limit_output=True)
reset_current_state()
set_target_ack_date(tg_ref, date)
shall_record()
start()
start_new_log_entry(preamble='')
stop()

13.2.17. framework.monitor module

exception framework.monitor.AddExistingProbeToMonitorError(probe_name)

Bases: exceptions.Exception

Raised when a probe is added a second time to a monitor

__init__(probe_name)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
probe_name
class framework.monitor.Backend(codec='latin_1')

Bases: object

__init__(codec='latin_1')
Parameters:codec (str) – codec used by the monitored system to answer.
__module__ = 'framework.monitor'
_exec_command(cmd)
_start()
_stop()
exec_command(cmd)
start()
stop()
exception framework.monitor.BackendError

Bases: exceptions.Exception

__module__ = 'framework.monitor'
class framework.monitor.BlockingProbeUser(probe, after_target_feedback_retrieval)

Bases: framework.monitor.ProbeUser

__init__(probe, after_target_feedback_retrieval)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
_clear()

Clear all events

_notify_armed()
_notify_status_retrieved()
_run(*args, **kwargs)
_wait_for_data_ready()

Wait on a request to arm

Returns:True if the arm event happened, False if a stop was asked or an error was signaled
Return type:bool
_wait_for_fmk_sync()

Wait on a blocking event: data sent or timeout

Returns:True if the blocking event happened, False if a stop was asked or an error was signaled
Return type:bool
after_target_feedback_retrieval
notify_blocking()
notify_data_ready()
notify_error()

Informs the probe of an error

stop()
wait_until_armed(timeout=None)
wait_until_ready(timeout=None)
class framework.monitor.Monitor

Bases: object

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
_get_probe_ref(probe)
_wait_for_specific_probes(probe_user_class, probe_user_wait_method, probes=None, timeout=None)

Wait for probes to trigger a specific event

Parameters:
  • probe_user_class (ProbeUser) – probe_user class that defines the method.
  • probe_user_wait_method (method) – name of the probe_user’s method that will be used to wait.
  • probes (list of ProbeUser) – probes to wait for. If None all probes will be concerned.
  • timeout (float) – maximum time to wait for in seconds.
add_probe(probe, blocking=False, after_target_feedback_retrieval=False)
configure_probe(probe, *args)
disable_hooks()
enable_hooks()
get_probe_delay(probe)
get_probe_status(probe)
get_probes_names()
is_probe_launched(probe)
is_probe_stuck(probe)
is_target_ok()
iter_probes()
notify_data_sending_event()
notify_error()
notify_imminent_data_sending()
notify_target_feedback_retrieval()
set_data_model(dm)
set_fmk_ops(fmk_ops)
set_logger(logger)
set_probe_delay(probe, delay)
set_targets(targets)
start()
start_probe(probe, related_tg=None)
stop()
stop_all_probes()
stop_probe(probe)
target_status
wait_for_probe_initialization()
wait_for_probe_status_retrieval()
class framework.monitor.Probe(delay=1.0)

Bases: object

__init__(delay=1.0)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
__str__() <==> str(x)
_start(dm, target, logger)
_stop(dm, target, logger)
arm(dm, target, logger)

Only used by blocking probes. Called by the framework just before sending a data.

Parameters:
  • dm – the current data model
  • target – the current target
  • logger – the current logger
configure(*args)

(Optional method) To be overloaded with any signature that fits your needs. It can be called by user code through framework.monitor.Monitor.configure_probe(). A typical use case is to call it from a framework.operator_helpers.Operator.

Parameters:*args – anything that fits your needs
delay
main(dm, target, logger)

To be overloaded by user-code

In the case of a basic probe, this method will be called in a loop following a period specified within the associated project file.

In the case of a blocking probe, this method will be called by the framework just after having sent a data (or a batch of data).

Parameters:
  • dm – the current data model
  • target – the current target
  • logger – the current logger
Returns:negative status if something is wrong
Return type:ProbeStatus

reset()

To be overloaded by user-code (if needed).

Called each time the probe status is retrieved by the framework (through Monitor.get_probe_status()). Especially useful for periodic probes that may need to be reset after each data sending.

Note: shall be stateless and reentrant.

start(dm, target, logger)

Probe initialization

Returns:may return a status or None
Return type:ProbeStatus
status
stop(dm, target, logger)
class framework.monitor.ProbeMem

Bases: framework.monitor.Probe

Generic probe that enables you to monitor the memory consumption (RSS, …) of a process. This is done by specifying a threshold and/or a tolerance ratio.

The monitoring can be done through different backends (e.g., SSH_Backend, Serial_Backend).

backend

backend to be used (e.g., SSH_Backend).

Type:Backend
process_name

name of the process to monitor.

Type:str
threshold

memory (RSS) threshold that the monitored process should not exceed. (dimension should be the same as what is provided by the ps command of the system under test)

Type:int
tolerance

tolerance expressed in percentage of the memory (RSS) the process was using at the beginning of the monitoring (or after each time the tolerance has been exceeded).

Type:int
command_pattern

format string for the SSH command. ‘{0:s}’ refers to the process name.

Type:str
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
_get_mem()
backend = None
command_pattern = 'ps -e -orss,comm | grep {0:s}'
main(dm, target, logger)

To be overloaded by user-code

In the case of a basic probe, this method will be called in a loop following a period specified within the associated project file.

In the case of a blocking probe, this method will be called by the framework just after having sent a data (or a batch of data).

Parameters:
  • dm – the current data model
  • target – the current target
  • logger – the current logger
Returns:negative status if something is wrong
Return type:ProbeStatus

process_name = None
reset()

To be overloaded by user-code (if needed).

Called each time the probe status is retrieved by the framework (through Monitor.get_probe_status()). Especially useful for periodic probes that may need to be reset after each data sending.

Note: shall be stateless and reentrant.

start(dm, target, logger)

Probe initialization

Returns:may return a status or None
Return type:ProbeStatus
stop(dm, target, logger)
threshold = None
tolerance = 2
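
A sketch of a concrete memory probe built from the attributes above (process name, threshold and SSH credentials are placeholders):

    from framework.monitor import probe, ProbeMem, SSH_Backend

    @probe(project)   # `project` is the Project instance of the project file
    class mem_check(ProbeMem):
        process_name = 'my_server'        # placeholder process under test
        threshold = 120000                # RSS limit, same unit as the `ps` output
        tolerance = 5
        backend = SSH_Backend(username='user', password='pass',
                              sshd_ip='192.168.0.20', sshd_port=22)
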
class framework.monitor.ProbePID

Bases: framework.monitor.Probe

Generic probe that enables you to monitor a process PID.

The monitoring can be done through different backends (e.g., SSH_Backend, Serial_Backend).

backend

backend to be used (e.g., SSH_Backend).

Type:Backend
process_name

name of the process to monitor.

Type:str
max_attempts

maximum number of attempts for getting the process ID.

Type:int
delay_between_attempts

delay in seconds between each attempt.

Type:float
delay

delay before retrieving the process PID.

Type:float
command_pattern

format string for the SSH command. ‘{0:s}’ refers to the process name.

Type:str
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
_get_pid(logger)
backend = None
command_pattern = 'pgrep {0:s}'
delay = 0.5
delay_between_attempts = 0.1
main(dm, target, logger)

To be overloaded by user-code

In the case of a basic probe, this method will be called in a loop following a period specified within the associated project file.

In the case of a blocking probe, this method will be called by the framework just after having sent a data (or a batch of data).

Parameters:
  • dm – the current data model
  • target – the current target
  • logger – the current logger
Returns:negative status if something is wrong
Return type:ProbeStatus

max_attempts = 10
process_name = None
start(dm, target, logger)

Probe initialization

Returns:may return a status or None
Return type:ProbeStatus
stop(dm, target, logger)
class framework.monitor.ProbeStatus(status=None, info=None)

Bases: object

__init__(status=None, info=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
get_private_info()
get_timestamp()
set_private_info(pv)
set_timestamp()
value
exception framework.monitor.ProbeTimeoutError(probe_name, timeout, blocking_methods=None)

Bases: exceptions.Exception

Raised when a probe is considered stuck

__init__(probe_name, timeout, blocking_methods=None)
Parameters:
  • probe_name (str) – name of the probe where the timeout occurred
  • timeout (float) – time the probe waited before its timeout
  • blocking_methods (list of str) – list of probe_methods where the timeout may have happened
__module__ = 'framework.monitor'
blocking_methods
probe_name
timeout
class framework.monitor.ProbeUser(probe)

Bases: object

__init__(probe)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.monitor'
_clear()

Clear all events

_go_on()
_handle_exception(context)
_notify_probe_started()
_run(*args, **kwargs)
_wait(delay)
_wait_for_probe(event, timeout=None)

Wait for the probe to trigger a specific event

get_probe_delay()
get_probe_status()
is_alive()
is_stuck()

Tells if the probe has to be considered stuck by the monitor: i.e. if it is really stuck or if its stop was not acknowledged

join(timeout=None)
probe
probe_init_timeout = 10.0
set_probe_delay(delay)
start(*args, **kwargs)
stop()
timeout = 5.0
wait_for_probe_init(timeout=None)
class framework.monitor.SSH_Backend(username, password, sshd_ip, sshd_port=22, codec='latin_1')

Bases: framework.monitor.Backend

Backend to execute commands over an SSH connection.

__init__(username, password, sshd_ip, sshd_port=22, codec='latin_1')
Parameters:
  • sshd_ip (str) – IP of the SSH server.
  • sshd_port (int) – port of the SSH server.
  • username (str) – username to connect with.
  • password (str) – password related to the username.
  • codec (str) – codec used by the monitored system to answer.
__module__ = 'framework.monitor'
_exec_command(cmd)
_start()
_stop()
class framework.monitor.Serial_Backend(serial_port, baudrate=115200, bytesize=8, parity='N', stopbits=1, xonxoff=False, rtscts=False, dsrdtr=False, username=None, password=None, slowness_factor=5, cmd_notfound='command not found', codec='latin_1')

Bases: framework.monitor.Backend

Backend to execute commands through a serial line.

__init__(serial_port, baudrate=115200, bytesize=8, parity='N', stopbits=1, xonxoff=False, rtscts=False, dsrdtr=False, username=None, password=None, slowness_factor=5, cmd_notfound='command not found', codec='latin_1')
Parameters:
  • serial_port (str) – path to the tty device file. (e.g., ‘/dev/ttyUSB0’)
  • baudrate (int) – baud rate of the serial line.
  • bytesize (int) – number of data bits. (5, 6, 7, or 8)
  • parity (str) – parity checking. (‘N’, ‘O’, ‘E’, ‘M’, or ‘S’)
  • stopbits (int) – number of stop bits. (1, 1.5 or 2)
  • xonxoff (bool) – enable software flow control.
  • rtscts (bool) – enable hardware (RTS/CTS) flow control.
  • dsrdtr (bool) – enable hardware (DSR/DTR) flow control.
  • username (str) – username to connect with. If None, no authentication step will be attempted.
  • password (str) – password related to the username.
  • slowness_factor (int) – characterizes the slowness of the monitored system. The scale goes from 1 (fastest) to 10 (slowest). This factor is a base metric used to compute the time to wait for the authentication step to terminate (if the username and password parameters are provided) and for other operations that involve waiting for the monitored system.
  • cmd_notfound (bytes) – pattern used to detect if the command does not exist on the monitored system.
  • codec (str) – codec used to send/receive information through the serial line
__module__ = 'framework.monitor'
_exec_command(cmd)
_read_serial(duration)
_start()
_stop()
class framework.monitor.Shell_Backend(timeout=None, codec='latin_1')

Bases: framework.monitor.Backend

Backend to execute shell commands locally

__init__(timeout=None, codec='latin_1')
Parameters:
  • timeout (float) – timeout in seconds for reading the result of the command
  • codec (str) – codec used by the monitored system to answer.
__module__ = 'framework.monitor'
_exec_command(cmd)
_start()
_stop()
framework.monitor.blocking_probe(project, after_target_feedback_retrieval=False)
framework.monitor.probe(project)
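
A minimal custom probe registered through the probe() decorator above (the check performed in main() is illustrative):

    from framework.monitor import probe, Probe, ProbeStatus

    @probe(project)   # `project` is the Project instance of the project file
    class health_check(Probe):

        def start(self, dm, target, logger):
            return ProbeStatus(0)

        def main(self, dm, target, logger):
            # Illustrative check: a negative status notifies the framework of a problem
            ok = target.is_target_ready_for_new_data()
            status = ProbeStatus(0 if ok else -1)
            status.set_private_info('target ready: {!s}'.format(ok))
            return status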

13.2.18. framework.tactics_helpers module

class framework.tactics_helpers.DataMaker

Bases: object

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
_args_desc = None
_modelwalker_user = False
knowledge_source = None
modelwalker_user
set_exportable_fmk_ops(fmkops)
class framework.tactics_helpers.DataMakerAttr
Active = 1
Controller = 2
HandOver = 3
NeedSeed = 5
SetupRequired = 4
__module__ = 'framework.tactics_helpers'
class framework.tactics_helpers.Disruptor

Bases: framework.tactics_helpers.DataMaker

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
_cleanup()
_setup(dm, user_input)
cleanup(fmkops)

--> Specific code

clear_attr(name)
disrupt_data(dm, target, prev_data)
is_attr_set(name)
set_attr(name)
setup(dm, user_input)

--> Specific code. Return True if setup has succeeded, otherwise return False.

class framework.tactics_helpers.DynGenerator

Bases: framework.tactics_helpers.Generator

__module__ = 'framework.tactics_helpers'
_args_desc = {'determinist': ('make the data model determinist', False, <type 'bool'>), 'finite': ('make the data model finite', False, <type 'bool'>), 'random': ('make the data model random', False, <type 'bool'>)}
data_id = ''
generate_data(dm, monitor, target)
setup(dm, user_input)

--> Specific code. Return True if setup has succeeded, otherwise return False.

class framework.tactics_helpers.DynGeneratorFromScenario

Bases: framework.tactics_helpers.Generator

_DynGeneratorFromScenario__handle_transition_callbacks(hook, feedback=None)
__module__ = 'framework.tactics_helpers'
_alter_data_step()
_alter_transition_conditions()
_args_desc = {'cond_fuzz': ('For each scenario step having guarded transitions, a new scenario is created where transition conditions are inverted. [compatible with ignore_timing]', False, <type 'bool'>), 'data_fuzz': ('For each scenario step that generates data, a new scenario is created where the data generated by the step is fuzzed.', False, <type 'bool'>), 'graph': ('Display the scenario and highlight the current step each time the generator is called.', False, <type 'bool'>), 'graph_format': ('Format to be used for displaying the scenario (e.g., xdot, pdf, png).', 'xdot', <type 'str'>), 'ignore_timing': ('For each scenario step enforcing a timing constraint, a new scenario is created where any timeout conditions are removed (i.e., set to 0 second). [compatible with cond_fuzz]', False, <type 'bool'>), 'init': ("Used in combination with 'data_fuzz', 'cond_fuzz', or 'ignore_timing'. Make the generator begin with the Nth corrupted scenario (where N is provided through this parameter).", 0, <type 'int'>), 'reset': ("If set, scenarios created by 'data_fuzz', 'cond_fuzz', or 'ignore_timing' will reinitialize the scenario after each corruption case, without waiting for the normal continuation of the scenario.", True, <type 'bool'>), 'stutter': ("For each scenario step that generates data, a new scenario is created where the step is altered to stutter 'stutter_max' times, meaning that data-sending steps would be triggered 'stutter_max' times.", False, <type 'bool'>), 'stutter_max': ("The number of times a step will stutter [to be used with 'stutter']", 2, <type 'int'>)}
_callback_cleanup_periodic()
_callback_dispatcher_after_fbk(fbk)

This callback is always called by the framework

_callback_dispatcher_after_sending()
_callback_dispatcher_before_sending_step1()
_callback_dispatcher_before_sending_step2()
_check_data_fuzz_completion_cbk(env, step)
_cleanup_walking_attrs()
_make_step_stutter()
_stutter_cbk(env, current_step, next_step)
cleanup(fmkops)

--> Specific code

generate_data(dm, monitor, target)
graph_scenario(fmt, select_current=False)
produced_seed
scenario = None
setup(dm, user_input)

--> Specific code. Return True if setup has succeeded, otherwise return False.

class framework.tactics_helpers.Generator

Bases: framework.tactics_helpers.DataMaker

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
_cleanup()
_setup(dm, user_input)
cleanup(fmkops)

--> Specific code

clear_attr(name)
generate_data(dm, monitor, target)
is_attr_set(name)
need_reset()
produced_seed = None
set_attr(name)
setup(dm, user_input)

--> Specific code. Return True if setup has succeeded, otherwise return False.

class framework.tactics_helpers.StatefulDisruptor

Bases: framework.tactics_helpers.DataMaker

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
_cleanup()
_set_seed(prev_data)
_setup(dm, user_input)
cleanup(fmkops)

--> Specific code

clear_attr(name)
disrupt_data(dm, target, data)

@data: it is either equal to prev_data the first time disrupt_data() is called by the FMK, or it is an empty data (that is, Data()).

handover()
is_attr_set(name)
set_attr(name)
set_seed(prev_data)
setup(dm, user_input)

--> Specific code. Return True if setup has succeeded, otherwise return False.

class framework.tactics_helpers.Tactics

Bases: object

_Tactics__clear_dmaker_clones(dmaker, dmaker_clones)
_Tactics__clone_dmaker(dmaker, dmaker_clones, dmaker_type, new_dmaker_type, dmaker_name=None, register_func=None)
_Tactics__get_random_data_maker(dict_var, dmaker_type, total_weight, valid)
_Tactics__register_new_data_maker(dict_var, name, obj, weight, dmaker_type, valid)
_Tactics__set_data_maker_weight(dict_var, dmaker_type, name, weight)
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
clear_disruptor_clones()
clear_generator_clones()
clone_disruptor(dmaker_type, new_dmaker_type=None, dmaker_name=None)
clone_generator(dmaker_type, new_dmaker_type=None, dmaker_name=None)
disruptor_types
generator_types
get_datatype_total_weight(dmaker_type)
get_disruptor_name(dmaker_type, obj)
get_disruptor_obj(dmaker_type, name)
get_disruptor_validness(dmaker_type, name)
get_disruptor_weight(dmaker_type, name)
get_disruptors_list(dmaker_type)
get_dmaker_type_total_weight(dmaker_type)
get_generator_name(dmaker_type, obj)
get_generator_obj(dmaker_type, name)
get_generator_validness(dmaker_type, name)
get_generator_weight(dmaker_type, name)
get_generators_list(dmaker_type)
get_info_from_obj(obj)
get_random_disruptor(dmaker_type, valid)
get_random_generator(dmaker_type, valid)
print_disruptor(dmaker_type, disruptor_name)
print_generator(dmaker_type, generator_name)
register_new_disruptor(name, obj, weight, dmaker_type, valid=False)
register_new_generator(name, obj, weight, dmaker_type, valid=False)
register_scenarios(*scenarios)
static scenario_ref_from(scenario)
set_disruptor_weight(dmaker_type, name, weight)
set_exportable_fmk_ops(fmkops)
set_generator_weight(dmaker_type, name, weight)
framework.tactics_helpers._handle_user_inputs(dmaker, user_input)
framework.tactics_helpers._restore_dmaker_internals(dmaker)
framework.tactics_helpers._user_input_conformity(self, user_input, _args_desc)
framework.tactics_helpers.disruptor(st, dtype, weight=1, valid=False, args=None, modelwalker_user=False)
class framework.tactics_helpers.dyn_generator(name, bases, attrs)

Bases: type

__init__(name, bases, attrs)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
data_id = ''
class framework.tactics_helpers.dyn_generator_from_scenario(name, bases, attrs)

Bases: type

__init__(name, bases, attrs)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.tactics_helpers'
scenario = None
framework.tactics_helpers.generator(st, gtype, weight=1, valid=False, args=None, modelwalker_user=False)
framework.tactics_helpers.modelwalker_inputs_handling_helper(dmaker)
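
A sketch showing how the disruptor() decorator and the Disruptor class fit together in a tactics file ('EX_TWEAK' is a placeholder dmaker type):

    from framework.tactics_helpers import Tactics, Disruptor, disruptor

    tactics = Tactics()

    @disruptor(tactics, dtype="EX_TWEAK", weight=1)
    class d_example(Disruptor):

        def setup(self, dm, user_input):
            return True

        def disrupt_data(self, dm, target, prev_data):
            # Illustrative pass-through: annotate the data and hand it back unchanged
            prev_data.add_info('EX_TWEAK applied')
            return prev_data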

13.2.19. framework.fuzzing_primitives module

class framework.fuzzing_primitives.AltConfConsumer(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

Bases: framework.fuzzing_primitives.NodeConsumerStub

Note: save_node()/restore_node() are not overloaded although the default implementation can trigger overhead, because in some cases copying the Elt is the better option (e.g., for alternate confs on non-terminal nodes, which reuse the same subnodes over the various confs).

__module__ = 'framework.fuzzing_primitives'
consume_node(node)

Use this method to modify/alter or just read information on @node. This function will be called for each node that satisfies the criteria. (to be implemented according to the implementation of need_reset())

Return True to say that you have correctly consumed the node. Return False if, despite your current criteria for node interest, you are in fact not interested.

init_specific(**kwargs)
need_reset(node)
recover_node(node)

Generic way to recover a node

save_node(node)

Generic way to save a node (can impact performance)

still_interested_by(node)
wait_for_exhaustion(node)
  • return -1 to wait until exhaustion
  • return 0 to stop node iteration after consumption (and yielding a value once)
  • return N-1 to stop iteration after at most N steps (or before, if exhaustion triggers first)
class framework.fuzzing_primitives.BasicVisitor(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

Bases: framework.fuzzing_primitives.NodeConsumerStub

__module__ = 'framework.fuzzing_primitives'
consume_node(node)

Use this method to modify/alter or just read information on @node. This function will be called for each node that satisfies the criteria. (to be implemented according to the implementation of need_reset())

Return True to say that you have correctly consumed the node. Return False if, despite your current criteria for node interest, you are in fact not interested.

init_specific(**kwargs)
need_reset(node)
recover_node(node)

Generic way to recover a node

save_node(node)

Generic way to save a node (can impact performance)

wait_for_exhaustion(node)
  • return -1 to wait until exhaustion
  • return 0 to stop node iteration after consumption (and yielding a value once)
  • return N-1 to stop iteration after at most N steps (or before, if exhaustion triggers first)
class framework.fuzzing_primitives.ModelWalker(root_node, node_consumer, make_determinist=False, make_random=False, max_steps=-1, initial_step=1)

Bases: object

We walk through all states of the model and give the Consumer the opportunity to act on each node, and to be involved in the walking process to some extent.

The first rule of the walking process is to step up to a node exhaustion (which means that the consume_node() method of the Consumer won’t be called in-between)

Note: the change of a non-terminal node does not reset the indirect parents (just the direct parent), otherwise it could lead to a combinatorial explosion, with limited interest…

__init__(root_node, node_consumer, make_determinist=False, make_random=False, max_steps=-1, initial_step=1)

x.__init__(…) initializes x; see help(type(x)) for signature

__iter__()
__module__ = 'framework.fuzzing_primitives'
_do_reset(node)
node_consumer_helper(node, structure_has_changed, consumed_nodes, parent_node)
set_consumer(node_consumer)
walk_graph_rec(node_list, structure_has_changed, consumed_nodes, parent_node)
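
A usage sketch, assuming root_node is the root node of a modeled data; each iteration is treated here as an opaque step whose first element is assumed to be the (possibly altered) root node:

    from framework.fuzzing_primitives import ModelWalker, TypedNodeDisruption

    consumer = TypedNodeDisruption(max_runs_per_node=1, min_runs_per_node=1)
    walker = ModelWalker(root_node, consumer, make_determinist=True, max_steps=50)

    for step in walker:
        altered_root = step[0]   # assumption: first element of the yielded tuple
        # ... hand altered_root over to the framework, log it, etc.
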
class framework.fuzzing_primitives.NodeConsumerStub(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

Bases: object

__init__(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.fuzzing_primitives'
consume_node(node)

Use this method to modify/alter or just read information on @node. This function will be called for each node that satisfies the criteria. (to be implemented according to the implementation of need_reset())

Return True to say that you have correctly consumed the node. Return False if, despite your current criteria for node interest, you are in fact not interested.

do_after_reset(node)
init_specific(**kwargs)
interested_by(node)
max_nb_runs_for(node)
need_reset(node)
preload(root_node)

Called by the ModelWalker when it initializes

Parameters:root_node – Root node of the modeled data

Returns: None

recover_node(node)

Generic way to recover a node

save_node(node)

Generic way to save a node (can impact performance)

set_node_interest(internals_criteria=None, semantics_criteria=None, owned_confs=None, path_regexp=None, conf=None)

@conf: criteria are applied to the provided conf if not None, otherwise current_conf is used. Note: when all parameters are None, the NodeConsumer is interested in every node (that is, interested_by() always returns True).

still_interested_by(node)
wait_for_exhaustion(node)
  • return -1 to wait until exhaustion
  • return 0 to stop node iteration after consumption (and yielding a value once)
  • return N-1 to stop iteration after at most N steps (or before, if exhaustion triggers first)
class framework.fuzzing_primitives.NonTermVisitor(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

Bases: framework.fuzzing_primitives.BasicVisitor

__module__ = 'framework.fuzzing_primitives'
consume_node(node)

Use this method to modify/alter or just read information on @node. This function will be called for each node that satisfies the criteria. (to be implemented according to the implementation of need_reset())

Return True to say that you have correctly consumed the node. Return False if, despite your current criteria for node interest, you are in fact not interested.

init_specific(**kwargs)
need_reset(node)
still_interested_by(node)
wait_for_exhaustion(node)
  • return -1 to wait until exhaustion
  • return 0 to stop node iteration after consumption (and yielding a value once)
  • return N-1 to stop iteration after at most N steps (or before, if exhaustion triggers first)
class framework.fuzzing_primitives.SeparatorDisruption(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

Bases: framework.fuzzing_primitives.NodeConsumerStub

__module__ = 'framework.fuzzing_primitives'
consume_node(node)

Use this method to modify/alter or just read information on @node. This function will be called for each node that satisfies the criteria. (to be implemented according to the implementation of need_reset())

Return True to say that you have correctly consumed the node. Return False if, despite your current criteria for node interest, you are in fact not interested.

init_specific(separators=None)
class framework.fuzzing_primitives.TypedNodeDisruption(max_runs_per_node=-1, min_runs_per_node=-1, respect_order=True, fuzz_magnitude=1.0, fix_constraints=False, **kwargs)

Bases: framework.fuzzing_primitives.NodeConsumerStub

__module__ = 'framework.fuzzing_primitives'
_add_separator_cases(vt_node)
_populate_fuzzy_vt_list(vt_node, fuzz_magnitude)
consume_node(node)

Use this method to modify/alter or just read information on @node. This function will be called for each node that satisfies the criteria. (to be implemented according to the implementation of need_reset())

Return True to say that you have correctly consumed the node. Return False if, despite your current criteria for node interest, you are in fact not interested.

init_specific(ignore_separator=False, enforce_determinism=True)
need_reset(node)
preload(root_node)

Called by the ModelWalker when it initializes

Parameters:root_node – Root node of the modeled data

Returns: None

recover_node(node)

Generic way to recover a node

save_node(node)

Generic way to save a node (can impact performance)

still_interested_by(node)
framework.fuzzing_primitives.fuzz_data_tree(top_node, paths_regexp=None)

13.2.20. framework.encoders module

class framework.encoders.BitReverse_Enc(encoding_arg=None)

Bases: framework.encoders.Encoder

__module__ = 'framework.encoders'
_reverse_bits(x, nb_bits=8)

Reverse the bit order of x

decode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
class framework.encoders.Encoder(encoding_arg=None)

Bases: object

__copy__()
__init__(encoding_arg=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.encoders'
decode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg)

To be optionally overloaded by a subclass that deals with encoding, if the encoding needs to be initialized in some way. (called at init and in String.reset())

Parameters:arg – provided through the encoding_arg parameter of the String constructor
reset()
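
A minimal custom encoder following the interface above (the XOR scheme and default key are purely illustrative):

    from framework.encoders import Encoder

    class XOR_Enc(Encoder):

        def init_encoding_scheme(self, arg):
            # `arg` comes from the encoding_arg parameter of the String constructor
            self.key = 0x5A if arg is None else arg

        def encode(self, val):
            return bytes(bytearray(b ^ self.key for b in bytearray(val)))

        def decode(self, val):
            # XOR is symmetric, so decoding is identical to encoding
            return self.encode(val)
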
class framework.encoders.GSM7bitPacking_Enc(encoding_arg=None)

Bases: framework.encoders.Encoder

__module__ = 'framework.encoders'
decode(msg)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(msg)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
class framework.encoders.GSMPhoneNum_Enc(encoding_arg=None)

Bases: framework.encoders.Encoder

__module__ = 'framework.encoders'
decode(msg)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(msg)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
class framework.encoders.GZIP_Enc(encoding_arg=None)

Bases: framework.encoders.Encoder

__module__ = 'framework.encoders'
decode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg=None)

To be optionally overloaded by a subclass that deals with encoding, if the encoding needs to be initialized in some way. (called at init and in String.reset())

Parameters:arg – provided through the encoding_arg parameter of the String constructor
class framework.encoders.Wrap_Enc(encoding_arg=None)

Bases: framework.encoders.Encoder

Encoder to be used as a means to wrap a Node with a prefix and/or a suffix, without defining specific Nodes for that (meaning you don’t need to model that part and want to simplify your data description).

__module__ = 'framework.encoders'
decode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the encoded value
Returns:the decoded value
Return type:bytes
encode(val)

To be overloaded. (Should be stateless.)

Parameters:val (bytes) – the value
Returns:the encoded value
Return type:bytes
init_encoding_scheme(arg)

Take a list parameter specifying the prefix and the suffix to add to the value to encode, or to remove from an encoded value.

Parameters:arg (list) – Prefix and suffix character strings. Can be individually set to None

13.2.21. framework.database module

class framework.database.Database(fmkdb_path=None)

Bases: object

DDL_fname = 'fmk_db.sql'
DEFAULT_DM_NAME = '__DEFAULT_DATAMODEL'
DEFAULT_GEN_NAME = '__DEFAULT_GNAME'
DEFAULT_GTYPE_NAME = '__DEFAULT_GTYPE'
OUTCOME_DATA = 2
OUTCOME_ROWID = 1
__init__(fmkdb_path=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.database'
_get_color_function(colorized)
_handle_binary_content(content, sz_limit=None, raw=False, colorized=True)
_is_valid(connection, cursor)
_sql_handler()
_stop_sql_handler()
check_data_existence(data_id, colorized=True)
disable()
display_data_info(data_id, with_data=False, with_fbk=False, with_fmkinfo=True, with_analysis=True, fbk_src=None, limit_data_sz=None, page_width=100, colorized=True, raw=False, decoding_hints=None, dm_list=None)
display_data_info_by_date(start, end, with_data=False, with_fbk=False, with_fmkinfo=True, with_analysis=True, fbk_src=None, prj_name=None, limit_data_sz=None, raw=False, page_width=100, colorized=True, decoding_hints=None, dm_list=None)
display_data_info_by_range(first_id, last_id, with_data=False, with_fbk=False, with_fmkinfo=True, with_analysis=True, fbk_src=None, prj_name=None, limit_data_sz=None, raw=False, page_width=100, colorized=True, decoding_hints=None, dm_list=None)
display_stats(colorized=True)
enable()
execute_sql_statement(sql_stmt, params=None)
export_data(first, last=None, colorized=True)
fetch_data(start_id=1, end_id=-1)
flush_current_feedback()
get_data_with_impact(prj_name=None, fbk_src=None, display=True, verbose=False, raw_analysis=False, colorized=True)
get_data_with_specific_fbk(fbk, prj_name=None, fbk_src=None, display=True, colorized=True)
get_data_without_fbk(prj_name=None, fbk_src=None, display=True, colorized=True)
get_project_record(prj_name=None)
insert_analysis(data_id, content, date, impact=False)
insert_comment(data_id, content, date)
insert_data(dtype, dm_name, raw_data, sz, sent_date, ack_date, target_ref, prj_name, group_id=None)
insert_data_model(dm_name)
insert_dmaker(dm_name, dtype, name, is_gen, stateful, clone_type=None)
insert_feedback(data_id, source, timestamp, content, status_code=None)
insert_fmk_info(data_id, content, date, error=False)
insert_project(prj_name)
insert_steps(data_id, step_id, dmaker_type, dmaker_name, data_id_src, user_input, info)
iter_last_feedback_entries(source=None)
remove_data(data_id, colorized=True)
start()
stop()
submit_sql_stmt(stmt, params=None, outcome_type=None, error_msg='')

This method is the only one that should submit requests to the threaded SQL handler. It is also synchronized to guarantee request order (especially needed when you wait for the outcomes of your submitted SQL statement).

Parameters:
  • stmt (str) – SQL statement
  • params (tuple) – parameters
  • outcome_type (int) – type of the expected outcomes. If None, no outcomes are expected
  • error_msg (str) – specific error message to display in case of an error
Returns:None or the expected outcomes

class framework.database.FeedbackGate(database)

Bases: object

__bool__()
__init__(database)
Parameters:database (Database) – database to be associated with
__iter__()
__module__ = 'framework.database'
__nonzero__()
get_feedback_from(source)
iter_entries(source=None)

Iterate over feedback entries that are related to the last data which has been sent by the framework.

Parameters:source (FeedbackSource) – feedback source to consider
Returns:A generator that iterates over all the requested feedback entries and provides for each:
  • the triplet (status, timestamp, content) if source is associated with a specific feedback source
  • the 4-tuple (source, status, timestamp, content) if source is None
Return type:python generator
sources_names()

Return a list of the feedback source names related to the last data which has been sent by the framework.

Returns:names of the feedback sources
Return type:list
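
For instance, from an Operator or a feedback handler, the latest feedback entries can be scanned as follows (assuming project.feedback_gate is the FeedbackGate instance exposed by the project):

    for source, status, timestamp, content in project.feedback_gate.iter_entries():
        if status is not None and status < 0:
            print('negative feedback from {!s}: {!r}'.format(source, content))
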
framework.database.regexp(expr, item)
framework.database.regexp_bin(expr, item)

13.2.22. framework.scenario module

class framework.scenario.DataProcess(process, seed=None, auto_regen=False, vtg_ids=None)

Bases: object

__copy__()
__init__(process, seed=None, auto_regen=False, vtg_ids=None)

Describe a process to generate a data.

Parameters:
  • process (list) – List of disruptors (possibly complemented by parameters) to apply to a seed. However, if the list begins with a generator, the disruptor chain will apply to the outcome of the generator. The generic form for a process is: [action_1, (action_2, generic_UI_2, specific_UI_2), ... action_n] where action_N can be either: dmaker_type_N or (dmaker_type_N, dmaker_name_N)
  • seed – (Optional) Can be a registered framework.data_model.Node name or a framework.data_model.Data. Will be provided to the first disruptor in the disruptor chain (described by the parameter process) if it does not begin with a generator.
  • auto_regen (boolean) – If True, the data process will notify the framework to rerun the data maker chain after a disruptor has yielded (meaning it is exhausted with the data that has been provided to it). It will make the chain go on with new data coming either from the first non-exhausted disruptor (preceding the exhausted one), or from the generator if all disruptors are exhausted. If False, the data process won’t notify the framework to rerun the data maker chain, thus triggering the end of the scenario that embeds this data process.
  • vtg_ids (list) – Virtual ID list of the targets to which the outcomes of this data process will be sent. If None, the outcomes will be sent to the first target that has been enabled.
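
Two construction sketches based on the generic form described above ('MYGEN', 'MYDISRUPTOR' and 'my_atom' are placeholder names, not registered data makers):

    from framework.scenario import DataProcess

    # Generator followed by a disruptor; rerun the chain automatically on exhaustion
    dp = DataProcess(process=['MYGEN', 'MYDISRUPTOR'], auto_regen=True)

    # Disruptor chain applied to a seed (a registered atom name or a Data object)
    dp_seeded = DataProcess(process=['MYDISRUPTOR'], seed='my_atom')
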
__module__ = 'framework.scenario'
__repr__() <==> repr(x)
append_new_process(process)

Append a new process to the list.

formatted_str(oneliner=False)
make_blocked()
make_free()
next_process()
process
process_qty
reset()
class framework.scenario.FinalStep(data_desc=None, final=False, fbk_timeout=None, fbk_mode=None, set_periodic=None, clear_periodic=None, step_desc=None, do_before_data_processing=None, valid=True, vtg_ids=None)

Bases: framework.scenario.Step

__init__(data_desc=None, final=False, fbk_timeout=None, fbk_mode=None, set_periodic=None, clear_periodic=None, step_desc=None, do_before_data_processing=None, valid=True, vtg_ids=None)

Step objects are the building blocks of Scenarios.

Parameters:
  • data_desc
  • final
  • fbk_timeout
  • fbk_mode
  • set_periodic
  • clear_periodic
  • step_desc
  • do_before_data_processing
  • do_before_sending
  • valid
  • vtg_ids (list, int) – Virtual ID list of the targets to which the outcomes of this data process will be sent. If None, the outcomes will be sent to the first target that has been enabled. If data_desc is a list, this parameter should be a list where each item is the vtg_ids of the corresponding item in the data_desc list.
__module__ = 'framework.scenario'
class framework.scenario.NoDataStep(data_desc=None, final=False, fbk_timeout=None, fbk_mode=None, set_periodic=None, clear_periodic=None, step_desc=None, do_before_data_processing=None, valid=True, vtg_ids=None)

Bases: framework.scenario.Step

__init__(data_desc=None, final=False, fbk_timeout=None, fbk_mode=None, set_periodic=None, clear_periodic=None, step_desc=None, do_before_data_processing=None, valid=True, vtg_ids=None)

Step objects are the building blocks of Scenarios.

Parameters:
  • data_desc
  • final
  • fbk_timeout
  • fbk_mode
  • set_periodic
  • clear_periodic
  • step_desc
  • do_before_data_processing
  • do_before_sending
  • valid
  • vtg_ids (list, int) – Virtual ID list of the targets to which the outcomes of this data process will be sent. If None, the outcomes will be sent to the first target that has been enabled. If data_desc is a list, this parameter should be a list where each item is the vtg_ids of the corresponding item in the data_desc list.
__module__ = 'framework.scenario'
make_free()
class framework.scenario.Periodic(data, period=None, vtg_ids=None)

Bases: object

__init__(data, period=None, vtg_ids=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.scenario'
__str__() <==> str(x)
class framework.scenario.Scenario(name, anchor=None, reinit_anchor=None, user_context=None)

Bases: object

__copy__()
__init__(name, anchor=None, reinit_anchor=None, user_context=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.scenario'
__str__() <==> str(x)
_graph_setup(init_step, steps, transitions)
_init_main_properties()
_init_reinit_seq_properties()
_view_linux(filepath, graph_filename)

Open filepath in the user’s preferred application (linux).

_view_windows(filepath, graph_filename)

Start filepath with its associated application (windows).

anchor
branch_to_reinit(step, prepend=True)
clone(new_name)
current_step
env
graph(fmt='pdf', select_current=False, display_ucontext=True)
periodic_to_clear
reinit_steps
reinit_transitions
reset()
set_anchor(anchor, current=None)
set_data_model(dm)
set_reinit_anchor(reinit_anchor)
set_target(target)
set_user_context(user_context)
steps
transitions
walk_to(step)
walk_to_reinit()
class framework.scenario.ScenarioEnv

Bases: object

__copy__()
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.scenario'
dm
knowledge_source = None
scenario
target
user_context
class framework.scenario.Step(data_desc=None, final=False, fbk_timeout=None, fbk_mode=None, set_periodic=None, clear_periodic=None, step_desc=None, do_before_data_processing=None, do_before_sending=None, valid=True, vtg_ids=None)

Bases: object

__copy__()
__hash__() <==> hash(x)
__init__(data_desc=None, final=False, fbk_timeout=None, fbk_mode=None, set_periodic=None, clear_periodic=None, step_desc=None, do_before_data_processing=None, do_before_sending=None, valid=True, vtg_ids=None)

Step objects are the building blocks of Scenarios.

Parameters:
  • data_desc
  • final
  • fbk_timeout
  • fbk_mode
  • set_periodic
  • clear_periodic
  • step_desc
  • do_before_data_processing
  • do_before_sending
  • valid
  • vtg_ids (list, int) – Virtual ID list of the targets to which the outcomes of this data process will be sent. If None, the outcomes will be sent to the first target that has been enabled. If data_desc is a list, this parameter should be a list where each item is the vtg_ids of the corresponding item in the data_desc list.
__module__ = 'framework.scenario'
__str__() <==> str(x)
_handle_data_desc(data_desc)
cleanup()
connect_to(step, cbk_after_sending=None, cbk_after_fbk=None, prepend=False)
content

Provide the atom of the step if possible. In the case of a DataProcess, if it has been carried out, then the resulting atom is returned, otherwise the seed atom is returned if it exists.

Provide an atom list if the step contains multiple atoms.

data_desc
do_before_data_processing()
do_before_sending()
feedback_mode
feedback_timeout
get_data()
get_description()
get_periodic_description()
get_periodic_ref()
has_dataprocess()
is_blocked()
is_periodic_cleared()
is_periodic_set()
make_blocked()
make_free()
periodic_to_clear
periodic_to_set
set_scenario_env(env)
set_transitions(transitions)
transitions
class framework.scenario.Transition(step, cbk_after_sending=None, cbk_after_fbk=None)

Bases: object

__copy__()
__hash__() <==> hash(x)
__init__(step, cbk_after_sending=None, cbk_after_fbk=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.scenario'
__str__() <==> str(x)
has_callback()
has_callback_pending()
invert_conditions()
is_crossable()
make_uncrossable()
register_callback(callback, hook=<HOOK.after_fbk: 5>)
run_callback(current_step, feedback=None, hook=<HOOK.after_fbk: 5>)
set_scenario_env(env)
step
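
To illustrate how Step, Transition and Scenario fit together, here is a minimal sketch of a two-step scenario. The atom names ('init_msg', 'cmd_msg') and the disruptor 'tTYPE' are placeholders, and data_desc is assumed to accept the name of a registered atom.

    from framework.scenario import Scenario, Step, FinalStep, DataProcess

    # Step descriptions: a plain atom name, then a DataProcess.
    init = Step('init_msg', fbk_timeout=2)
    fuzz = Step(DataProcess(process=['tTYPE'], seed='cmd_msg'))
    end = FinalStep()

    # connect_to() creates the Transition objects; it also accepts
    # cbk_after_sending/cbk_after_fbk callbacks to guard the transition.
    init.connect_to(fuzz)
    fuzz.connect_to(end)

    sc = Scenario('example_scenario', anchor=init)
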

13.2.23. framework.dmhelpers.generic module

framework.dmhelpers.generic.COPY_VALUE(path, depth=None, vt=None, set_attrs=None, clear_attrs=None, after_encoding=True)

Return a generator that retrieves the value of another node and then returns a vt node with this value. The other node is selected:

  • either directly by following the provided relative path from the given generator-parameter node.
  • or indirectly (if depth is provided) where a base node is first selected automatically, based on our current index within our own parent node (or the nth-ancestor, depending on the parameter depth), and then the targeted node is selected by following the provided relative path from the base node.
Parameters:
  • path (str) – relative path to the node whose value will be picked.
  • depth (int) – depth of our nth-ancestor used as a reference to compute automatically the targeted base node position.
  • vt (type) – value type used for node generation (refer to framework.value_types).
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
  • after_encoding (bool) – if False, copy the raw value, otherwise the encoded one. Can be set to False only if node arguments support encoding.
framework.dmhelpers.generic.CRC(vt=<class 'framework.value_types.INT_str'>, poly=4374732215, init_crc=0, xor_out=4294967295, rev=True, set_attrs=None, clear_attrs=None, after_encoding=True, freezable=False, base=16, letter_case='upper', reverse_str=False)

Return a generator that returns the CRC (in the chosen type) of all the node parameters. (Default CRC is PKZIP CRC32)

Parameters:
  • vt (type) – value type used for node generation (refer to framework.value_types)
  • poly (int) – CRC polynomial
  • init_crc (int) – initial value used to start the CRC calculation.
  • xor_out (int) – final value to XOR with the calculated CRC value.
  • rev (bool) – bit reversed algorithm when True.
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
  • after_encoding (bool) – if False compute the CRC before any encoding. Can be set to False only if node arguments support encoding.
  • freezable (bool) – if False make the generator unfreezable in order to always provide the right value. (Note that tTYPE will still be able to corrupt the generator.)
  • base (int) – Relevant when vt is INT_str. Numerical base to use for string representation
  • letter_case (str) – Relevant when vt is INT_str. Letter case for string representation (‘upper’ or ‘lower’)
  • reverse_str (bool) – Reverse the order of the string if set to True.
framework.dmhelpers.generic.CYCLE(vals, depth=1, vt=<class 'framework.value_types.String'>, set_attrs=None, clear_attrs=None)

Return a generator that iterates over the provided value list and returns at each step a vt node corresponding to the current value.

Parameters:
  • vals (list) – the value list to iterate on.
  • depth (int) – depth of our nth-ancestor used as a reference to iterate. By default, it is the parent node. Thus, in this case, depending on the drawn quantity of parent nodes, the position within the grand-parent determines the index of the value to use in the provided list, modulo the quantity.
  • vt (type) – value type used for node generation (refer to framework.value_types).
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
framework.dmhelpers.generic.LEN(vt=<class 'framework.value_types.INT_str'>, base_len=0, set_attrs=None, clear_attrs=None, after_encoding=True, freezable=False)

Return a generator that returns the length of a node parameter.

Parameters:
  • vt (type) – value type used for node generation (refer to framework.value_types).
  • base_len (int) – this base length will be added to the computed length.
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
  • after_encoding (bool) – if False compute the length before any encoding. Can be set to False only if node arguments support encoding.
  • freezable (bool) – If False make the generator unfreezable in order to always provide the right value. (Note that tTYPE will still be able to corrupt the generator.)
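
As a hedged illustration, the sketch below uses LEN and CRC inside a node-descriptor dictionary, assuming the usual convention where a generator template is set as the 'contents' of a node and its node parameters are referenced through 'node_args'. The node names, values and value types are illustrative, not prescribed.

    from framework.dmhelpers.generic import LEN, CRC
    from framework.value_types import String, UINT8, UINT32_be

    # 'len' and 'crc32' are generator nodes whose values are computed from
    # the nodes referenced by 'node_args'.
    frame_desc = \
    {'name': 'frame',
     'contents': [
         {'name': 'len',
          'contents': LEN(vt=UINT8),
          'node_args': 'payload'},

         {'name': 'payload',
          'contents': String(values=['hello'])},

         {'name': 'crc32',
          'contents': CRC(vt=UINT32_be),
          'node_args': ['len', 'payload']}
     ]}
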
class framework.dmhelpers.generic.MH

Bases: object

Define constants and generator templates for data model description.

class Attr
Abs_Postpone = 6
DEBUG = 30
Determinist = 3
Finite = 4
Freezable = 1
LOCKED = 50
Mutable = 2
Separator = 15
__module__ = 'framework.dmhelpers.generic'
class Charset
ASCII = 1
ASCII_EXT = 2
UNICODE = 3
__module__ = 'framework.dmhelpers.generic'
Copy = 'u'
class Custo
class Func
CloneExtNodeArgs = 2
FrozenArgs = 1
__module__ = 'framework.dmhelpers.generic'
class Gen
CloneExtNodeArgs = 2
ForwardConfChange = 1
ResetOnUnfreeze = 3
TriggerLast = 4
__module__ = 'framework.dmhelpers.generic'
class NTerm
CollapsePadding = 3
FrozenCopy = 2
MutableClone = 1
__module__ = 'framework.dmhelpers.generic'
__module__ = 'framework.dmhelpers.generic'
FullyRandom = '=.'
Generator = 2
Leaf = 3
NonTerminal = 1
Ordered = '>'
Pick = '=+'
Random = '=..'
RawNode = 4
Regex = 5
ZeroCopy = 's'
__module__ = 'framework.dmhelpers.generic'
static _handle_attrs(n, set_attrs, clear_attrs)
static _validate_int_vt(vt)
static _validate_vt(vt)
framework.dmhelpers.generic.OFFSET(use_current_position=True, depth=1, vt=<class 'framework.value_types.INT_str'>, set_attrs=None, clear_attrs=None, after_encoding=True, freezable=False)

Return a generator that computes the offset of a child node within its parent node.

If use_current_position is True, the child node is selected automatically, based on our current index within our own parent node (or the nth-ancestor, depending on the parameter depth). Otherwise, the child node has to be provided in the node parameters just before its parent node.

Besides, if there are N node parameters, the first N-1 (or N-2 if use_current_position is False) nodes are used for adding a fixed amount (the length of their concatenated values) to the offset, which is determined from the node in the last position of the node parameters.

The generator returns the result wrapped in a vt node.

Parameters:
  • use_current_position (bool) – automate the computation of the child node position
  • depth (int) – depth of our nth-ancestor used as a reference to compute automatically the targeted child node position. Only relevant if use_current_position is True.
  • vt (type) – value type used for node generation (refer to framework.value_types).
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
  • after_encoding (bool) – if False compute the fixed amount part of the offset before any encoding. Can be set to False only if node arguments support encoding.
  • freezable (bool) – If False make the generator unfreezable in order to always provide the right value. (Note that tTYPE will still be able to corrupt the generator.)
framework.dmhelpers.generic.QTY(node_name, vt=<class 'framework.value_types.INT_str'>, set_attrs=None, clear_attrs=None, freezable=False)

Return a generator that returns the quantity of child node instances (referenced by name) of the node parameter provided to the generator.

Parameters:
  • vt (type) – value type used for node generation (refer to framework.value_types)
  • node_name (str) – name of the child node whose instance amount will be returned by the generator
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
  • freezable (bool) – If False make the generator unfreezable in order to always provide the right value. (Note that tTYPE will still be able to corrupt the generator.)
framework.dmhelpers.generic.TIMESTAMP(time_format='%H%M%S', utc=False, set_attrs=None, clear_attrs=None)

Return a generator that returns the current time (in a String node).

Parameters:
  • time_format (str) – time format to be used by the generator.
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
framework.dmhelpers.generic.WRAP(func, vt=<class 'framework.value_types.String'>, set_attrs=None, clear_attrs=None, after_encoding=True, freezable=False)

Return a generator that returns the result (in the chosen type) of the provided function applied on the concatenation of all the node parameters.

Parameters:
  • func (function) – function applied on the concatenation
  • vt (type) – value type used for node generation (refer to framework.value_types)
  • set_attrs (list) – attributes that will be set on the generated node.
  • clear_attrs (list) – attributes that will be cleared on the generated node.
  • after_encoding (bool) – if False, execute func on node arguments before any encoding. Can be set to False only if node arguments support encoding.
  • freezable (bool) – If False make the generator unfreezable in order to always provide the right value. (Note that tTYPE will still be able to corrupt the generator.)
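
For instance, WRAP can be used to model a digest field, as in the hedged sketch below (the node names referenced in 'node_args' are placeholders):

    import hashlib

    from framework.dmhelpers.generic import WRAP
    from framework.value_types import String

    # A node whose value is the MD5 digest of the concatenation of the
    # 'header' and 'payload' node parameters.
    md5_desc = \
    {'name': 'md5',
     'contents': WRAP(func=lambda x: hashlib.md5(x).digest(), vt=String),
     'node_args': ['header', 'payload']}
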

13.2.24. framework.dmhelpers.xml module

class framework.dmhelpers.xml.TAG_TYPE

Bases: enum.Enum

__module__ = 'framework.dmhelpers.xml'
comment = 2
proc_instr = 3
standard = 1
framework.dmhelpers.xml.tag_builder(tag_name, params=None, refs=None, contents=None, node_name=None, codec='latin-1', tag_name_mutable=True, struct_mutable=True, determinist=True, condition=None, absorb_regexp=None, specific_fuzzy_vals=None, tag_type=<TAG_TYPE.standard: 1>, nl_prefix=False, nl_suffix=False)

Helper for modeling an XML tag.

Parameters:
  • tag_name (str) – name of the XML tag.
  • params (dict) – optional attributes to be added in the XML tag
  • refs (dict) – if provided, it should map at least one parameter key (from the params dict) to the name to be used for the node representing the corresponding value. Useful when the condition parameter is in use and needs to refer to the value of specific parameters.
  • contents – can be either None (empty tag), a framework.data_model.Node, a dictionary (Node description), a string or a string list (string-Node values).
  • node_name (str) – name of the node to be created.
  • codec (str) – codec to be used for generating the XML tag.
  • tag_name_mutable (bool) – if False, the tag name will not be mutable, meaning that its Mutable attribute will be cleared.
  • struct_mutable (bool) – if False, the XML structure will not be mutable, meaning that each node related to the structure will have its Mutable attribute cleared.
  • determinist (bool) – if False, the attribute order could change from one retrieved data to another.
  • condition (tuple) – optional existence condition for the tag. If not None a keyword exists_if will be added to the root node with this parameter as a value.
  • absorb_regexp (str) – regex for contents absorption
  • tag_type (TAG_TYPE) – specify the type of notation
  • specific_fuzzy_vals (dict) – if provided, it should map at least one parameter key (from the params dict) to a list of specific values that will be used by some generic disruptors like tTYPE.
  • nl_prefix (bool) – add a new line character before the tag
  • nl_suffix (bool) – add a new line character after the tag
Returns:Node-description of the XML tag.
Return type:dict
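
A hedged usage sketch (the tag name, attribute names and values are illustrative):

    from framework.dmhelpers.xml import tag_builder, TAG_TYPE

    # Models <book id="..."> ... </book> with two candidate values for the
    # 'id' attribute and a simple string content.
    book_desc = tag_builder('book',
                            params={'id': ['1', '2']},
                            contents='A short description',
                            node_name='book_tag',
                            determinist=False,
                            tag_type=TAG_TYPE.standard)
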

framework.dmhelpers.xml.xml_decl_builder(determinist=True)

13.2.25. framework.evolutionary_helpers module

class framework.evolutionary_helpers.DefaultIndividual(fmk, node)

Bases: framework.evolutionary_helpers.Individual

Provide a default implementation of the Individual class

__init__(fmk, node)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.evolutionary_helpers'
mutate(nb)
class framework.evolutionary_helpers.DefaultPopulation(fmk, *args, **kwargs)

Bases: framework.evolutionary_helpers.Population

Provide a default implementation of the Population base class

__module__ = 'framework.evolutionary_helpers'
_compute_probability_of_survival()

Normalize fitness scores between 0 and 1

_compute_scores()

Compute the score of each individual

_crossover()

Compensate for the kills through the use of the tCOMB disruptor

_initialize(init_process, size=100, max_generation_nb=50)

Configure the population

Parameters:
  • init_process (string) – individuals that compose this population will be built using the provided process. The generic form for a process is: [action_1, (action_2, UI_2), ... action_n] where action_N can be either: dmaker_type_N or (dmaker_type_N, dmaker_name_N)
  • size (integer) – size of the population to manipulate
  • max_generation_nb (integer) – criteria used to stop the evolution process
_kill()

Simply rolls the dice

_mutate()

Operates three bit flips on each individual

evolve()

Describe the evolutionary process

is_final()

Check if the population can still evolve or not

reset()

Generate the first generation of individuals in a random way

class framework.evolutionary_helpers.EvolutionaryScenariosFactory

Bases: object

__module__ = 'framework.evolutionary_helpers'
static build(fmk, name, population_cls, args)

Create a scenario that takes advantage of an evolutionary approach.

Parameters:
  • fmk (FmkPlumbing) – reference to FmkPlumbing
  • name (string) – name of the scenario to create
  • population_cls (classobj) – population class to instantiate
  • args (dict of str: object) – arguments that will be used to instantiate a population

Returns:evolutionary scenario
Return type:Scenario
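
A hedged sketch of a call to build(), assuming the args dictionary maps onto the parameters of DefaultPopulation._initialize(); the data-maker names in init_process are placeholders and fmk stands for the running FmkPlumbing instance.

    from framework.evolutionary_helpers import (EvolutionaryScenariosFactory,
                                                DefaultPopulation)

    def register_evolutionary_scenario(fmk):
        # fmk: the running FmkPlumbing instance (e.g. available from a project file)
        sc = EvolutionaryScenariosFactory.build(
            fmk, 'evolutionary_example', DefaultPopulation,
            {'init_process': ['MY_GENERATOR', 'tTYPE'],
             'size': 20,
             'max_generation_nb': 10})
        return sc
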
class framework.evolutionary_helpers.Individual(fmk, node)

Bases: object

Represents a population member

__init__(fmk, node)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.evolutionary_helpers'
class framework.evolutionary_helpers.Population(fmk, *args, **kwargs)

Bases: object

Population to be used within an evolutionary scenario

__delitem__(key)
__getitem__(key)
__init__(fmk, *args, **kwargs)

x.__init__(…) initializes x; see help(type(x)) for signature

__iter__()
__len__()
__module__ = 'framework.evolutionary_helpers'
__next__()
__setitem__(key, value)
_initialize(*args, **kwargs)

Initialize the population. Only called once, during the creation of the Population instance.

evolve()

Describe the evolutionary process

is_final()

Check if the population can still evolve or not

next()
reset()

Reset the population. Called before each evolutionary process.

13.2.26. framework.knowledge.feedback_collector module

class framework.knowledge.feedback_collector.FeedbackCollector

Bases: object

__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__iter__()
__module__ = 'framework.knowledge.feedback_collector'
add_fbk_from(ref, fbk, status=0)
cleanup()
fbk_lock = <thread.lock object>
get_bytes()
get_error_code()
get_timestamp()
has_fbk_collector()
iter_and_cleanup_collector()
set_bytes(bstring)
set_error_code(err_code)
class framework.knowledge.feedback_collector.FeedbackSource(src, subref=None, reliability=None, related_tg=None)

Bases: object

__eq__(other)

x.__eq__(y) <==> x==y

__hash__() <==> hash(x)
__init__(src, subref=None, reliability=None, related_tg=None)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.knowledge.feedback_collector'
__str__() <==> str(x)
obj
related_tg

13.2.27. framework.knowledge.feedback_handler module

class framework.knowledge.feedback_handler.FeedbackHandler(new_window=False, new_window_title=None, xterm_prg_name='x-terminal-emulator')

Bases: object

A feedback handler extracts information from binary data.

__init__(new_window=False, new_window_title=None, xterm_prg_name='x-terminal-emulator')
Parameters:
  • new_window – If True, a new terminal emulator is created, enabling the decoder to use it for display via the methods print() and print_nl()
  • xterm_prg_name – name of the terminal emulator program to be started
__module__ = 'framework.knowledge.feedback_handler'
_start()
_stop()
collect_data(s)
estimate_last_data_impact_uniqueness()

* To be overloaded *

Estimate the similarity between the consequences triggered by the current data sending and those of previous sendings. The estimation can be computed from the provided feedback.

Returns:provide an estimation of impact similarity
Return type:SimilarityMeasure
extract_info_from_feedback(current_dm, source, timestamp, content, status)

* To be overloaded *

Returns:a set of information.Info items, or a single one
Return type:Info

flush_collector()
notify_data_sending(current_dm, data_list, timestamp, target)

* To be overloaded *

This function is called when data have been sent. It enables feedback to be processed relative to the previously sent data.

print(msg)
print_nl(msg)
process_feedback(current_dm, source, timestamp, content, status)
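
A minimal sketch of a custom handler overloading extract_info_from_feedback(); the byte-pattern check is purely illustrative.

    from framework.knowledge.feedback_handler import FeedbackHandler
    from framework.knowledge.information import OS

    class LinuxFbkHandler(FeedbackHandler):

        def extract_info_from_feedback(self, current_dm, source, timestamp,
                                       content, status):
            # Derive a piece of knowledge from the raw feedback content.
            if content and b'Linux' in content:
                return OS.Linux
            return None
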
class framework.knowledge.feedback_handler.SimilarityMeasure(level=0)

Bases: object

__add__(other)
__eq__(other)

x.__eq__(y) <==> x==y

__ge__(other)

x.__ge__(y) <==> x>=y

__gt__(other)

x.__gt__(y) <==> x>y

__init__(level=0)

x.__init__(…) initializes x; see help(type(x)) for signature

__le__(other)

x.__le__(y) <==> x<=y

__lt__(other)

x.__lt__(y) <==> x<y

__module__ = 'framework.knowledge.feedback_handler'
__ne__(other)

x.__ne__(y) <==> x!=y

value
class framework.knowledge.feedback_handler.TestFbkHandler(new_window=False, new_window_title=None, xterm_prg_name='x-terminal-emulator')

Bases: framework.knowledge.feedback_handler.FeedbackHandler

__module__ = 'framework.knowledge.feedback_handler'
extract_info_from_feedback(current_dm, source, timestamp, content, status)

* To be overloaded *

Returns:a set of information.Info items, or a single one
Return type:Info

13.2.28. framework.knowledge.information module

class framework.knowledge.information.Hardware(val)

Bases: framework.knowledge.information.Info

ARM = 11
PowerPc = 10
Unknown = 12
X86_32 = 9
X86_64 = 8
__module__ = 'framework.knowledge.information'
class framework.knowledge.information.Info(val)

Bases: enum.Enum

__init__(val)

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.knowledge.information'
decrease_trust(inc=1)
increase_trust(inc=1)
reset_trust()
trust_level
trust_value
class framework.knowledge.information.InformationCollector

Bases: object

__bool__()
__init__()

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'framework.knowledge.information'
__nonzero__()
__str__() <==> str(x)
add_information(info, initial_trust_value=0)
is_assumption_valid(info)
is_info_class_represented(info_class)
reset_information()
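
A hedged sketch of how the collector and Info trust levels are typically used, assuming add_information() accepts a single Info item:

    from framework.knowledge.information import InformationCollector, OS, Hardware

    collector = InformationCollector()
    collector.add_information(OS.Linux, initial_trust_value=1)
    collector.add_information(Hardware.X86_64)

    # Trust in a piece of information can evolve as more feedback is gathered.
    OS.Linux.increase_trust()

    if collector.is_info_class_represented(OS):
        print(collector)
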
class framework.knowledge.information.InputHandling(val)

Bases: framework.knowledge.information.Info

Ctrl_Char_Set = 16
Unknown = 17
__module__ = 'framework.knowledge.information'
class framework.knowledge.information.Language(val)

Bases: framework.knowledge.information.Info

C = 13
Pascal = 14
Unknown = 15
__module__ = 'framework.knowledge.information'
class framework.knowledge.information.OS(val)

Bases: framework.knowledge.information.Info

Android = 6
Linux = 4
Unknown = 7
Windows = 5
__module__ = 'framework.knowledge.information'
class framework.knowledge.information.TrustLevel

Bases: enum.Enum

Maximum = 1
Medium = 2
Minimum = 3
__module__ = 'framework.knowledge.information'
framework.knowledge.information.auto()