Commit 5ac10566, authored by Piotr Maślanka, committed by GitHub
Docs (#34)

fixed documentation
parent ec3ae411
Showing with 326 additions and 64 deletions
CallableGroup
-------------
.. autoclass:: satella.coding.concurrent.CallableGroup
:members:
LockedDataset
-------------
.. autoclass:: satella.coding.concurrent.LockedDataset
:members:
# debug module
The debug module is used during development. If Python's `__debug__` variable is set,
debug functions become operational.
If it's not (Python was launched with `-O`), they will do their best not to affect
performance, including removing themselves from the code.
## Type checking
```python
from satella.coding.debug import typed
@typed(int, int)
def add_two_numbers(a, b):
    return a + b
```
If you want to check for None-ness, you can pass None as well. The type for a particular
argument can also be a tuple or a list; in that case, matching any of the listed types
is fine.
If you don't want to check a particular argument at all, pass None as its type.
Conversely, to check that an argument is None, pass `(None, )`.
If the type check fails, a TypeError will be raised.
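Putting these rules together, here is a minimal sketch (the function and argument names are illustrative):

```python
from satella.coding.debug import typed

# a may be an int or a float, b is not checked at all,
# and c must be None - per the rules described above
@typed((int, float), None, (None, ))
def scale(a, b, c):
    return a * 2

scale(5, 'anything', None)       # OK
scale('oops', 'anything', None)  # raises TypeError
```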
......@@ -5,7 +5,7 @@ This essentially allows you to have a heap object that will pretty much
behave like the `heapq <https://docs.python.org/2/library/heapq.html>`_ library.
.. autoclass:: satella.coding.Heap
:members:
TimeBasedHeap
-------------
......@@ -14,7 +14,7 @@ Time-based heap is a good structure if you have many callbacks set to fire at a
time in the future. It functions much like a normal Heap.
.. autoclass:: satella.coding.TimeBasedHeap
:members:
typednamedtuple
---------------
......@@ -23,3 +23,9 @@ It's a named tuple, but it has typed fields. You will get a TypeError if you
try to assign something else there.
.. autofunction:: satella.coding.typednamedtuple
Singleton
---------
.. autofunction:: satella.coding.Singleton
......@@ -16,9 +16,9 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
# -- General configuration ------------------------------------------------
......
Configuration
=============
Satella provides rich functionality to:

1. Load data from particular sources_ (defined using JSON)
2. Validate_ that config data and standardize it as far as types are concerned (defined using JSON)
.. _sources: sources.html
.. _Validate: schema.html
You can craft them either out of Python objects at runtime, or load them using an elegant JSON-based schema.
Satella treats your config files as a huge dictionary, at the topmost level,
although you could possibly make them anything you want (including plain strings, although in this case
it would be a plain bother to use Satella for that).
Schema validation
=================
As noted in index_, your configuration is mostly supposed to be a dict. To validate your schema,
you should instantiate a Descriptor. A Descriptor reflects how your config is nested.
.. _index: index.html
.. autoclass:: satella.configuration.schema.Boolean
.. autoclass:: satella.configuration.schema.Float
.. autoclass:: satella.configuration.schema.Integer
.. autoclass:: satella.configuration.schema.String
.. autoclass:: satella.configuration.schema.IPv4
.. autoclass:: satella.configuration.schema.List
.. autoclass:: satella.configuration.schema.Dict
Then there is a descriptor that makes it possible for a value to have one of two types:
.. autoclass:: satella.configuration.schema.Union
You can use the following to declare your own descriptors:
.. autoclass:: satella.configuration.schema.Descriptor
:members:
.. autoclass:: satella.configuration.schema.Regexp
Just remember to decorate them with
.. autofunction:: satella.configuration.schema.register_custom_descriptor
if you want them loadable by the JSON-schema loader.
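For example, a custom regexp-based descriptor could look like this (a sketch; the ``hex_color`` name and pattern are illustrative):

::

    from satella.configuration.schema import Regexp, register_custom_descriptor

    @register_custom_descriptor('hex_color')
    class HexColor(Regexp):
        REGEXP = r'#[0-9a-fA-F]{6}'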
You use the descriptors by calling them on respective values, e.g.
::
>>> List(Integer())(['1', '2', 3.0])
[1, 2, 3]
JSON schema
-----------
The JSON schema is pretty straightforward. Assuming the top level is a dict, it contains keys. A key's name is the
name of the corresponding config key, and its value can take one of two forms: either a string, which is a short-hand
for a descriptor, or a dict containing the following values:
::
{
"type": "string_type",
"optional": True/False,
"default": "default_value" - providing this implies optional=True
}
Note that the short-hand (string) form cannot be used for descriptors that take required arguments.
Available string types are:
* **int** - Integer
* **str** - String
* **list** - List
* **dict** - Dict
* **ipv4** - IPv4
* **any** - Descriptor
* **bool** - Boolean
* **union** - Union
Lists are defined as follows
::
{
"type": "list",
"of": {
.. descriptor type that this list has to have ..
}
}
Unions are defined as follows
::
{
"type": "union",
"of": [
.. descriptor type 1 ..
.. descriptor type 2 ..
]
}
Dicts are simpler. Each key is a key that should be present in the dict, and its value is that key's descriptor
- again, either in the short form (if applicable) or the long one (a dict with a ``type`` key).
You load it using the following function:
.. autofunction:: satella.configuration.schema.descriptor_from_dict
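For example, a schema such as the one below can be turned into a descriptor and then called on a configuration dict (a sketch; the key names are illustrative):

::

    from satella.configuration.schema import descriptor_from_dict

    schema = {
        "port": "int",
        "host": {"type": "str", "default": "127.0.0.1"},
        "tags": {"type": "list", "of": {"type": "str"}}
    }

    descriptor = descriptor_from_dict(schema)
    config = descriptor({"port": "8080", "tags": ["a", "b"]})
    # config == {"port": 8080, "host": "127.0.0.1", "tags": ["a", "b"]}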
Sources
=======
At the core of your config files, there are Sources. A Source is a single source of configuration - it could be
an environment variable, or a particular file, or a directory full of these files.
.. autoclass:: satella.configuration.sources.StaticSource
:members:
.. autoclass:: satella.configuration.sources.EnvironmentSource
:members:
.. autoclass:: satella.configuration.sources.EnvVarsSource
:members:
.. autoclass:: satella.configuration.sources.FileSource
:members:
.. autoclass:: satella.configuration.sources.DirectorySource
:members:
Then there are abstract sources of configuration.
.. autoclass:: satella.configuration.sources.AlternativeSource
:members:
.. autoclass:: satella.configuration.sources.OptionalSource
:members:
.. autoclass:: satella.configuration.sources.MergingSource
:members:
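For instance, sources can be composed like this (a sketch; the FileSource path argument, the environment variable name and MergingSource accepting several sources are assumptions):

::

    from satella.configuration.sources import (
        FileSource, EnvVarsSource, OptionalSource, MergingSource)

    source = MergingSource(
        OptionalSource(FileSource('/etc/myapp/config.json')),   # the file may be absent
        OptionalSource(EnvVarsSource('MYAPP_CONFIG'))           # JSON kept in an env var
    )
    config = source.provide()    # every Source returns a dict from provide()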
JSON schema
-----------
The JSON schema consists of defining particular sources, embedded in one another.
::
{
"type": "ClassNameOfTheSource",
"args": [
],
"kwarg_1": ...,
"kwarg_2": ...,
}
If an argument is itself a dict with a ``type`` key, it will also be loaded and passed in as a source.
There are two reserved types. The first is ``lambda``, which expects a key ``operation``; its value will be appended to
``lambda x: `` and ``eval()``-uated.
The second reserved type is ``binary``. This will encode the ``value`` key with the ``encoding`` encoding (default is ascii).
You can always provide a key called ``optional`` with a value of True; this will wrap the given Source in an OptionalSource.
To instantiate the schema, use the following functions:
.. autofunction:: satella.configuration.sources.load_source_from_dict
.. autofunction:: satella.configuration.sources.load_source_from_list
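For example, a source tree could be loaded from such a dict like this (a sketch; the class names mirror the Python sources above, while the FileSource argument is an assumption):

::

    from satella.configuration.sources import load_source_from_dict

    source = load_source_from_dict({
        "type": "MergingSource",
        "args": [
            {"type": "EnvVarsSource", "env_name": "MYAPP_CONFIG", "optional": True},
            {"type": "FileSource", "args": ["/etc/myapp/config.json"], "optional": True}
        ]
    })
    config = source.provide()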
......@@ -5,13 +5,14 @@ Welcome to satella's documentation!
:maxdepth: 2
:caption: Contents:
configuration/index
configuration/schema
configuration/sources
coding/monitor
coding/debug
coding/typechecking
coding/structures
coding/concurrent
instrumentation/traceback
instrumentation/metrics
source/modules
posix
recipes
......
......@@ -22,14 +22,16 @@ DEBUG, which will cause more data to be registered. If a metric
is in state INHERIT, it will inherit the metric level from its
parent, traversing the tree if required.
You can switch the metric anytime by calling its ``switch_level``
method, or by specifying its metric level during a call to ``getMetric()``.
The call to ``getMetric()`` is specified as follows:

.. autofunction:: satella.instrumentation.metrics.getMetric

You obtain metrics using ``getMetric()`` as follows:

``metric = getMetric(__name__+'.StringMetric', 'string', RUNTIME, **kwargs)``
where the second argument is a metric type. The following metric types
are available:
......@@ -41,6 +43,8 @@ are available:
* cps - will count the number of calls to handle() during the last
  time period, as specified by the user
.. autoclass :: satella.instrumentation.metrics.metric_types.cps.ClicksPerTimeUnitMetric
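For example, a cps metric could be obtained and fed like this (a sketch; forwarding ``time_unit_vectors`` through ``getMetric()``'s keyword arguments is an assumption):

::

    metric = getMetric(__name__ + '.requests', 'cps', time_unit_vectors=[1, 60])
    metric.handle()    # registers a single call in both tracked time windows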
The third parameter is optional. If set, all child metrics created
during this metric's instantiation will receive that metric level.
If the metric already exists, its level will be set to the provided
......@@ -48,7 +52,7 @@ metric level, if passed.
All child metrics (going from the root metric to 0) will be initialized
with the value that you just passed. To keep them in order,
an additional parameter passed to ``getMetric()``, ``metric_level``, if
specified, will set the given level even when returning an already existing
metric.
......
......@@ -11,6 +11,9 @@ in case of an exception. It preserves:
It also allows you to pretty-print the exception. Traceback is picklable, so you
can safely do so and analyze the exception at your leisure.

Unpickling _Traceback_ objects in any environment is safe. However, obtaining
variable values via _load_value_ might not be.
Usage:
```python
from satella.instrumentation import Traceback

try:
    ...  # code that may raise
except Exception:
    tb = Traceback()  # construct while handling the exception; it reads sys.exc_info()
```
......@@ -28,5 +31,12 @@ except:
_Traceback_ should be created in the exception it is supposed to capture,
as it captures exception info from _sys.exc_info()_.
Alternatively, you can pass a `<frame>` object to Traceback, in order to serialize it, for example:
```python
import sys
frame_1 = next(iter(sys._current_frames().values()))
tb = Traceback(frame_1)
```
suicide
-------
Kill your process (and your process group)
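A minimal sketch, assuming `suicide` is importable from `satella.posix` and takes no required arguments:

```python
from satella.posix import suicide

suicide()  # kills the current process and its process group; does not return
```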
......
......@@ -9,6 +9,7 @@ __all__ = [
    'Descriptor',
    'Integer', 'Float', 'String', 'Boolean',
    'IPv4',
    'Regexp',
    'List', 'Dict', 'Union',
    'create_key',
    'must_be_type',
......@@ -16,6 +17,7 @@ __all__ = [
    'CheckerCondition',
    'ConfigDictValue',
    'descriptor_from_dict',
    'register_custom_descriptor',
]
ConfigDictValue = tp.Optional[tp.Union[int, float, str, dict, list, bool]]
......@@ -51,6 +53,9 @@ def must_be_one_of(*items):
class Descriptor(object):
    """
    Base class for a descriptor
    """
    BASIC_MAKER = staticmethod(lambda v: v)
    MY_EXCEPTIONS = [TypeError, ValueError]  # a list of Exception classes
    CHECKERS = []  # a list of CheckerCondition
......@@ -104,22 +109,40 @@ def _make_boolean(v: tp.Any) -> bool:
class Boolean(Descriptor):
    """
    This value must be a boolean, or be converted to one
    """
    BASIC_MAKER = _make_boolean


class Integer(Descriptor):
    """
    This value must be an integer, or be converted to one
    """
    BASIC_MAKER = int


class Float(Descriptor):
    """
    This value must be a float, or be converted to one
    """
    BASIC_MAKER = float


class String(Descriptor):
    """
    This value must be a string, or be converted to one
    """
    BASIC_MAKER = str
class Regexp(String):
    """
    Base class for declaring regexp-based descriptors. Overload its attribute REGEXP. Use as follows:

        class IPv6(Regexp):
            REGEXP = '(\A([0-9a-f]{1,4}:)' ...
    """
    REGEXP = r'.*'

    def __init__(self):
......@@ -138,10 +161,16 @@ class Regexp(String):
class IPv4(Regexp):
    """
    This must be a valid IPv4 address (no hostnames allowed)
    """
    REGEXP = r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}'


class List(Descriptor):
    """
    This must be a list, made of entries of a descriptor (this is optional)
    """
    CHECKERS = [must_be_type(list, tuple)]
    BASIC_MAKER = list
......@@ -168,6 +197,20 @@ def create_key(descriptor: Descriptor, name: str, optional: bool = False,
class Dict(Descriptor):
    """
    This entry must be a dict, having at least the specified keys.

    Use like:

        Dict([
            create_key(String(), 'key_s'),
            create_key(Integer(), 'key_i'),
            create_key(Float(), 'key_f'),
            create_key(String(), 'key_not_present', optional=True,
                       default='hello world'),
            create_key(IPv4(), 'ip_addr')
        ])
    """
    BASIC_MAKER = dict
    CHECKERS = [must_be_type(dict)]
......@@ -263,6 +306,24 @@ def _get_descriptor_for(key: str, value: tp.Any) -> Descriptor:
    raise ConfigurationSchemaError('invalid schema, unrecognized config object %s' % (value, ))


def register_custom_descriptor(name: str):
    """
    A decorator used for registering custom descriptors in order to be loadable via descriptor_from_dict

    Use like:

        @register_custom_descriptor('ipv6')
        class IPv6(Regexp):
            REGEXP = '(\A([0-9a-f]{1,4}:)' ...

    name -- name under which it is supposed to be invokable
    """
    def inner(cls):
        BASE_LOOKUP_TABLE[name] = cls
        return cls

    return inner
def descriptor_from_dict(dct: dict) -> Descriptor:
    """
    Given a Python dictionary-defined schema of the configuration, return a Descriptor-based one
......
......@@ -9,10 +9,10 @@ __all__ = [
class AlternativeSource(BaseSource):
    """
    If first source of configuration fails with ConfigurationError, use the next one instead, ad nauseam.
    """
    def __init__(self, *sources: BaseSource):
        self.sources = sources

    def provide(self) -> dict:
......@@ -31,17 +31,18 @@ class AlternativeSource(BaseSource):
class OptionalSource(AlternativeSource):
    """
    This will substitute for an empty dict if the underlying config would fail.

    Apply this to your sources if you expect that they will fail.

    Use as

        OptionalSource(SomeOtherSource1)
    """
    def __init__(self, source: BaseSource):
        super(OptionalSource, self).__init__(source, BaseSource())
......
......@@ -21,6 +21,10 @@ class EnvironmentSource(BaseSource):
"""
def __init__(self, env_name: str, config_name: tp.Optional[str] = None, cast_to=lambda v: v):
"""
env_name -- name of the environment variable to check for
config_name -- name of the env_name in the dictionary to return
"""
super(EnvironmentSource, self).__init__()
self.env_name = env_name
self.config_name = config_name or env_name
......@@ -34,6 +38,9 @@ class EnvironmentSource(BaseSource):
class EnvVarsSource(JSONSource):
    """
    Return a dictionary that is JSON-encoded within a particular environment variable
    """
    def __init__(self, env_name: str):
        super(EnvVarsSource, self).__init__('',
                                            encoding=sys.getfilesystemencoding())
......
......@@ -8,12 +8,12 @@ from satella.exceptions import ConfigurationError
"""
If a dict has a field "type" then it will be treated specially:
"binary" - it is a binary value of "value" to be encoded with "encoding" (default ascii)
"lambda" - it allows expressing the simplest filters there can be
name of a source class - it will be instantated with arguments "args".
rest keys will be kwargs.
* "binary" - it is a binary value of "value" to be encoded with "encoding" (default ascii)
* "lambda" - it allows expressing the simplest filters there can be
name of a source class - it will be instantated with arguments "args".
rest keys will be kwargs.
Special key is "optional" to be bool - if so, the source will be decorated as optional
Special key is "optional" to be bool - if so, the source will be decorated as optional
See the unit test for more in-depth knowledge
"""
......
......@@ -17,7 +17,8 @@ metrics_lock = threading.Lock()
def getMetric(metric_name: str, metric_type: str = 'base', metric_level: tp.Optional[str] = None, **kwargs):
    """
    Obtain a metric of given name.

    :param metric_name: must be a module name
    """
    metric_level_to_set_for_children = metric_level or INHERIT
    name = metric_name.split('.')
......
......@@ -5,13 +5,13 @@ import collections
class ClicksPerTimeUnitMetric(Metric):
    """
    This tracks the amount of calls to handle() during the last time periods, as specified by time_unit_vectors
    (in seconds). You may specify multiple time periods as consequent entries in the list.
    """
    CLASS_NAME = 'cps'

    def __init__(self, *args, time_unit_vectors: tp.Optional[tp.List[float]] = None, **kwargs):
        """
        :param time_unit_vectors: time units (in seconds) to count the clicks in between.
            Default - track a single value, amount of calls to .handle() in last second
        """
        super().__init__(*args, **kwargs)
        time_unit_vectors = time_unit_vectors or [1]
        self.last_clicks = collections.deque()
......