diff --git a/docs/.nojekyll b/docs/.nojekyll new file mode 100644 index 0000000..e69de29 diff --git a/docs/_generated/home.html b/docs/_generated/home.html new file mode 100644 index 0000000..a3275de --- /dev/null +++ b/docs/_generated/home.html @@ -0,0 +1,707 @@ + + + + + + + + + + + Home — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +
+

Home#

+

FLUXUS is a Python framework designed by BCG X to streamline the development of complex data processing pipelines (called flows). It enables users to quickly and efficiently build, test, and deploy data workflows, making complex operations more manageable.

+
+

Introducing Flows#

+

A flow in fluxus represents a Directed Acyclic Graph (DAG) where each node performs +a specific operation on the data. These nodes, called conduits, are the building +blocks of a flow, and the data elements that move through the flow are referred to as +products. The conduits are connected to ensure that products are processed and +transferred correctly from one stage to another.

+

Within a fluxus flow, there are three main types of conduits:

+
    +
  • Producers: These conduits generate or gather raw data from various sources such as +databases, APIs, or sensors. They are the entry points of the flow, feeding initial +products into the system.

  • +
  • Transformers: These conduits take the products from producers and transform +them. This can involve filtering, aggregating, enriching, or changing the data to fit +the required output or format.

  • +
  • Consumers: Consumers represent the endpoints of the flow. Each flow has exactly +one consumer, which handles the final processed products. The consumer may store the +data, display it in a user interface, or send it to another system.

  • +
+
+
+

A Simple Example#

+

Consider a simple flow that takes a greeting message, converts it to different cases +(uppercase, lowercase), and then annotates each message with the case change that +has been applied. The flow looks like this:

+"Hello World" flow diagram +

With fluxus, we can define this flow as follows:

+
from fluxus.functional import step, passthrough, run
+
+input_data = [
+    dict(greeting="Hello, World!"),
+    dict(greeting="Bonjour!"),
+]
+
+def lower(greeting: str) -> dict[str, str]:
+    # Convert the greeting to lowercase and keep track of the case change
+    return dict(
+        greeting=greeting.lower(),
+        case="lower",
+    )
+
+def upper(greeting: str) -> dict[str, str]:
+    # Convert the greeting to uppercase and keep track of the case change
+    return dict(
+        greeting=greeting.upper(),
+        case="upper",
+    )
+
+def annotate(greeting: str, case: str = "original") -> dict[str, str]:
+    # Annotate the greeting with the case change; default to "original"
+    return dict(greeting=f"{greeting!r} ({case})")
+
+flow = (
+    step("input", input_data)  # initial producer step
+    >> (  # 3 parallel steps: lower, upper, and passthrough
+        step("lower", lower)
+        & step("upper", upper)
+        & passthrough()  # passthrough the original input data
+    )
+    >> step("annotate", annotate) # annotate all outputs
+)
+
+# Draw the flow diagram
+flow.draw()
+
+
+

Note the passthrough() step in the flow. This step is a special type of conduit that +simply passes the input data along without modification. This is useful when you want to +run multiple transformations in parallel but still want to preserve the original data +for further processing.

+

You may have noted that the above code does not define a final consumer step. This is +because the run function automatically adds a consumer step to the end of the flow +to collect the final output. Custom consumers come into play when you start building +more customised flows using the object-oriented API instead of the simpler functional +API we are using here.
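As a rough sketch of what such a custom consumer might look like, the snippet below subclasses the Consumer base class from the object-oriented API (its abstract consume method is documented in the fluxus._consumer reference further down) and simply collects every product into a list. The CollectToList name is purely illustrative, and we assume Consumer is re-exported by the top-level fluxus package.

from collections.abc import Iterable

from fluxus import Consumer  # assumption: Consumer is exported at package level


class CollectToList(Consumer[dict[str, str], list[dict[str, str]]]):
    # Illustrative consumer that gathers all products of the flow into one list.

    def consume(
        self, products: Iterable[tuple[int, dict[str, str]]]
    ) -> list[dict[str, str]]:
        # Each item is a tuple (i, product), where i is the index of the
        # concurrent path that produced the product.
        return [product for _, product in products]

A consumer like this would terminate a flow built with the object-oriented API; the functional API used here attaches an equivalent collecting consumer for us.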

+

We run the flow with

+
result = run(flow)
+
+
+

This gives us the following output in result:

+
RunResult(
+    [
+        {
+            'input': {'greeting': 'Hello, World!'},
+            'lower': {'greeting': 'hello, world!', 'case': 'lower'},
+            'annotate': {'greeting': "'hello, world!' (lower)"}
+        },
+        {
+            'input': {'greeting': 'Bonjour!'},
+            'lower': {'greeting': 'bonjour!', 'case': 'lower'},
+            'annotate': {'greeting': "'bonjour!' (lower)"}
+        }
+    ],
+    [
+        {
+            'input': {'greeting': 'Hello, World!'},
+            'upper': {'greeting': 'HELLO, WORLD!', 'case': 'upper'},
+            'annotate': {'greeting': "'HELLO, WORLD!' (upper)"}
+        },
+        {
+            'input': {'greeting': 'Bonjour!'},
+            'upper': {'greeting': 'BONJOUR!', 'case': 'upper'},
+            'annotate': {'greeting': "'BONJOUR!' (upper)"}
+        }
+    ],
+    [
+        {
+            'input': {'greeting': 'Hello, World!'},
+            'annotate': {'greeting': "'Hello, World!' (original)"}
+        },
+        {
+            'input': {'greeting': 'Bonjour!'},
+            'annotate': {'greeting': "'Bonjour!' (original)"}
+        }
+    ]
+)
+
+
+

Here’s what happened: The flow starts with the input data items, which are then passed along three parallel paths. Each path applies a different transformation to the data. The flow then combines the results of these transformations into a single output, the RunResult.

+

Note that the result contains six outputs—one for each of the two input data items along +each of the three paths through the flow. Also note that the results are grouped as +separate lists for each path.

+

The run result not only gives us the final product of the annotate step but also the +inputs and intermediate products of the lower and upper steps. We refer to this +extended view of the flow results as the lineage of the flow.

+

For a more thorough introduction to FLUXUS, please visit our User Guide and +Examples!

+
+
+

Why fluxus?#

+

The complexity of data processing tasks demands tools that streamline operations and +ensure efficiency. fluxus addresses these needs by offering a structured approach to +creating flows that handle various data sources and processing requirements. Key +motivations for using fluxus include:

+
    +
  • Organisation and Structure: fluxus offers a clear, structured approach to data +processing, breaking down complex operations into manageable steps.

  • +
  • Maintainability: Its modular design allows individual components to be developed, +tested, and debugged independently, simplifying maintenance and updates.

  • +
  • Reusability: Components in fluxus can be reused across different projects, +reducing development time and effort.

  • +
  • Efficiency: By supporting concurrent processing, fluxus ensures optimal use of +system resources, speeding up data processing tasks.

  • +
  • Ease of Use: fluxus provides a functional API that abstracts away the +complexities of data processing, making it accessible to developers of all levels. +More experienced users can also leverage the advanced features of its underlying +object-oriented implementation for customisation and optimisation (see +Advanced Features for more details).

  • +
+
+
+

Concurrent Processing in fluxus#

+

A standout feature of fluxus is its support for concurrent processing, allowing +multiple operations to run simultaneously. This is essential for:

+
    +
  • Performance: Significantly reducing data processing time by executing multiple +data streams or tasks in parallel.

  • +
  • Resource Utilisation: Maximising the use of system resources by distributing the +processing load across multiple processes or threads.

  • +
+

fluxus leverages Python techniques such as threading and asynchronous programming to +achieve concurrent processing.
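As a hedged illustration of the asynchronous side, the sketch below subclasses AsyncTransformer (documented in the fluxus._transformer reference further down) and implements its abstract atransform method as an async generator. The SlowEcho name and the simulated delay are illustrative only, and we assume AsyncTransformer is re-exported by the top-level fluxus package.

import asyncio
from collections.abc import AsyncIterator

from fluxus import AsyncTransformer  # assumption: exported at package level


class SlowEcho(AsyncTransformer[str, str]):
    # Illustrative transformer whose work is I/O-bound; fluxus can interleave
    # many such transformations within a single event loop.

    async def atransform(self, source_product: str) -> AsyncIterator[str]:
        # A real implementation might await an HTTP request, a database query,
        # or an LLM call here; we simply simulate the wait.
        await asyncio.sleep(0.1)
        yield source_product

Because atransform is a coroutine-based iterator, many source products can be in flight at the same time without dedicating a thread to each product.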

+

By harnessing the capabilities of fluxus, developers can build efficient, scalable, +and maintainable data processing systems that meet the demands of contemporary +applications.

+
+
+
+

Getting started#

+ +
+

User Installation#

+

Install using pip:

+
pip install fluxus
+
+
+

or conda:

+
conda install -c bcgx fluxus
+
+
+
+

Optional dependencies#

+

To enable visualizations of flow diagrams, install GraphViz +and ensure it is in your system’s PATH variable:

+ +
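If you are unsure whether GraphViz is already available, a quick standard-library check such as the one below can help; it assumes that the GraphViz dot executable is what flow.draw() ultimately needs on the PATH to render diagrams.

import shutil

# Look for the GraphViz 'dot' layout engine on the PATH; flow diagrams cannot
# be rendered without it.
if shutil.which("dot") is None:
    print("GraphViz was not found on the PATH - please install it first.")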
+
+
+

Environment Setup#

+
+

Virtual environment#

+

We recommend working in a dedicated environment, e.g., using venv:

+
python -m venv fluxus
+source fluxus/bin/activate
+
+
+

or conda:

+
conda env create -f environment.yml
+conda activate fluxus
+
+
+
+
+
+

Contributing#

+

Contributions to fluxus are welcome and appreciated! Please see the Contributing section for information.

+
+
+

License#

+

This project is under the Apache License 2.0, allowing free use, modification, and distribution with added protections against patent litigation. +See the LICENSE file for more details or visit Apache 2.0.

+
+
+ + +
+ + + + + +
+ + + + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_generated/release_notes.html b/docs/_generated/release_notes.html new file mode 100644 index 0000000..4d3b17d --- /dev/null +++ b/docs/_generated/release_notes.html @@ -0,0 +1,468 @@ + + + + + + + + + + + Release Notes — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +
+

Release Notes#

+
+

fluxus 1.0#

+
+

fluxus 1.0.0#

+
    +
  • Initial release of fluxus.

  • +
+
+
+
+ + +
+ + + + + +
+ + + +
+ + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_images/flow-hello-world.svg b/docs/_images/flow-hello-world.svg new file mode 100644 index 0000000..cade8b8 --- /dev/null +++ b/docs/_images/flow-hello-world.svg @@ -0,0 +1,80 @@ + + + + + + +Flow + + + +Step_5224164640 + +lower + + + +Step_5224832352 + +annotate + + + +Step_5224164640->Step_5224832352 + + + + + +Step_5224930896 + +upper + + + +Step_5224930896->Step_5224832352 + + + + + +DictProducer_5224445520 + +input + + + +DictProducer_5224445520->Step_5224164640 + + + + + +DictProducer_5224445520->Step_5224930896 + + + + + +DictProducer_5224445520->Step_5224832352 + + + + + +_EndNode_5224990864 + + +|| + + + +Step_5224832352->_EndNode_5224990864 + + + + + diff --git a/docs/_modules/fluxus/_consumer.html b/docs/_modules/fluxus/_consumer.html new file mode 100644 index 0000000..ddf6e78 --- /dev/null +++ b/docs/_modules/fluxus/_consumer.html @@ -0,0 +1,604 @@ + + + + + + + + + + fluxus._consumer — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus._consumer

+"""
+Implementation of conduit base classes.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterable, Iterable
+from typing import Generic, TypeVar, final
+
+from pytools.asyncio import arun, iter_sync_to_async
+
+from .core import AtomicConduit, SerialProcessor
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "AsyncConsumer",
+    "Consumer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_SourceProduct_arg = TypeVar("T_SourceProduct_arg", contravariant=True)
+T_Output_ret = TypeVar("T_Output_ret", covariant=True)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +class Consumer( + AtomicConduit[T_Output_ret], + SerialProcessor[T_SourceProduct_arg, T_Output_ret], + Generic[T_SourceProduct_arg, T_Output_ret], + metaclass=ABCMeta, +): + """ + Consumes products from a producer or group of producers, and returns a single + object. + """ + +
+[docs] + @final + def process(self, input: Iterable[T_SourceProduct_arg]) -> T_Output_ret: + """ + Consume the given products. + + :param input: the products to consume + :return: the resulting object + """ + from .simple import SimpleProducer + + return ( + SimpleProducer[self.input_type](input) >> self # type: ignore[name-defined] + ).run()
+ + +
+[docs] + @final + async def aprocess(self, input: AsyncIterable[T_SourceProduct_arg]) -> T_Output_ret: + """ + Consume the given products asynchronously. + + :param input: the products to consume + :return: the resulting object + """ + from .simple import SimpleAsyncProducer + + return await ( + SimpleAsyncProducer[self.input_type](input) # type: ignore[name-defined] + >> self + ).arun()
+ + +
+[docs] + @abstractmethod + def consume( + self, products: Iterable[tuple[int, T_SourceProduct_arg]] + ) -> T_Output_ret: + """ + Consume products from a producer. + + :param products: an iterable of tuples (`i`, `product`) where `i` is the index + of the concurrent producer, and `product` is a product from that producer + :return: the resulting object + """
+ + +
+[docs] + async def aconsume( + self, products: AsyncIterable[tuple[int, T_SourceProduct_arg]] + ) -> T_Output_ret: + """ + Consume products from an asynchronous producer. + + By default, defers to the synchronous variant, :meth:`.consume`. + + :param products: an iterable of tuples (`i`, `product`) where `i` is the index + of the concurrent producer, and `product` is a product from that producer + :return: the resulting object + """ + return self.consume([products async for products in products])
+
+ + + +
+[docs] +class AsyncConsumer( + Consumer[T_SourceProduct_arg, T_Output_ret], + Generic[T_SourceProduct_arg, T_Output_ret], + metaclass=ABCMeta, +): + """ + A consumer designed for asynchronous I/O. + + Synchronous iteration is supported but discouraged, as it creates a new event loop + and blocks the current thread until the iteration is complete. It is preferable to + use asynchronous iteration instead. + """ + +
+[docs] + @final + def consume( + self, products: Iterable[tuple[int, T_SourceProduct_arg]] + ) -> T_Output_ret: + """ + Consume products from a producer. + + This method is implemented for compatibility with synchronous code, but + preferably, :meth:`aconsume` should be used instead and called from within + an event loop. + + When called from outside an event loop, this method will create an event loop + using :meth:`arun`, collect the source products from :meth:`aconsume` + and block the current thread until the iteration is complete. The resulting + object will then be returned. + + :param products: an iterable of tuples (`i`, `product`) where `i` is the index + of the concurrent producer, and `product` is a product from that producer + :return: the resulting object + """ + return arun(self.aconsume(iter_sync_to_async(products)))
+ + +
+[docs] + @abstractmethod + async def aconsume( + self, + products: AsyncIterable[tuple[int, T_SourceProduct_arg]], + ) -> T_Output_ret: + """ + Consume products from an asynchronous producer. + + :param products: an iterable of tuples (`i`, `product`) where `i` is the index + of the concurrent producer, and `product` is a product from that producer + :return: the resulting object + """
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/_flow.html b/docs/_modules/fluxus/_flow.html new file mode 100644 index 0000000..f47229c --- /dev/null +++ b/docs/_modules/fluxus/_flow.html @@ -0,0 +1,499 @@ + + + + + + + + + + fluxus._flow — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus._flow

+"""
+Implementation of conduit base classes.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from typing import Any, Generic, TypeVar
+
+from ._consumer import Consumer
+from .core import Conduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "Flow",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_Output_ret = TypeVar("T_Output_ret", covariant=True)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +class Flow(Conduit[T_Output_ret], Generic[T_Output_ret], metaclass=ABCMeta): + """ + A flow is a sequence of producers, transformers, and consumers that can be + executed to produce a result. + """ + + @property + @abstractmethod + def final_conduit(self) -> Consumer[Any, T_Output_ret]: + """ + The final conduit in the flow; this is the consumer that terminates the flow. + """ + +
+[docs] + @abstractmethod + def run(self) -> T_Output_ret: + """ + Run the flow. + + :return: the result of the flow + """
+ + +
+[docs] + @abstractmethod + async def arun(self) -> T_Output_ret: + """ + Run the flow asynchronously. + + :return: the result of the flow + """
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/_passthrough.html b/docs/_modules/fluxus/_passthrough.html new file mode 100644 index 0000000..3f357d4 --- /dev/null +++ b/docs/_modules/fluxus/_passthrough.html @@ -0,0 +1,514 @@ + + + + + + + + + + fluxus._passthrough — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus._passthrough

+"""
+Implementation of conduit and subconduit base classes
+"""
+
+from __future__ import annotations
+
+import logging
+from collections.abc import Collection, Iterator
+from typing import Any, Never, final
+
+from pytools.api import inheritdoc
+from pytools.meta import SingletonABCMeta
+
+# Special import from private submodule to avoid circular imports
+# noinspection PyProtectedMember
+from .core._conduit import SerialConduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "Passthrough",
+]
+
+
+#
+# Classes
+#
+
+
+
+[docs] +@final +@inheritdoc(match="[see superclass]") +class Passthrough(SerialConduit[Any], metaclass=SingletonABCMeta): + """ + Can be used in place of a transformer when the flow should pass through the input + without modification. + + May only be grouped concurrently with other transformers whose input type is a + subtype of the output type. + + Chaining a pass-through object with a transformer yields the original transformer. + """ + + @property + def is_chained(self) -> bool: + """ + ``False``, since a passthrough is not a composition of multiple conduits. + """ + return False + + @property + def final_conduit(self) -> Never: + """[see superclass]""" + raise NotImplementedError("Final conduit is not defined for passthroughs") + +
+[docs] + def get_final_conduits(self) -> Iterator[Never]: + """ + Returns an empty iterator since passthroughs do not define a final conduit. + + :return: an empty iterator + """ + # Passthrough conduits are transparent, so we return an empty iterator + yield from ()
+ + + @property + def _has_passthrough(self) -> bool: + """[see superclass]""" + return True + +
+[docs] + def get_connections(self, *, ingoing: Collection[SerialConduit[Any]]) -> Never: + """ + Fails with a :class:`NotImplementedError` since passthroughs are transparent in + flows and therefore connections are not defined. + + :param ingoing: the ingoing conduits (ignored) + :return: nothing; passthroughs do not define connections + :raises NotImplementedError: passthroughs do not define connections + """ + raise NotImplementedError("Connections are not defined for passthroughs")
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/_producer.html b/docs/_modules/fluxus/_producer.html new file mode 100644 index 0000000..8657402 --- /dev/null +++ b/docs/_modules/fluxus/_producer.html @@ -0,0 +1,540 @@ + + + + + + + + + + fluxus._producer — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus._producer

+"""
+Implementation of producers.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterator, Collection, Iterator
+from typing import Any, Generic, TypeVar, final
+
+from pytools.api import inheritdoc
+from pytools.asyncio import arun, iter_async_to_sync
+
+from .core import AtomicConduit, SerialConduit
+from .core.producer import SerialProducer
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "AsyncProducer",
+    "Producer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class Producer( + AtomicConduit[T_Product_ret], + SerialProducer[T_Product_ret], + Generic[T_Product_ret], + metaclass=ABCMeta, +): + """ + Generates objects of a specific type that may be retrieved locally or remotely, or + created dynamically. + + It can run synchronously or asynchronously. + """ + +
+[docs] + def get_connections( + self, *, ingoing: Collection[SerialConduit[Any]] + ) -> Iterator[tuple[SerialConduit[Any], SerialConduit[Any]]]: + """[see superclass]""" + if ingoing: + raise ValueError( + f"Producers cannot have ingoing connections, but got: {ingoing}" + ) + yield from ()
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class AsyncProducer(Producer[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta): + """ + A producer designed for asynchronous I/O. + + Synchronous iteration is supported but discouraged, as it creates a new event loop + and blocks the current thread until the iteration is complete. It is preferable to + use asynchronous iteration instead. + """ + +
+[docs] + @final + def iter(self) -> Iterator[T_Product_ret]: + """ + Generate new products, optionally using an existing producer as input. + + This method is implemented for compatibility with synchronous code, but + preferably, :meth:`.aiter` should be used instead and called from within an + event loop. + + When called from outside an event loop, this method will create an event loop + using :meth:`arun`, collect the products from :meth:`aiter` and block the + current thread until the iteration is complete. The products will then be + returned as a list. + + :return: the new products + :raises RuntimeError: if called from within an event loop + """ + + return arun(iter_async_to_sync(self.aiter()))
+ + +
+[docs] + @abstractmethod + def aiter(self) -> AsyncIterator[T_Product_ret]: + """[see superclass]"""
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/_transformer.html b/docs/_modules/fluxus/_transformer.html new file mode 100644 index 0000000..46b9c7b --- /dev/null +++ b/docs/_modules/fluxus/_transformer.html @@ -0,0 +1,563 @@ + + + + + + + + + + fluxus._transformer — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus._transformer

+"""
+Implementation of transformers.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterator, Iterator
+from typing import Any, Generic, TypeVar, final
+
+from pytools.asyncio import arun, iter_async_to_sync
+from pytools.typing import issubclass_generic
+
+from ._passthrough import Passthrough
+from .core import AtomicConduit
+from .core.transformer import SerialTransformer
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "AsyncTransformer",
+    "Transformer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_SourceProduct_arg = TypeVar("T_SourceProduct_arg", contravariant=True)
+T_Product_arg = TypeVar("T_Product_arg", contravariant=True)
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+T_TransformedProduct_ret = TypeVar("T_TransformedProduct_ret", covariant=True)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +class Transformer( + AtomicConduit[T_TransformedProduct_ret], + SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], + metaclass=ABCMeta, +): + """ + An atomic transformer that generates new products from the products of a producer. + """
+ + + +
+[docs] +class AsyncTransformer( + Transformer[T_SourceProduct_arg, T_Product_ret], + Generic[T_SourceProduct_arg, T_Product_ret], + metaclass=ABCMeta, +): + """ + A transformer designed for asynchronous I/O. + + Synchronous iteration is supported but discouraged, as it creates a new event loop + and blocks the current thread until the iteration is complete. It is preferable to + use asynchronous iteration instead. + """ + +
+[docs] + @final + def transform(self, source_product: T_SourceProduct_arg) -> Iterator[T_Product_ret]: + """ + Generate a new product, using an existing product as input. + + This method is implemented for compatibility with synchronous code, but + preferably, :meth:`.atransform` should be used instead and called from + within an event loop. + + When called from outside an event loop, this method will create an event loop + using :meth:`arun`, transform the product using :meth:`atransform` and + block the current thread until the iteration is complete. The new product will + then be returned. + + :param source_product: the existing product to use as input + :return: the new product + """ + return arun(iter_async_to_sync(self.atransform(source_product)))
+ + +
+[docs] + @abstractmethod + def atransform( + self, source_product: T_SourceProduct_arg + ) -> AsyncIterator[T_Product_ret]: + """ + Generate a new product asynchronously, using an existing product as input. + + :param source_product: the existing product to use as input + :return: the new product + """
+
+ + + +# +# Auxiliary functions +# + + +def _validate_concurrent_passthrough( + conduit: SerialTransformer[Any, Any] | Passthrough +) -> None: + """ + Validate that the given conduit is valid as a concurrent conduit with a passthrough. + + To be valid, its input type must be a subtype of its product type. + + :param conduit: the conduit to validate + """ + + if not ( + isinstance(conduit, Passthrough) + or issubclass_generic(conduit.input_type, conduit.product_type) + ): + raise TypeError( + "Conduit is not a valid concurrent conduit with a passthrough because its " + f"input type {conduit.input_type} is not a subtype of its product type " + f"{conduit.product_type}:\n{conduit}" + ) +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/_warning.html b/docs/_modules/fluxus/_warning.html new file mode 100644 index 0000000..93ebbfc --- /dev/null +++ b/docs/_modules/fluxus/_warning.html @@ -0,0 +1,443 @@ + + + + + + + + + + fluxus._warning — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus._warning

+"""
+A warning class specific to the `flow` package.
+"""
+
+__all__ = [
+    "FlowWarning",
+]
+
+
+
+[docs] +class FlowWarning(Warning): + """ + A warning specific to flows. + """
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/_base.html b/docs/_modules/fluxus/core/_base.html new file mode 100644 index 0000000..733535b --- /dev/null +++ b/docs/_modules/fluxus/core/_base.html @@ -0,0 +1,619 @@ + + + + + + + + + + fluxus.core._base — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core._base

+"""
+Implementation of `source` and `processor` base classes.
+
+- Producers and transformers are `sources` because they generate outputs.
+- Transformers and consumers are `processors` because they consume inputs.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterable, Collection, Iterable, Iterator
+from typing import Any, Generic, Self, TypeVar, cast
+
+from pytools.api import inheritdoc
+from pytools.typing import get_common_generic_base, issubclass_generic
+
+from ._conduit import Conduit, SerialConduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "SerialSource",
+    "SerialProcessor",
+    "Source",
+    "Processor",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_SourceProduct_arg = TypeVar("T_SourceProduct_arg", contravariant=True)
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+T_Output_ret = TypeVar("T_Output_ret", covariant=True)
+
+
+
+[docs] +class Source(Conduit[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta): + """ + A conduit that produces or transforms products. + + This is a base class for :class:`.Producer` and :class:`.Transformer`. + """ + + @property + def product_type(self) -> type[T_Product_ret]: + """ + The type of the products produced by this conduit. + """ + from .. import Passthrough + + return get_common_generic_base( + cast(SerialSource[T_Product_ret], source).product_type + for source in self.iter_concurrent_conduits() + if not isinstance(source, Passthrough) + )
+ + + +
+[docs] +class Processor( + Conduit[T_Output_ret], + Generic[T_SourceProduct_arg, T_Output_ret], + metaclass=ABCMeta, +): + """ + A transformer or consumer that attaches to a producer or transformer to process its + products. + + This is a base class for :class:`.Transformer` and :class:`.Consumer`. + """ + + @property + def input_type(self) -> type[T_SourceProduct_arg]: + """ + The type of the input processed by this conduit. + """ + return cast(type[T_SourceProduct_arg], self._get_type_arguments(Processor)[0]) + +
+[docs] + @abstractmethod + def process( + self, input: Iterable[T_SourceProduct_arg] + ) -> list[T_Output_ret] | T_Output_ret: + """ + Generate new products from the given input. + + :param input: the input products + :return: the generated output or outputs + """
+ + +
+[docs] + @abstractmethod + async def aprocess( + self, input: AsyncIterable[T_SourceProduct_arg] + ) -> list[T_Output_ret] | T_Output_ret: + """ + Generate new products asynchronously from the given input. + + :param input: the input products + :return: the generated output or outputs + """
+ + +
+[docs] + def is_valid_source( + self, + source: SerialConduit[T_SourceProduct_arg], + ) -> bool: + """ + Check if the given producer or transformer is a valid source for this conduit. + + The precursor is the final conduit of the source producer or group. + Returning ``False`` will cause a :class:`TypeError` to be raised. + + :param source: the source conduit to check + :return: ``True`` if the given conduit is valid source for this conduit, + ``False`` otherwise + """ + from .. import Passthrough + + if not isinstance(source, SerialSource): + return False + + ingoing_product_type = source.product_type + return all( + issubclass_generic( + ingoing_product_type, + cast(Self, processor).input_type, + ) + for processor in self.iter_concurrent_conduits() + if not isinstance(processor, Passthrough) + )
+
+ + + +
+[docs] +class SerialSource( + SerialConduit[T_Product_ret], + Source[T_Product_ret], + Generic[T_Product_ret], + metaclass=ABCMeta, +): + """ + A conduit that produces or transforms products. + """ + + @property + def product_type(self) -> type[T_Product_ret]: + """ + The type of the product produced by this conduit. + """ + return cast(type[T_Product_ret], self._get_type_arguments(SerialSource)[0])
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class SerialProcessor( + Processor[T_SourceProduct_arg, T_Product_ret], + SerialConduit[T_Product_ret], + Generic[T_SourceProduct_arg, T_Product_ret], + metaclass=ABCMeta, +): + """ + A processor that processes products sequentially. + """ + +
+[docs] + def get_connections( + self, *, ingoing: Collection[SerialConduit[Any]] + ) -> Iterator[tuple[SerialConduit[Any], SerialConduit[Any]]]: + """[see superclass]""" + for conduit in ingoing: + yield conduit, self
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/_concurrent.html b/docs/_modules/fluxus/core/_concurrent.html new file mode 100644 index 0000000..80f1be8 --- /dev/null +++ b/docs/_modules/fluxus/core/_concurrent.html @@ -0,0 +1,501 @@ + + + + + + + + + + fluxus.core._concurrent — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core._concurrent

+"""
+Implementation of ``ConcurrentConduit``.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta
+from typing import Any, Generic, Self, TypeVar, final
+
+from pytools.api import inheritdoc
+
+from ._conduit import Conduit
+
+log = logging.getLogger(__name__)
+
+
+__all__ = [
+    "ConcurrentConduit",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class ConcurrentConduit( + Conduit[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta +): + """ + A conduit made up of multiple concurrent conduits. + + This includes concurrent producers (see :class:`.ConcurrentProducer`) and concurrent + transformers (see :class:`.ConcurrentTransformer`). + """ + + @property + @final + def is_concurrent(self) -> bool: + """ + ``True``, since this is a group of concurrent conduits. + """ + return True + + @property + def is_chained(self) -> bool: + """[see superclass]""" + return any(conduit.is_chained for conduit in self.iter_concurrent_conduits()) + + @property + def final_conduit(self) -> Self: + """ + ``self``, since this is a group of concurrent conduits and has no final + conduit on a more granular level. + """ + return self
+ + + +# assign Any to suppress "unused import" warning +_ = Any +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/_conduit.html b/docs/_modules/fluxus/core/_conduit.html new file mode 100644 index 0000000..1185fbd --- /dev/null +++ b/docs/_modules/fluxus/core/_conduit.html @@ -0,0 +1,810 @@ + + + + + + + + + + fluxus.core._conduit — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core._conduit

+"""
+Implementation of conduit and subconduit base classes
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterator, Collection, Iterator, Mapping
+from typing import Any, Generic, Self, TypeVar, final
+
+from pytools.api import get_init_params, inheritdoc
+from pytools.expression import (
+    Expression,
+    HasExpressionRepr,
+    expression_from_init_params,
+)
+from pytools.expression.atomic import Id
+from pytools.typing import get_type_arguments
+
+from ..util import simplify_repr_attributes
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "AtomicConduit",
+    "Conduit",
+    "SerialConduit",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+T_Output_ret = TypeVar("T_Output_ret", covariant=True)
+
+#
+# Classes
+#
+
+
+
+[docs] +class Conduit(HasExpressionRepr, Generic[T_Output_ret], metaclass=ABCMeta): + """ + An element of a flow, which can be a producer, a transformer, a consumer, + or a sequential or concurrent composition of these. + """ + + @property + def name(self) -> str: + """ + The name of this conduit. + """ + return type(self).__name__ + + @property + @abstractmethod + def is_chained(self) -> bool: + """ + ``True`` if this conduit contains a composition of conduits chained together + sequentially, ``False`` otherwise. + """ + + @property + @abstractmethod + def is_concurrent(self) -> bool: + """ + ``True`` if this conduit is a group of concurrent conduits, ``False`` + otherwise. + """ + + @property + @final + def is_atomic(self) -> bool: + """ + ``True`` if this conduit is atomic, ``False`` if it is a chained or concurrent + composition of other conduits. + """ + return not (self.is_chained or self.is_concurrent) + + @property + @abstractmethod + def final_conduit(self) -> Conduit[T_Output_ret]: + """ + The final conduit if this conduit is a sequential composition of conduits; + ``self`` if this conduit is atomic or a group of concurrent conduits. + """ + +
+[docs] + @abstractmethod + def get_final_conduits(self) -> Iterator[SerialConduit[T_Output_ret]]: + """ + Get an iterator yielding the final atomic conduit or conduits of the (sub)flow + represented by this conduit. + + If this conduit is atomic, yields the conduit itself. + + If this conduit is a sequential composition of conduits, yields the final + conduit of that sequence. + + If this conduit is a group of concurrent conduits, yields the final conduits of + each of the concurrent conduits. + + :return: an iterator yielding the final conduits + """
+ + + @property + @abstractmethod + def n_concurrent_conduits(self) -> int: + """ + The number of concurrent conduits in this conduit. + """ + +
+[docs] + @abstractmethod + def iter_concurrent_conduits(self) -> Iterator[SerialConduit[T_Output_ret]]: + """ + Iterate over the concurrent conduits that make up this conduit. + + :return: an iterator over the concurrent conduits + """
+ + +
+[docs] + async def aiter_concurrent_conduits( + self, + ) -> AsyncIterator[SerialConduit[T_Output_ret]]: + """ + Asynchronously iterate over the concurrent conduits that make up this conduit. + + :return: an asynchronous iterator over the concurrent conduits + """ + for conduit in self.iter_concurrent_conduits(): + yield conduit
+ + +
+[docs] + def draw(self, style: str = "graph") -> None: + """ + Draw the flow. + + :param style: the style to use for drawing the flow, see :class:`.FlowDrawer` + for available styles (defaults to "graph") + """ + from ..viz import FlowDrawer + + FlowDrawer(style=style).draw(self, title="Flow")
+ + + @property + def _has_passthrough(self) -> bool: + """ + ``True`` if this conduit contains a passthrough, ``False`` otherwise. + """ + return False + +
+[docs] + @abstractmethod + def get_connections( + self, *, ingoing: Collection[SerialConduit[Any]] + ) -> Iterator[tuple[SerialConduit[Any], SerialConduit[Any]]]: + """ + Get the connections between conduits in this conduit. + + :param ingoing: the ingoing conduits, if any + :return: an iterator yielding connections between conduits + """
+ + + def _repr_svg_(self) -> str: # pragma: no cover + """ + Get the SVG representation of the flow. + + :return: the SVG representation + """ + # create a Bytes buffer to write the SVG to + from io import BytesIO + + svg = BytesIO() + + from ..viz import FlowDrawer, FlowGraphStyle + + FlowDrawer(style=FlowGraphStyle(file=svg, format="svg")).draw( + self, title="Flow" + ) + return svg.getvalue().decode("utf-8") + + def _repr_html_(self) -> str: # pragma: no cover + """[see superclass]""" + + try: + import graphviz # noqa F401 + except ImportError: + # Graphviz is not available, so we keep the default representation + return super()._repr_html_() + else: + # Graphviz is available, so we can add the SVG representation + return self._repr_svg_() + +
+[docs] + @abstractmethod + def to_expression(self, *, compact: bool = False) -> Expression: + """ + Make an expression representing this conduit. + + :param compact: if ``True``, use a compact representation using only the subset + of conduit attributes from :meth:`~.SerialConduit.get_repr_attributes`; + if ``False``, generate the full representation using all attributes + :return: the expression representing this conduit + """
+ + + def _get_type_arguments(self, base: type) -> tuple[type, ...]: + """ + Get the type arguments of this conduit with respect to the given base class. + + :param base: the base class to get the type arguments for + :return: the type arguments + :raises TypeError: if the type arguments are ambiguous due to multiple + inheritance + """ + args = get_type_arguments(self, base) + if len(args) > 1: + raise TypeError( + f"Ambiguous type arguments for {self.name} with respect to " + f"base class {base.__name__}: " + ", ".join(map(str, args)) + ) + return args[0] + + def __str__(self) -> str: + """[see superclass]""" + return str(self.to_expression(compact=True))
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class SerialConduit(Conduit[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta): + """ + A conduit that is either atomic, or a sequential composition of conduits. + """ + + @property + @final + def is_concurrent(self) -> bool: + """ + ``False``, since this is a serial conduit and therefore is not made up of + concurrent conduits. + """ + return False + + @property + def final_conduit(self) -> SerialConduit[T_Product_ret]: + """ + The final conduit if this conduit is a sequential composition of conduits; + ``self`` if this conduit is atomic. + """ + return next(self.get_final_conduits()) + +
+[docs] + @abstractmethod + def get_final_conduits(self) -> Iterator[SerialConduit[T_Product_ret]]: + """[see superclass]"""
+ + + @property + def n_concurrent_conduits(self) -> int: + """ + The number of concurrent conduits in this conduit. Returns `1`, since a serial + conduit is not made up of concurrent conduits. + """ + return 1 + +
+[docs] + def iter_concurrent_conduits(self) -> Iterator[Self]: + """ + Yields ``self``, since this is a serial conduit and is not made up of concurrent + conduits. + + :return: an iterator with ``self`` as the only element + """ + yield self
+ + +
+[docs] + async def aiter_concurrent_conduits(self: Self) -> AsyncIterator[Self]: + """ + Yields ``self``, since this is a serial conduit and is not made up of concurrent + conduits. + + :return: an asynchronous iterator with ``self`` as the only element + """ + yield self
+ + + @property + def chained_conduits(self) -> Iterator[SerialConduit[T_Product_ret]]: + """ + An iterator yielding the chained conduits that make up this conduit, starting + with the initial conduit and ending with the final conduit. + + For atomic conduit, yields the conduit itself. + """ + yield self.final_conduit + +
+[docs] + def get_repr_attributes(self) -> Mapping[str, Any]: + """ + Get attributes of this conduit to be used in representations. + + :return: a dictionary mapping attribute names to their values + """ + + return get_init_params(self, ignore_default=True, ignore_missing=True)
+ + +
+[docs] + def to_expression(self, *, compact: bool = False) -> Expression: + """[see superclass]""" + if compact: + return Id(self.name)(**simplify_repr_attributes(self.get_repr_attributes())) + else: + return expression_from_init_params(self)
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class AtomicConduit( + SerialConduit[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta +): + """ + An atomic conduit that is not a composition of other conduits. + """ + + @property + @final + def is_chained(self) -> bool: + """ + ``False``, since this is an atomic conduit and is not a composition of multiple + conduits. + """ + return False + + @property + @final + def final_conduit(self) -> Self: + """ + ``self``, since this is an atomic conduit and has no final conduit on a more + granular level. + """ + return self + +
+[docs] + @final + def get_final_conduits(self) -> Iterator[Self]: + """[see superclass]""" + yield self
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/producer/_producer_base.html b/docs/_modules/fluxus/core/producer/_producer_base.html new file mode 100644 index 0000000..c76a967 --- /dev/null +++ b/docs/_modules/fluxus/core/producer/_producer_base.html @@ -0,0 +1,656 @@ + + + + + + + + + + fluxus.core.producer._producer_base — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core.producer._producer_base

+"""
+Implementation of conduit base classes.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterator, Iterator
+from typing import Generic, TypeVar, cast, final
+
+from pytools.api import inheritdoc
+from pytools.asyncio import async_flatten
+from pytools.typing import get_common_generic_base
+
+from ..._consumer import Consumer
+from ..._flow import Flow
+from .. import ConcurrentConduit, SerialSource, Source
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "BaseProducer",
+    "ConcurrentProducer",
+    "SerialProducer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+T_Output_ret = TypeVar("T_Output_ret", covariant=True)
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class BaseProducer(Source[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta): + """ + A source that generates products from scratch – this is either a + :class:`.Producer` or a :class:`.ConcurrentProducer`. + """ + +
+[docs] + @abstractmethod + def iter(self) -> Iterator[T_Product_ret]: + """ + Generate new products. + + :return: the new products + """
+ + +
+[docs] + @abstractmethod + def aiter(self) -> AsyncIterator[T_Product_ret]: + """ + Generate new products asynchronously. + + :return: the new products + """
+ + +
+[docs] + @abstractmethod + def iter_concurrent_conduits(self) -> Iterator[SerialProducer[T_Product_ret]]: + """[see superclass]"""
+ + +
+[docs] + @abstractmethod + def aiter_concurrent_conduits(self) -> AsyncIterator[SerialProducer[T_Product_ret]]: + """[see superclass]"""
+ + + @final + def __iter__(self) -> Iterator[T_Product_ret]: + return self.iter() + + @final + def __aiter__(self) -> AsyncIterator[T_Product_ret]: + return self.aiter() + + def __and__( + self, other: BaseProducer[T_Product_ret] + ) -> ConcurrentProducer[T_Product_ret]: + + if isinstance(other, BaseProducer): + from . import SimpleConcurrentProducer + + # We determine the type hint at runtime, and use a type cast to + # indicate the type for static type checks + return cast( + ConcurrentProducer[T_Product_ret], + SimpleConcurrentProducer[ # type: ignore[misc] + get_common_generic_base((self.product_type, other.product_type)) + ](self, other), + ) + else: + return NotImplemented + + def __rshift__( + self, + other: Consumer[T_Product_ret, T_Output_ret], + ) -> Flow[T_Output_ret]: + if isinstance(other, Consumer): + # We import locally to avoid circular imports + from ._chained_ import _ProducerGroupFlow + + return _ProducerGroupFlow(producer=self, consumer=other) + else: + return NotImplemented
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class SerialProducer( + BaseProducer[T_Product_ret], + SerialSource[T_Product_ret], + Generic[T_Product_ret], + metaclass=ABCMeta, +): + """ + Generates objects of a specific type that may be retrieved locally or remotely, or + created dynamically. + + It can run synchronously or asynchronously. + """ + +
+[docs] + def iter_concurrent_conduits(self) -> Iterator[SerialProducer[T_Product_ret]]: + """[see superclass]""" + yield self
+ + +
+[docs] + async def aiter_concurrent_conduits( + self, + ) -> AsyncIterator[SerialProducer[T_Product_ret]]: + """[see superclass]""" + yield self
+ + +
+[docs] + async def aiter(self) -> AsyncIterator[T_Product_ret]: + """ + Generate new products asynchronously. + + By default, defers to the synchronous variant, :meth:`.iter`. + + :return: the new products + """ + for product in self.iter(): + yield product
+ + + def __rshift__( + self, + other: Consumer[T_Product_ret, T_Output_ret], + ) -> Flow[T_Output_ret]: + if isinstance(other, Consumer): + # We import locally to avoid circular imports + from ._chained_ import _ProducerFlow + + return _ProducerFlow(producer=self, consumer=other) + else: + return NotImplemented
+ + + +
+[docs] +class ConcurrentProducer( + ConcurrentConduit[T_Product_ret], + BaseProducer[T_Product_ret], + Generic[T_Product_ret], + metaclass=ABCMeta, +): + """ + A collection of one or more producers. + """ + +
+[docs] + def iter(self) -> Iterator[T_Product_ret]: + """ + Generate new products from all producers in this group. + + :return: an iterator of the new products + """ + for producer in self.iter_concurrent_conduits(): + yield from producer
+ + +
+[docs] + def aiter(self) -> AsyncIterator[T_Product_ret]: + """ + Generate new products from all producers in this group asynchronously. + + :return: an async iterator of the new products + """ + # create tasks for each producer - these need to be coroutines that materialize + # the producers + + # noinspection PyTypeChecker + return async_flatten( + producer.aiter() async for producer in self.aiter_concurrent_conduits() + )
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/producer/_simple.html b/docs/_modules/fluxus/core/producer/_simple.html new file mode 100644 index 0000000..6f47a26 --- /dev/null +++ b/docs/_modules/fluxus/core/producer/_simple.html @@ -0,0 +1,591 @@ + + + + + + + + + + fluxus.core.producer._simple — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core.producer._simple

+"""
+Implementation of unions.
+"""
+
+from __future__ import annotations
+
+import functools
+import itertools
+import logging
+import operator
+from collections.abc import AsyncIterator, Collection, Iterator
+from typing import Any, Generic, TypeVar, cast
+
+from pytools.api import inheritdoc, to_tuple
+from pytools.asyncio import async_flatten, iter_sync_to_async
+from pytools.expression import Expression
+
+from ... import Passthrough
+from .. import SerialConduit
+from ._producer_base import BaseProducer, ConcurrentProducer, SerialProducer
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "SimpleConcurrentProducer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+
+T_SourceProduct_ret = TypeVar("T_SourceProduct_ret", covariant=True)
+
+
+#
+# Constants
+#
+
+# The passthrough singleton instance.
+_PASSTHROUGH = Passthrough()
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class SimpleConcurrentProducer( + ConcurrentProducer[T_SourceProduct_ret], Generic[T_SourceProduct_ret] +): + """ + A simple group that manages a collection of producers. + """ + + #: The response sources this producer provides. + producers: tuple[BaseProducer[T_SourceProduct_ret], ...] + + def __init__( + self, + *producers: BaseProducer[T_SourceProduct_ret], + ) -> None: + """ + :param producers: the response producer(s) this producer uses to generate + response groups + """ + super().__init__() + + self.producers = to_tuple( + itertools.chain(*map(_flatten_concurrent_producers, producers)), + element_type=cast( + tuple[ + type[BaseProducer[T_SourceProduct_ret]], + ..., + ], + BaseProducer, + ), + arg_name="producers", + ) + + @property + def n_concurrent_conduits(self) -> int: + """[see superclass]""" + return sum(producer.n_concurrent_conduits for producer in self.producers) + +
+[docs] + def get_final_conduits(self) -> Iterator[SerialConduit[T_SourceProduct_ret]]: + """[see superclass]""" + for producer in self.producers: + yield from producer.get_final_conduits()
+ + +
+[docs] + def get_connections( + self, *, ingoing: Collection[SerialConduit[Any]] + ) -> Iterator[tuple[SerialConduit[Any], SerialConduit[Any]]]: + """[see superclass]""" + assert not ingoing, "Producer groups cannot have ingoing conduits" + for producer in self.producers: + yield from producer.get_connections(ingoing=ingoing)
+ + +
+[docs] + def iter_concurrent_conduits( + self, + ) -> Iterator[SerialProducer[T_SourceProduct_ret]]: + """[see superclass]""" + for prod in self.producers: + yield from prod.iter_concurrent_conduits()
+ + +
+[docs] + def aiter_concurrent_conduits( + self, + ) -> AsyncIterator[SerialProducer[T_SourceProduct_ret]]: + """[see superclass]""" + + # noinspection PyTypeChecker + return async_flatten( + prod.aiter_concurrent_conduits() + async for prod in iter_sync_to_async(self.producers) + )
+ + +
+[docs] + def to_expression(self, *, compact: bool = False) -> Expression: + """[see superclass]""" + return functools.reduce( + operator.and_, + (producer.to_expression(compact=compact) for producer in self.producers), + )
+
+ + + +# +# Auxiliary functions +# + + +def _flatten_concurrent_producers( + producer: BaseProducer[T_SourceProduct_ret], +) -> Iterator[BaseProducer[T_SourceProduct_ret]]: + """ + Iterate over the given producer or its sub-producers, if they are contained in a + (possibly nested) simple concurrent producer. + + :param producer: the producer to flatten + :return: an iterator over the given producer or its sub-producers + """ + if isinstance(producer, SimpleConcurrentProducer): + for producer in producer.producers: + yield from _flatten_concurrent_producers(producer) + else: + yield producer +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/transformer/_simple.html b/docs/_modules/fluxus/core/transformer/_simple.html new file mode 100644 index 0000000..8ae2447 --- /dev/null +++ b/docs/_modules/fluxus/core/transformer/_simple.html @@ -0,0 +1,614 @@ + + + + + + + + + + fluxus.core.transformer._simple — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core.transformer._simple

+"""
+Implementation of unions.
+"""
+
+from __future__ import annotations
+
+import functools
+import itertools
+import logging
+import operator
+from collections.abc import AsyncIterator, Collection, Iterator
+from typing import Any, Generic, TypeVar, cast
+
+from pytools.api import inheritdoc, to_tuple
+from pytools.asyncio import async_flatten, iter_sync_to_async
+from pytools.expression import Expression
+
+from ... import Passthrough
+from .. import SerialConduit
+from ._transformer_base import BaseTransformer, ConcurrentTransformer, SerialTransformer
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "SimpleConcurrentTransformer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+
+T_SourceProduct_arg = TypeVar("T_SourceProduct_arg", contravariant=True)
+T_TransformedProduct_ret = TypeVar("T_TransformedProduct_ret", covariant=True)
+
+
+#
+# Constants
+#
+
+# The passthrough singleton instance.
+_PASSTHROUGH = Passthrough()
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class SimpleConcurrentTransformer( + ConcurrentTransformer[T_SourceProduct_arg, T_TransformedProduct_ret], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], +): + """ + A collection of one or more transformers, operating in parallel. + """ + + #: The transformers in this group. + transformers: tuple[ + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough, + ..., + ] + + def __init__( + self, + *transformers: ( + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ), + ) -> None: + """ + :param transformers: the transformers in this group + """ + self.transformers = to_tuple( + itertools.chain(*map(_flatten_concurrent_transformers, transformers)), + element_type=cast( + tuple[ + type[ + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] + | Passthrough + ], + ..., + ], + (BaseTransformer, Passthrough), + ), + ) + + @property + def n_concurrent_conduits(self) -> int: + """[see superclass]""" + return sum( + transformer.n_concurrent_conduits for transformer in self.transformers + ) + +
+[docs] + def get_final_conduits(self) -> Iterator[SerialConduit[T_TransformedProduct_ret]]: + """[see superclass]""" + for transformer in self.transformers: + yield from transformer.get_final_conduits()
+ + + @property + def _has_passthrough(self) -> bool: + """[see superclass]""" + return any(transformer._has_passthrough for transformer in self.transformers) + +
+[docs] + def get_connections( + self, *, ingoing: Collection[SerialConduit[Any]] + ) -> Iterator[tuple[SerialConduit[Any], SerialConduit[Any]]]: + """[see superclass]""" + for transformer in self.transformers: + if transformer is not _PASSTHROUGH: + yield from transformer.get_connections(ingoing=ingoing)
+ + +
+[docs] + def iter_concurrent_conduits( + self, + ) -> Iterator[ + SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ]: + """[see superclass]""" + for transformer in self.transformers: + yield from transformer.iter_concurrent_conduits()
+ + +
+[docs] + def aiter_concurrent_conduits( + self, + ) -> AsyncIterator[ + SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ]: + """[see superclass]""" + # noinspection PyTypeChecker + return async_flatten( + transformer.aiter_concurrent_conduits() + async for transformer in iter_sync_to_async(self.transformers) + )
+ + +
+[docs] + def to_expression(self, *, compact: bool = False) -> Expression: + """[see superclass]""" + return functools.reduce( + operator.and_, + ( + transformer.to_expression(compact=compact) + for transformer in self.transformers + ), + )
+
+ + + +# +# Auxiliary functions +# + + +def _flatten_concurrent_transformers( + transformer: ( + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ) +) -> Iterator[ + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough +]: + """ + Iterate over the given transformer or its sub-transformers, if they are contained in + a (possibly nested) simple concurrent transformer. + + :param transformer: the transformer to flatten + :return: an iterator over the given transformer or its sub-transformers + """ + if isinstance(transformer, SimpleConcurrentTransformer): + for transformer in transformer.transformers: + yield from _flatten_concurrent_transformers(transformer) + else: + yield transformer +
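Similarly, a short illustrative sketch (not part of the module source): transformer steps combined with :func:`.parallel` — or with the ``&`` operator — are collected into a ``SimpleConcurrentTransformer``, with nested groups flattened by ``_flatten_concurrent_transformers`` above. Step names and functions are made up for illustration.

.. code-block:: python

    from fluxus.functional import parallel, passthrough, step

    branches = parallel(
        step("double", lambda x: dict(x=x * 2)),
        step("negate", lambda x: dict(x=-x)),
        passthrough(),  # also forward the unmodified input
    )

    # The result is a SimpleConcurrentTransformer with three concurrent
    # branches (two transformer steps plus the passthrough)
    print(type(branches).__name__)          # expected: SimpleConcurrentTransformer
    print(branches.n_concurrent_conduits)   # expected: 3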
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/core/transformer/_transformer_base.html b/docs/_modules/fluxus/core/transformer/_transformer_base.html new file mode 100644 index 0000000..9e8263c --- /dev/null +++ b/docs/_modules/fluxus/core/transformer/_transformer_base.html @@ -0,0 +1,873 @@ + + + + + + + + + + fluxus.core.transformer._transformer_base — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.core.transformer._transformer_base

+"""
+Implementation of transformer base classes.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterable, AsyncIterator, Iterable, Iterator
+from typing import Any, Generic, Self, TypeVar, final, overload
+
+from pytools.api import inheritdoc
+from pytools.asyncio import async_flatten
+from pytools.typing import (
+    get_common_generic_base,
+    get_common_generic_subclass,
+    issubclass_generic,
+)
+
+from ..._passthrough import Passthrough
+from .. import ConcurrentConduit, Processor, SerialProcessor, SerialSource, Source
+from ..producer import BaseProducer, SerialProducer
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "BaseTransformer",
+    "ConcurrentTransformer",
+    "SerialTransformer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+#
+
+T_SourceProduct_arg = TypeVar("T_SourceProduct_arg", contravariant=True)
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+T_TransformedProduct_ret = TypeVar("T_TransformedProduct_ret", covariant=True)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class BaseTransformer( + Processor[T_SourceProduct_arg, T_TransformedProduct_ret], + Source[T_TransformedProduct_ret], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], + metaclass=ABCMeta, +): + """ + A conduit that transforms products from a source – this is either a + :class:`.SerialTransformer` or a :class:`.ConcurrentTransformer`. + """ + +
+[docs] + @abstractmethod + def iter_concurrent_conduits( + self, + ) -> Iterator[ + SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ]: + """[see superclass]"""
+ + +
+[docs] + @abstractmethod + def aiter_concurrent_conduits( + self, + ) -> AsyncIterator[ + SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ]: + """[see superclass]"""
+ + +
+[docs] + @final + def process( + self, input: Iterable[T_SourceProduct_arg] + ) -> list[T_TransformedProduct_ret]: + """ + Transform the given products. + + :param input: the products to transform + :return: the transformed products + """ + from ...simple import SimpleProducer + + return list( + SimpleProducer[self.input_type](input) >> self # type: ignore[name-defined] + )
+ + +
+[docs] + @final + async def aprocess( + self, input: AsyncIterable[T_SourceProduct_arg] + ) -> list[T_TransformedProduct_ret]: + """ + Transform the given products asynchronously. + + :param input: the products to transform + :return: the transformed products + """ + from ...simple import SimpleAsyncProducer + + return [ + product + async for product in ( + SimpleAsyncProducer[self.input_type]( # type: ignore[name-defined] + input + ) + >> self + ) + ]
+ + + def __and__( + self, + other: ( + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] | Passthrough + ), + ) -> BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret]: + input_type: type[T_SourceProduct_arg] + product_type: type[T_TransformedProduct_ret] + + if isinstance(other, Passthrough): + for transformer in self.iter_concurrent_conduits(): + _validate_concurrent_passthrough(transformer) + input_type = self.input_type + product_type = self.product_type + elif not isinstance(other, BaseTransformer): + return NotImplemented + else: + input_type = get_common_generic_subclass( + (self.input_type, other.input_type) + ) + product_type = get_common_generic_base( + (self.product_type, other.product_type) + ) + from . import SimpleConcurrentTransformer + + return SimpleConcurrentTransformer[ + input_type, product_type # type: ignore[valid-type] + ](self, other) + + def __rand__( + self, other: Passthrough + ) -> BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret]: + if isinstance(other, Passthrough): + for transformer in self.iter_concurrent_conduits(): + _validate_concurrent_passthrough(transformer) + + from . import SimpleConcurrentTransformer + + return SimpleConcurrentTransformer[ + self.input_type, self.product_type # type: ignore[name-defined] + ](other, self) + else: + return NotImplemented + + @overload + def __rshift__( + self, + other: SerialTransformer[T_TransformedProduct_ret, T_Product_ret], + ) -> ( + BaseTransformer[T_SourceProduct_arg, T_Product_ret] + | SerialTransformer[T_SourceProduct_arg, T_Product_ret] + ): + pass # pragma: no cover + + @overload + def __rshift__( + self, + other: BaseTransformer[T_TransformedProduct_ret, T_Product_ret], + ) -> BaseTransformer[T_SourceProduct_arg, T_Product_ret]: + pass # pragma: no cover + + def __rshift__( + self, + other: ( + BaseTransformer[T_TransformedProduct_ret, T_Product_ret] + | SerialTransformer[T_TransformedProduct_ret, T_Product_ret] + ), + ) -> ( + BaseTransformer[T_SourceProduct_arg, T_Product_ret] + | SerialTransformer[T_SourceProduct_arg, T_Product_ret] + ): + if isinstance(other, BaseTransformer): + from ._chained_ import _ChainedConcurrentTransformer + + return _ChainedConcurrentTransformer(self, other) + else: + return NotImplemented + + @overload + def __rrshift__( + self, other: SerialProducer[T_SourceProduct_arg] + ) -> ( + SerialProducer[T_TransformedProduct_ret] + | BaseProducer[T_TransformedProduct_ret] + ): + pass + + @overload + def __rrshift__( + self, other: BaseProducer[T_SourceProduct_arg] + ) -> BaseProducer[T_TransformedProduct_ret]: + pass + + def __rrshift__( + self, + other: BaseProducer[T_SourceProduct_arg], + ) -> BaseProducer[T_TransformedProduct_ret] | Self: + if isinstance(other, SerialProducer): + from ._chained_ import _ChainedConcurrentTransformedProducer + + # noinspection PyTypeChecker + return _ChainedConcurrentTransformedProducer( + source=other, transformer_group=self + ) + elif isinstance(other, BaseProducer): + from ._chained_ import _ChainedConcurrentProducer + + return _ChainedConcurrentProducer(source=other, transformer=self) + else: + return NotImplemented
+ + + +
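The operator overloads above are the machinery behind flow composition: ``>>`` chains conduits sequentially, and ``&`` groups transformers (optionally with a passthrough) for concurrent execution. A minimal sketch using the functional API; the step names and functions are illustrative only.

.. code-block:: python

    from fluxus.functional import passthrough, step

    inc = step("inc", lambda x: dict(x=x + 1))
    dbl = step("dbl", lambda x: dict(x=x * 2))

    sequential = inc >> dbl                  # chained: the output of inc feeds dbl
    concurrent = inc & dbl & passthrough()   # concurrent group of three branches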
+[docs] +@inheritdoc(match="[see superclass]") +class SerialTransformer( + SerialProcessor[T_SourceProduct_arg, T_TransformedProduct_ret], + SerialSource[T_TransformedProduct_ret], + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], +): + """ + A transformer that generates new products from the products of a producer. + """ + +
+[docs] + @final + def iter_concurrent_conduits( + self, + ) -> Iterator[SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret]]: + """[see superclass]""" + yield self
+ + +
+[docs] + @final + async def aiter_concurrent_conduits( + self, + ) -> AsyncIterator[ + SerialTransformer[T_SourceProduct_arg, T_TransformedProduct_ret] + ]: + """[see superclass]""" + yield self
+ + +
+[docs] + @abstractmethod + def transform( + self, source_product: T_SourceProduct_arg + ) -> Iterator[T_TransformedProduct_ret]: + """ + Generate one or more new products from an existing product. + + :param source_product: an existing product to use as input + :return: an iterator of the new products + """
+ + +
+[docs] + async def atransform( + self, source_product: T_SourceProduct_arg + ) -> AsyncIterator[T_TransformedProduct_ret]: + """ + Generate one or more new products asynchronously, using an existing product as input. + + By default, defers to the synchronous variant, :meth:`transform`. + + :param source_product: the existing product to use as input + :return: an async iterator of the new products + """ + for tx in self.transform(source_product): + yield tx
+ + +
+[docs] + def iter( + self, source: Iterable[T_SourceProduct_arg] + ) -> Iterator[T_TransformedProduct_ret]: + """ + Generate new products, using the products of an existing producer as input. + + :param source: an iterable of source products to use as input + :return: the new products + """ + for product in source: + yield from self.transform(product)
+ + +
+[docs] + def aiter( + self, source: AsyncIterable[T_SourceProduct_arg] + ) -> AsyncIterator[T_TransformedProduct_ret]: + """ + Generate new products asynchronously, using the products of an existing producer as input. + + :param source: an async iterable of source products to use as input + :return: the new products + """ + # noinspection PyTypeChecker + return async_flatten(self.atransform(product) async for product in source)
+ + + @overload + def __rshift__( + self, + other: SerialTransformer[T_TransformedProduct_ret, T_Product_ret], + ) -> SerialTransformer[T_SourceProduct_arg, T_Product_ret]: + pass # pragma: no cover + + @overload + def __rshift__( + self, + other: BaseTransformer[T_TransformedProduct_ret, T_Product_ret], + ) -> BaseTransformer[T_SourceProduct_arg, T_Product_ret]: + pass # pragma: no cover + + def __rshift__( + self, + other: ( + BaseTransformer[T_TransformedProduct_ret, T_Product_ret] + | SerialTransformer[T_TransformedProduct_ret, T_Product_ret] + ), + ) -> ( + BaseTransformer[T_SourceProduct_arg, T_Product_ret] + | SerialTransformer[T_SourceProduct_arg, T_Product_ret] + ): + # Create a combined transformer where the output of this transformer is used as + # the input of the other transformer + if isinstance(other, SerialTransformer): + # We import locally to avoid circular imports + from ._chained_ import _ChainedTransformer + + return _ChainedTransformer(self, other) + + return super().__rshift__(other) + + @overload + def __rrshift__( + self, other: SerialProducer[T_SourceProduct_arg] + ) -> SerialProducer[T_TransformedProduct_ret]: + pass # pragma: no cover + + @overload + def __rrshift__( + self, other: BaseProducer[T_SourceProduct_arg] + ) -> BaseProducer[T_TransformedProduct_ret]: + pass # pragma: no cover + + def __rrshift__( + self, other: BaseProducer[T_SourceProduct_arg] + ) -> BaseProducer[T_TransformedProduct_ret]: + if isinstance(other, SerialProducer): + # We import locally to avoid circular imports + from ._chained_ import _ChainedProducer + + return _ChainedProducer(producer=other, transformer=self) + else: + return super().__rrshift__(other)
+ + + +# +# Auxiliary functions +# + + +def _validate_concurrent_passthrough( + conduit: SerialTransformer[Any, Any] | Passthrough +) -> None: + """ + Validate that the given conduit is valid as a concurrent conduit with a passthrough. + + To be valid, its input type must be a subtype of its product type. + + :param conduit: the conduit to validate + """ + + if not ( + isinstance(conduit, Passthrough) + or (issubclass_generic(conduit.input_type, conduit.product_type)) + ): + raise TypeError( + "Conduit is not a valid concurrent conduit with a passthrough because its " + f"input type {conduit.input_type} is not a subtype of its product type " + f"{conduit.product_type}:\n{conduit}" + ) + + +
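``_validate_concurrent_passthrough`` explains why not every transformer can be grouped with a passthrough: downstream conduits would otherwise receive a mix of transformed and untransformed products, so the transformer's input type must be a subtype of its product type. A brief sketch; the dict-based steps are illustrative, and ``str_to_int`` is a purely hypothetical transformer with input type ``str`` and product type ``int``.

.. code-block:: python

    from fluxus.functional import passthrough, step

    # A dict-in/dict-out step is compatible with a passthrough ...
    compatible = step("inc", lambda x: dict(x=x + 1)) & passthrough()

    # ... whereas a hypothetical transformer mapping str -> int is not;
    # grouping it with a passthrough raises a TypeError:
    #
    #     str_to_int & passthrough()   # TypeError: input type str is not a
    #                                  # subtype of product type int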
+[docs] +@inheritdoc(match="[see superclass]") +class ConcurrentTransformer( + BaseTransformer[T_SourceProduct_arg, T_TransformedProduct_ret], + ConcurrentConduit[T_TransformedProduct_ret], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], + metaclass=ABCMeta, +): + """ + A collection of one or more transformers, operating in parallel. + """ + + @property + def input_type(self) -> type[T_SourceProduct_arg]: + """[see superclass]""" + return get_common_generic_subclass( + transformer.input_type + for transformer in self.iter_concurrent_conduits() + if not isinstance(transformer, Passthrough) + )
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/functional/_functions.html b/docs/_modules/fluxus/functional/_functions.html new file mode 100644 index 0000000..7715db1 --- /dev/null +++ b/docs/_modules/fluxus/functional/_functions.html @@ -0,0 +1,1197 @@ + + + + + + + + + + fluxus.functional._functions — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.functional._functions

+"""
+Implementation of public functions of the functional API for the flow module.
+"""
+
+from __future__ import annotations
+
+import functools
+import itertools
+import logging
+import operator
+from collections.abc import (
+    AsyncIterable,
+    Awaitable,
+    Callable,
+    Collection,
+    Iterable,
+    Iterator,
+    Mapping,
+)
+from types import FunctionType
+from typing import Any, TypeVar, cast, overload
+
+from pytools.asyncio import arun
+
+from .. import Passthrough
+from ..core import Conduit
+from ..core.producer import BaseProducer, ConcurrentProducer, SimpleConcurrentProducer
+from ..core.transformer import (
+    BaseTransformer,
+    ConcurrentTransformer,
+    SerialTransformer,
+    SimpleConcurrentTransformer,
+)
+from ._result import RunResult
+from .conduit import DictConsumer, DictProducer, Step
+from .product import DictProduct
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "chain",
+    "parallel",
+    "passthrough",
+    "run",
+    "step",
+]
+
+#
+# Type variables
+#
+
+T_Input = TypeVar("T_Input")
+T_Product = TypeVar("T_Product")
+T_Conduit = TypeVar("T_Conduit", bound=Conduit[Any])
+
+#
+# 'step' function, defining an individual step in the flow
+#
+
+
+@overload
+def step(
+    _name: str,
+    _function: Callable[
+        [],
+        Mapping[str, Any]
+        | Iterable[dict[str, Any]]
+        | AsyncIterable[dict[str, Any]]
+        | Awaitable[Mapping[str, Any]],
+    ],
+    /,
+    **kwargs: Any,
+) -> DictProducer:
+    """[see below]"""
+
+
+@overload
+def step(  # type: ignore[misc]
+    _name: str,
+    # Technically, the function signature of _function is a superset of the previous
+    # one, but in Python there is no way of specifying that we need at least one
+    # arbitrary keyword argument.
+    # It still works as intended, because the previous overload is more specific.
+    # Still, for mypy we have to ignore the error; see the 'ignore' comment above.
+    _function: Callable[
+        ...,
+        Mapping[str, Any]
+        | Iterable[dict[str, Any]]
+        | AsyncIterable[dict[str, Any]]
+        | Awaitable[Mapping[str, Any]],
+    ],
+    /,
+    **kwargs: Any,
+) -> Step:
+    """[see below]"""
+
+
+@overload
+def step(
+    _name: str,
+    _data: (
+        Mapping[str, Any] | Iterable[Mapping[str, Any]] | AsyncIterable[dict[str, Any]]
+    ),
+    /,
+) -> DictProducer:
+    """[see below]"""
+
+
+
+[docs] +def step( + _name: str, + _function_or_data: ( + Callable[ + ..., + Mapping[str, Any] + | Iterable[dict[str, Any]] + | AsyncIterable[dict[str, Any]] + | Awaitable[Mapping[str, Any]], + ] + | Iterable[Mapping[str, Any]] + | AsyncIterable[dict[str, Any]] + | Mapping[str, Any] + ), + /, + **kwargs: Any, +) -> DictProducer | Step: + # noinspection GrazieInspection + """ + Create a step in a flow. + + Takes two positional-only arguments, followed by an arbitrary number of fixed + keyword arguments. + + The first positional argument is the name of the step, and the second positional + argument is either a function, or a mapping or iterable of mappings. + + If the second argument is a mapping or an iterable of mappings, then the step is + a `producer` step that produces the given data. + + If the second argument is a function, then that function must return a single + mapping, or a synchronous or asynchronous iterable of mappings. If the function + takes no arguments, then the step is a `producer` step that produces the data + returned by the function. If the function takes one or more keyword arguments, then + the step is a `transformer` step that applies the function to the input data and + produces one or more dictionaries based on the input data. + + All arguments to the function must be keyword arguments. The values passed for these + arguments are determined by the dictionary items produced by the preceding steps in + the flow that match the names of the arguments, plus any fixed keyword arguments + associated with any of the preceding steps or the current step. If multiple + preceding steps use the same names for fixed keyword arguments or for dictionary + items produced by their functions, then the value of the latest step in the flow + takes precedence for that attribute. + + Code examples: + + .. 
code-block:: python + + # Create a producer step that produces a single dictionary + producer = step( + "input", + dict(a=1, b=2, c=3), + ) + + # Create a producer step that produces multiple dictionaries + producer = step( + "input", + [ + dict(a=1, b=2, c=3), + dict(a=4, b=5, c=6), + ], + ) + + # Create a producer step that dynamically produces dictionaries + producer = step( + "input", + lambda: ( + dict(x=i) for i in range(3) + ) + ) + + # Create a transformer step that applies a function to the input + transform_step = step( + "transform_data", + lambda number, increment: dict(incremented_number=number + increment), + increment=1 + ) + + # Create a transformer step that produces multiple dictionaries for each input + transform_step = step( + "transform_data", + lambda number, increment: ( + dict(incremented_number=number + i) for i in range(max_increment) + ), + max_increment=1 + ) + + + :param _name: the name of the step + :param _function_or_data: the function that the step applies to the source product, + and returns a single mapping, or a synchronous or asynchronous iterable of + mappings + :param kwargs: additional keyword arguments to pass to the function + :return: the step object + :raises TypeError: if the signature of the function is invalid, or if additional + keyword arguments are provided while the function does not accept any + """ + + # Count the number of arguments that the function accepts + + if isinstance(_function_or_data, Mapping): + return DictProducer(lambda: _function_or_data, name=_name) + elif isinstance(_function_or_data, Iterable): + return DictProducer( + lambda: iter(cast(Iterable[Mapping[str, Any]], _function_or_data)), + name=_name, + ) + elif isinstance(_function_or_data, AsyncIterable): + return DictProducer( + lambda: aiter(cast(AsyncIterable[Mapping[str, Any]], _function_or_data)), + name=_name, + ) + elif ( + not isinstance(_function_or_data, FunctionType) + or _function_or_data.__code__.co_varnames + ): + return Step(_name, _function_or_data, **kwargs) + else: + return DictProducer(_function_or_data, name=_name)
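The signatures above also accept functions that return awaitables or async iterables, so a coroutine function can serve as a step directly. A small sketch under that assumption; the step names, data, and simulated delay are illustrative only.

.. code-block:: python

    import asyncio

    from fluxus.functional import run, step

    async def greet(name: str) -> dict[str, str]:
        # Simulate an asynchronous lookup or API call
        await asyncio.sleep(0.01)
        return dict(greeting=f"Hello, {name}!")

    flow = (
        step("names", [dict(name="Ada"), dict(name="Grace")])
        >> step("greet", greet)
    )
    result = run(flow)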
+ + + +# +# 'passthrough' function, defining a passthrough step in the flow +# + + +
+[docs] +def passthrough() -> Passthrough: + # noinspection GrazieInspection + """ + Create a `passthrough` step in a flow. + + A `passthrough` is a special step that passes the input through without + modification. It is invisible in the flow, and does not affect the data in any way. + + Use a passthrough as part of a :func:`parallel` composition of steps to ensure that, + in addition to applying all parallel steps concurrently, the input is also passed + without modification to the subsequent steps. + + Chaining a passthrough with another step is not permitted; it would have no effect. + + Examples: + + .. code-block:: python + + # Create a parallel composition of steps + parallel_steps = chain( + # Create a producer step that produces a single dictionary + step( + "input", + dict(a=1, b=2, c=3), + ), + # Create a group of parallel steps, including a passthrough + parallel( + # Create a transformer step that increments the value of 'a' + step( + "increment_a", + lambda a: dict(a=a + 1), + ), + # Create a passthrough step that passes the input through + passthrough(), + ), + # Create a transformer step that doubles the value of 'a' + step( + "double_a", + lambda a: dict(a=a * 2), + ), + ) + + # Run the parallel composition of steps + result = run(parallel_steps) + + The results will include the original input dictionary, as well as the dictionary + produced by the 'increment_a' step: + + .. code-block:: python + + [ + # First output, including the original input and the result of the + # 'increment_a' and 'double_a' steps + { + "input": dict(a=1, b=2, c=3), + "increment_a": dict(a=2), + "double_a": dict(a=4), + }, + # Second output, including the original input, bypassing the 'increment_a' + # step, including the result of the 'double_a' step + { + "input": dict(a=1, b=2, c=3), + "double_a": dict(a=2), + }, + ] + + :return: the passthrough step + """ + return Passthrough()
+ + + +# +# 'chain' function, defining a sequence of steps in the flow +# + + +@overload +def chain( + start: BaseProducer[T_Product], + /, + *steps: BaseTransformer[T_Product, T_Product], +) -> BaseProducer[T_Product]: + """[see below]""" + + +@overload +def chain( + start: BaseTransformer[T_Product, T_Product], + /, + *steps: BaseTransformer[T_Product, T_Product], +) -> BaseTransformer[T_Product, T_Product]: + """[see below]""" + + +
+[docs] +def chain( + start: BaseProducer[T_Product] | BaseTransformer[T_Product, T_Product], + /, + *steps: BaseTransformer[T_Product, T_Product], +) -> BaseProducer[T_Product] | BaseTransformer[T_Product, T_Product]: + """ + Combine multiple transformers with an optional leading producer to be executed + sequentially. + + Combining a producer with one or more transformers results in a chained producer. + Combining one or more transformers results in a chained transformer. + + Examples: + + .. code-block:: python + + chained_steps = chain( + # Start with an initial producer that generates the initial data + step("initial_producer", lambda: dict(value=1)), + # Apply a transformer that increments the value + step("increment_transformer", lambda value: dict(value=value + 1)), + # Apply a transformer that doubles the value + step("double_transformer", lambda value: dict(value=value * 2)), + # Apply a transformer that squares the value + step("square_transformer", lambda value: dict(value=value ** 2)) + ) + + chained_steps = chain( + # Start with an initial producer that generates the initial data + step("initial_producer", lambda: dict(value=1)), + # Chain three steps that increment the value by 1, 2, and 3; the '*' + # unpacks the generator into individual positional arguments + *( + step("increment_transformer", lambda value, + i=i: dict(value=value + i)) for i in range(1, 4) + ), + ) + + :param start: the initial producer or transformer in the chain + :param steps: additional transformers in the chain + :return: the chained producer or transformer + """ + return functools.reduce(operator.rshift, (start, *steps))
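Since ``chain`` simply folds the ``>>`` operator over its arguments, the two flows below are equivalent; the step names are illustrative only.

.. code-block:: python

    from fluxus.functional import chain, step

    a = step("inc", lambda x: dict(x=x + 1))
    b = step("dbl", lambda x: dict(x=x * 2))
    c = step("dec", lambda x: dict(x=x - 3))

    flow_1 = chain(a, b, c)
    flow_2 = a >> b >> c   # the same sequential flow, built with the operator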
+ + + +# +# 'parallel' function, defining a concurrent execution of steps in the flow +# + +SimpleIterable = Collection[T_Conduit] | Iterator[T_Conduit] + + +@overload +def parallel( + first: BaseProducer[T_Product] | SimpleIterable[BaseProducer[T_Product]], + second: BaseProducer[T_Product] | SimpleIterable[BaseProducer[T_Product]], + /, + *more: BaseProducer[T_Product] | SimpleIterable[BaseProducer[T_Product]], +) -> ConcurrentProducer[T_Product]: + """[see below]""" + + +@overload +def parallel( + inputs: SimpleIterable[BaseProducer[T_Product]], + /, +) -> ConcurrentProducer[T_Product] | BaseProducer[T_Product]: + """[see below]""" + + +@overload +def parallel( + first: ( + BaseTransformer[T_Input, T_Product] + | SimpleIterable[BaseTransformer[T_Input, T_Product] | Passthrough] + | Passthrough + ), + second: ( + BaseTransformer[T_Input, T_Product] + | SimpleIterable[BaseTransformer[T_Input, T_Product] | Passthrough] + | Passthrough + ), + /, + *more: ( + BaseTransformer[T_Input, T_Product] + | SimpleIterable[BaseTransformer[T_Input, T_Product] | Passthrough] + | Passthrough + ), +) -> ConcurrentTransformer[T_Input, T_Product]: + """[see below]""" + + +@overload +def parallel( + steps: SimpleIterable[BaseTransformer[T_Input, T_Product] | Passthrough], + /, +) -> ConcurrentTransformer[T_Input, T_Product] | BaseTransformer[T_Input, T_Product]: + """[see below]""" + + +
+[docs] +def parallel( + first: ( + BaseProducer[T_Product] + | SimpleIterable[BaseProducer[T_Product]] + | BaseTransformer[T_Input, T_Product] + | SimpleIterable[BaseTransformer[T_Input, T_Product] | Passthrough] + | Passthrough + ), + /, + *more: BaseProducer[T_Product] + | SimpleIterable[BaseProducer[T_Product]] + | BaseTransformer[T_Input, T_Product] + | SimpleIterable[BaseTransformer[T_Input, T_Product] | Passthrough] + | Passthrough, +) -> ( + ConcurrentProducer[T_Product] + | ConcurrentTransformer[T_Input, T_Product] + | BaseProducer[T_Product] + | BaseTransformer[T_Input, T_Product] +): + # noinspection GrazieInspection + """ + Combine two or more sub-flows, to be executed concurrently. + + All sub-flows must have compatible input and output types. + + If all arguments are producers, then the result is a `concurrent producer`. + + If all arguments are transformers, with or without an additional passthrough step, + then the result is a `concurrent transformer`. At most one passthrough step is + allowed in a group of parallel steps, since additional passthroughs would result in + duplicate output. + + Mixing producers and transformers is not allowed. + + Arguments can be provided as individual steps or sub-flows, or as iterables of steps + or sub-flows. If an iterable is provided, its elements are unpacked and combined + with the other arguments. + + Examples: + + .. code-block:: python + + concurrent_steps = parallel( + # Create two concurrent steps that increment 'a' and 'b' by 1 + step("increment_a", lambda a: dict(a=a + 1)), + step("increment_b", lambda b: dict(b=b + 1)), + ) + + concurrent_steps = parallel( + [ + # Create three concurrent steps that increment 'a' by 1, 2, and 3 + step("increment_a", lambda a: dict(a=a + i)) for i in range(1, 4) + ], + [ + # Create three concurrent steps that increment 'b' by 1, 2, and 3 + step("increment_b", lambda b: dict(b=b + i)) for i in range(1, 4) + ], + # Include a passthrough step that passes the input through unchanged + passthrough(), + ) + + :param first: the first step to combine + :param more: additional steps to combine + :return: the combined steps for concurrent execution + :raises TypeError: if no steps are provided + """ + + # Unpack iterable arguments + steps = cast( + list[ + BaseProducer[T_Product] | BaseTransformer[T_Input, T_Product] | Passthrough + ], + list( + itertools.chain.from_iterable( + ( + [step_] + # Check if the first argument is an iterable of steps. + # Producers are iterable, so we need to exclude them here. 
+ if isinstance(step_, (BaseProducer, BaseTransformer, Passthrough)) + or not isinstance(step_, Iterable) + else step_ + ) + for step_ in (first, *more) + ) + ), + ) + + # Check for invalid step types + invalid_steps = [ + step_ + for step_ in steps + if not isinstance(step_, (BaseProducer, BaseTransformer, Passthrough)) + ] + if invalid_steps: + raise TypeError( + "parallel() can only be used with producer, transformer, or passthrough " + "steps, but got: " + ", ".join(str(step_) for step_ in invalid_steps) + ) + + n_passthrough = sum(1 for step_ in steps if isinstance(step_, Passthrough)) + if n_passthrough > 1: + raise TypeError( + f"parallel() may include at most one passthrough step, but got " + f"{n_passthrough}" + ) + + elif not steps: + raise TypeError("parallel() must be called with at least one step") + + elif len(steps) == 1: + single_step = steps[0] + # We have a single step, so don't need to combine anything + if isinstance(single_step, Passthrough): + # We don't allow a single Passthrough step, because that might lead to + # chaining a Passthrough with another step, which is not allowed. + raise TypeError("parallel() cannot be used with a single passthrough step") + else: + return single_step + + elif isinstance(steps[0], BaseProducer): + # We have multiple producers. Combine them into a single concurrent producer. + return SimpleConcurrentProducer(*cast(list[BaseProducer[T_Product]], steps)) + + else: + # We have multiple transformers. Combine them into a single concurrent + # transformer. + return SimpleConcurrentTransformer( + *cast(list[BaseTransformer[T_Input, T_Product] | Passthrough], steps) + )
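For transformer steps, ``parallel`` builds the same ``SimpleConcurrentTransformer`` that the ``&`` operator builds, so the two groups below are equivalent; step names and functions are illustrative only.

.. code-block:: python

    from fluxus.functional import parallel, passthrough, step

    inc_a = step("increment_a", lambda a: dict(a=a + 1))
    inc_b = step("increment_b", lambda b: dict(b=b + 1))

    group_1 = parallel(inc_a, inc_b, passthrough())
    group_2 = inc_a & inc_b & passthrough()   # the same concurrent group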
+ + + +# +# 'run' function, running the steps in the flow +# + + +@overload +def run(steps: BaseProducer[DictProduct], *, timestamps: bool = False) -> RunResult: + """[see below]""" + + +# noinspection PyShadowingNames +@overload +def run( + steps: SerialTransformer[DictProduct, DictProduct], + *, + input: Iterable[Mapping[str, Any]] | Mapping[str, Any] | None = None, + timestamps: bool = False, +) -> RunResult: + """[see below]""" + + +@overload +def run( + steps: ConcurrentTransformer[DictProduct, DictProduct], + *, + input: Iterable[Mapping[str, Any]] | Mapping[str, Any] | None = None, + timestamps: bool = False, +) -> RunResult: + """[see below]""" + + +@overload +def run( + steps: BaseTransformer[DictProduct, DictProduct], + *, + input: Iterable[Mapping[str, Any]] | Mapping[str, Any] | None = None, + timestamps: bool = False, +) -> RunResult: + """[see below]""" + + +# noinspection PyShadowingNames, PyTestUnpassedFixture +
+[docs] +def run( + steps: BaseProducer[DictProduct] | BaseTransformer[DictProduct, DictProduct], + *, + input: Iterable[Mapping[str, Any]] | Mapping[str, Any] | None = None, + timestamps: bool = False, +) -> RunResult: + """ + Run the given steps. + + If the given steps do not include a leading :func:`input`, then the input + can be provided as an additional argument. If the flow requires an input but none + is provided, then the flow will be run with an empty dictionary as input. + + See :class:`.RunResult` for details on the output format. + + Examples: + + .. code-block:: python + + # Example 1: A sequence of steps + chained_steps = chain( + # Start with an initial input producer + step("initial_producer", lambda: dict(value=1)), + # Increment the value + step("increment_transformer", lambda value: dict(value=value + 1)), + # Double the value + step("double_transformer", lambda value: dict(value=value * 2)), + # Square the value + step("square_transformer", lambda value: dict(value=value ** 2)) + ) + + # Run the steps + result = run(chained_steps) + + generates the following output: + + .. code-block:: python + + RunResult( + [ + { + 'initial_producer': {'value': 1}, + 'increment_transformer': {'value': 2}, + 'double_transformer': {'value': 4}, + 'square_transformer': {'value': 16} + } + ] + ) + + and + + .. code-block:: python + + # Example 2: Parallel steps + parallel_steps = parallel( + # Create two concurrent steps that increment 'a' and 'b' by 1 + step("increment_a", lambda a: dict(a=a + 1)), + step("increment_b", lambda b: [dict(b=b + 1), dict(b=b + 2)]), + ) + + # Run the steps with input + result = run(parallel_steps, input={"a": 1, "b": 2}) + + generates the following output: + + .. code-block:: python + + RunResult( + [{'input': {'a': 1, 'b': 2}, 'increment_a': {'a': 2}}], + [ + {'input': {'a': 1, 'b': 2}, 'increment_b': {'b': 3}}, + {'input': {'a': 1, 'b': 2}, 'increment_b': {'b': 4}} + ] + ) + + Parallel and serial steps can be combined in a single flow: + + .. code-block:: python + + # Example 3: Running complex parallel steps + complex_parallel_steps = parallel( + [ + # Create three concurrent steps that increment 'a' by 1, 2, and 3 + step("increment_a", lambda a, i=i: dict(a=a+i)) for i in range(1, 4) + ], + [ + # Create three concurrent steps that increment 'b' by 1, 2, and 3 + step("increment_b", lambda b, i=i: dict(b=b+i)) for i in range(1, 4) + ], + # Include a passthrough step that passes the input through unchanged + passthrough() + ) + + # Run the steps with input + result = run(complex_parallel_steps, input=dict(a=1, b=2)) + + generates the following output: + + .. 
code-block:: python + + RunResult( + [{'input': {'a': 1, 'b': 2}, 'increment_a': {'a': 2}}], + [{'input': {'a': 1, 'b': 2}, 'increment_a': {'a': 3}}], + [{'input': {'a': 1, 'b': 2}, 'increment_a': {'a': 4}}], + [{'input': {'a': 1, 'b': 2}, 'increment_b': {'b': 3}}], + [{'input': {'a': 1, 'b': 2}, 'increment_b': {'b': 4}}], + [{'input': {'a': 1, 'b': 2}, 'increment_b': {'b': 5}}], + [{'input': {'a': 1, 'b': 2}}] + ) + + :param steps: the steps to run + :param input: one or more inputs to the steps (optional) + :param timestamps: if ``True``, include start and end timestamps for each step in + the output; if ``False``, do not include timestamps + :return: the dictionaries produced by the final step or steps + """ + + consumer = DictConsumer(timestamps=timestamps) + + if isinstance(steps, BaseProducer): + if input is not None: + raise TypeError( + "arg input cannot be specified when the given steps include a leading " + "producer step" + ) + return arun((steps >> consumer).arun()) + + elif isinstance(steps, BaseTransformer): + return arun( + ( + DictProducer(lambda: {} if input is None else input, name="input") + >> steps + >> consumer + ).arun() + ) + + else: + message = ( + f"arg steps must be a step or composition of steps, but got a " + f"{type(steps).__qualname__}" + ) + if ( + isinstance(steps, tuple) + and len(steps) == 1 + and isinstance(steps[0], Conduit) + ): + message += "; did you leave a comma before a closing parenthesis?" + raise TypeError(message)
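Passing ``timestamps=True`` records start and end times for every step, which in turn enables :meth:`.RunResult.draw_timeline`. A minimal sketch; the flow itself is illustrative only.

.. code-block:: python

    from fluxus.functional import chain, parallel, passthrough, run, step

    flow = chain(
        step("input", dict(a=1)),
        parallel(
            step("increment_a", lambda a: dict(a=a + 1)),
            passthrough(),
        ),
    )

    result = run(flow, timestamps=True)
    result.draw_timeline(style="text")   # render the execution timeline as text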
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/functional/_result.html b/docs/_modules/fluxus/functional/_result.html new file mode 100644 index 0000000..d7c9e13 --- /dev/null +++ b/docs/_modules/fluxus/functional/_result.html @@ -0,0 +1,810 @@ + + + + + + + + + + fluxus.functional._result — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.functional._result

+"""
+Implementation of ``RunResult``.
+"""
+
+from __future__ import annotations
+
+import logging
+from collections.abc import Iterable, Iterator, Mapping, Sequence
+from types import NoneType
+from typing import Any, Literal, TextIO, cast
+
+import pandas as pd
+
+from pytools.api import inheritdoc
+from pytools.expression import Expression, HasExpressionRepr
+from pytools.expression.atomic import Id
+
+from .product import DictProduct
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "RunResult",
+]
+
+
+
+[docs] +@inheritdoc(match="""[see superclass]""") +class RunResult(HasExpressionRepr): + # noinspection GrazieInspection + """ + The result of running a flow. + + A run result represents the output of a flow, as a list of dictionaries for each + distinct path through the flow. + + Each dictionary represents the output of the flow for a single input. The dictionary + is in the form of a nested dictionary, where the keys of the outer dictionary are + the names of the steps, and the values are attribute-value mappings produced by + each of the steps. + + For example: + + .. code-block:: python + + # Create a producer step that produces a single dictionary + producer = step( + "input", + dict(a=1, b=2, c=3), + ) + + # Define an increment function + def increment(a: int, by: int) -> dict[str, int]: + return dict(a=a + by) + + # Create transformers step that increment the value of 'a' + increment_by_1 = step( + "increment", + increment, + by=1, + ) + + increment_by_2 = step( + "increment", + increment, + by=2, + ) + + # Construct and run the flow + run(producer >> (increment_by_1 & increment_by_2)) + + The output will be: + + .. code-block:: python + + RunResult( + [{'input': {'a': 1, 'b': 2, 'c': 3}, 'increment': {'a': 2}}], + [{'input': {'a': 1, 'b': 2, 'c': 3}, 'increment': {'a': 3}}] + ) + + To access the results of running the flow, use the :meth:`get_outputs` method. + + To access the results grouped by path, use the :meth:`get_outputs_per_path` method. + + The run result can be converted to a data frame using the :meth:`to_frame` method. + """ + + _step_results: tuple[list[dict[str, dict[str, Any]]], ...] + + def __init__(self, *step_results: list[dict[str, dict[str, Any]]]) -> None: + """ + :param step_results: the results of running the flow, as a list of nested + dictionaries for each distinct path through the flow + """ + for step_result in step_results: + if not isinstance(step_result, list): + raise TypeError( + f"expected a list of dictionaries but got: {step_result!r}" + ) + for flow_output in step_result: + if not isinstance(flow_output, dict): + raise TypeError(f"expected a dictionary but got: {flow_output!r}") + for step_name, step_output in flow_output.items(): + if not isinstance(step_output, dict): + raise TypeError( + f"expected a dictionary for step {step_name!r} but got: " + f"{step_output!r}" + ) + self._step_results = step_results + +
+[docs] + def get_outputs(self) -> Iterator[Mapping[str, Mapping[str, Any]]]: + """ + Get the results of running the flow, for all inputs across all paths. + + :return: an iterator over all results + """ + for path in self._step_results: + yield from path
+ + +
+[docs] + def get_outputs_per_path( + self, + ) -> list[Iterator[Mapping[str, Mapping[str, Any]]]]: + """ + Get the results of running the flow, as a list of outputs for each path. + + Returns a nested list, where the outer list corresponds to the paths through the + flow, and the inner list corresponds to the outputs along that path for all + inputs. + + Note that the inner list may be empty if the path has no outputs, or may contain + multiple outputs for individual inputs. + + :return: the results of running the flow, as a list of output iterators for each + path + """ + return [iter(path) for path in self._step_results]
+ + +
+[docs] + def to_frame( + self, *, path: int | None = None, simplify: bool = False + ) -> pd.DataFrame: + """ + Convert this run result to a data frame. + + If the path is specified, the output will be a frame for that path only. + + :param path: the index of the path to convert to a frame; if not specified, the + output will be a frame for all paths + :param simplify: if ``True``, convert complex types to strings using + :func:`.simplify_complex_types`; if ``False``, leave objects with complex + types as they are (default: ``False``) + :return: the resulting data frame + """ + if path is not None: + if path < 0 or path >= len(self._step_results): + raise IndexError(f"path index out of bounds: {path}") + outputs_for_path = self._step_results[path] + if not outputs_for_path: + raise ValueError(f"path {path} has no outputs; cannot convert to frame") + return _dicts_to_frame(outputs_for_path, simplify=simplify or False) + else: + if not self._step_results: + raise TypeError("RunResult has no outputs; cannot convert to frame") + return _dicts_to_frame(self._step_results, simplify=simplify or False)
+ + +
+[docs] + def draw_timeline( + self, + *, + style: Literal["text", "matplot", "matplot_dark"] = "matplot", + out: TextIO | None = None, + ) -> None: + """ + Draw a timeline of the flow execution. + + :param style: the style to use for rendering the timeline + :param out: the output stream to write the timeline to, if style is ``"text"`` + (defaults to :obj:`sys.stdout`) + """ + + if not any( + DictProduct.KEY_START_TIME in step_output + and DictProduct.KEY_END_TIME in step_output + for path in self._step_results + for execution_output in path + for step_output in execution_output.values() + ): + raise ValueError( + "Results do not include timestamps; re-run the flow using " + "function 'run' with argument 'timestamps=True'" + ) + from ..viz import TimelineDrawer, TimelineTextStyle + + style_arg: TimelineTextStyle | str + + if out is not None: + text_style_name = TimelineTextStyle.get_default_style_name() + if style == text_style_name: + style_arg = TimelineTextStyle(out=out) + else: + raise ValueError( + f"arg out is only supported with arg style={text_style_name!r}" + ) + else: + permissible_style_names = TimelineDrawer.get_named_styles() + if style not in permissible_style_names: + raise ValueError( + "arg style must be one of: " + + ", ".join(map(repr, permissible_style_names)) + ) + style_arg = style + + TimelineDrawer(style=style_arg).draw(self, title="Timeline")
+ + +
+[docs] + def to_expression(self) -> Expression: + """[see superclass]""" + return Id(type(self))(*self._step_results)
+ + + def __eq__(self, other: Any) -> bool: + return ( + isinstance(other, RunResult) and self._step_results == other._step_results + )
+ + + +# +# Auxiliary functions +# + + +def _dicts_to_frame( + dicts: Sequence[Any], *, simplify: bool, max_levels: int | None = None +) -> pd.DataFrame: + """ + Convert a (possibly nested) iterable of dictionaries or mappings to a DataFrame, + using the dictionary keys as column names. + + Nested lists are converted to multi-level indices, where the first level holds the + outermost list indices and subsequent index levels represent the nested list + indices. + + Nested dictionaries are converted to multi-level columns, where the first level + holds the outermost dictionary keys and subsequent index levels represent the + nested dictionary keys. + + :param dicts: the dictionaries or mappings to convert, either as a single iterable + or as a nested iterable + :param simplify: if ``True``, convert complex types to strings using + :func:`.simplify_complex_types`; if ``False``, leave objects with complex types + as they are + :param max_levels: the maximum number of levels in the resulting multi-level index; + if ``None``, the number of levels is not limited (default: ``None``) + :return: the data frame + """ + + element_types = {type(d) for d in dicts} + if len(element_types) > 1: + raise TypeError("arg dicts must not contain mixed types of elements") + element_type = element_types.pop() + + if issubclass(element_type, Mapping): + return ( + pd.concat( + ( + _dict_to_series(d, simplify=simplify, max_levels=max_levels) + for d in dicts + ), + axis=1, + ignore_index=True, + ) + .T.convert_dtypes() + .rename_axis(index="item") + ) + elif issubclass(element_type, Sequence): + # We have a sequence of sequences + sub_frames = [ + # We iterate over the inner sequences and convert them to data + # frames, which are then concatenated along the column axis. + # We skip empty sequences. + _dicts_to_frame(d, simplify=simplify, max_levels=max_levels) + for d in dicts + if d # Skip empty sequences + ] + if not sub_frames: + raise ValueError("arg dicts must not contain only empty sequences") + return pd.concat( + cast( + Iterable[pd.DataFrame], + sub_frames, + ), + axis=0, + # We add an index level for the outer sequence + names=["group"], + ) + else: + raise TypeError( + "arg dicts must be a sequence or nested sequence of dictionaries" + ) + + +def _dict_to_series( + d: Mapping[str, Any], *, simplify: bool, max_levels: int | None = None +) -> pd.Series[Any]: + """ + Convert a dictionary to a Series, using the keys as index labels. + + Nested dictionaries are converted to multi-level indices, where the first level + holds the outermost dictionary keys and subsequent index levels represent the nested + dictionary keys. + + :param d: the dictionary or mapping to convert + :param simplify: if ``True``, convert complex types to strings using + :func:`.simplify_complex_types`; if ``False``, leave objects with complex types + as they are + :param max_levels: the maximum number of levels in the resulting multi-level index; + if ``None``, the number of levels is not limited (default: ``None``) + :return: the series + """ + + if max_levels is not None and max_levels <= 0: + raise ValueError( + f"arg max_levels must be a positive integer or None, but got: {max_levels}" + ) + + def _flatten( + sub_dict: Mapping[str, Any], + level: int, + parent_key: tuple[str, ...] | str | None = None, + ) -> Iterator[tuple[tuple[str, ...] | str, Any]]: + new_key: tuple[str, ...] 
| str + + for k, v in sub_dict.items(): + if not parent_key: + # The first level of the index is a single string + new_key = k + elif isinstance(parent_key, tuple): + # Subsequent levels are tuples of strings + new_key = (*parent_key, k) + else: + # The second level of the index is the first to be a tuple + new_key = (parent_key, k) + if isinstance(v, dict) and level != 1: + yield from _flatten(v, level - 1, new_key) + else: + # We are at the last level of the index and will stop flattening + yield new_key, _simplify_complex_types(v) if simplify else v + + sr: pd.Series[Any] = pd.Series(dict(_flatten(d, max_levels or 0))) + return sr + + +# +# Auxiliary functions +# + + +def _simplify_complex_types( + value: Any, +) -> bool | int | float | str | bytes | complex | None: + """ + Convert instances of complex types to strings. + + :param value: the value to convert + :return: the value, with complex types converted to strings + """ + if isinstance(value, (bool, int, float, str, bytes, complex, NoneType)): + return value + else: + return str(value) +
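As a usage sketch for the methods above (assuming ``result`` is a ``RunResult`` obtained from a previous call to :func:`.run`, and that the flow contained a step named ``"input"``):

.. code-block:: python

    # Iterate over all outputs, regardless of which path produced them
    for output in result.get_outputs():
        print(output["input"])

    # ... or collect everything into a data frame, converting complex
    # values to strings
    frame = result.to_frame(simplify=True)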
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/functional/conduit/_consumer.html b/docs/_modules/fluxus/functional/conduit/_consumer.html new file mode 100644 index 0000000..8949fed --- /dev/null +++ b/docs/_modules/fluxus/functional/conduit/_consumer.html @@ -0,0 +1,561 @@ + + + + + + + + + + fluxus.functional.conduit._consumer — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.functional.conduit._consumer

+"""
+Implementation of ``DictConsumer``.
+"""
+
+from __future__ import annotations
+
+import logging
+import time
+from collections import defaultdict
+from collections.abc import AsyncIterable
+from typing import Any, cast
+
+from pytools.api import inheritdoc
+from pytools.expression.repr import ListWithExpressionRepr
+
+from ... import AsyncConsumer
+from .._result import RunResult
+from ..product import DictProduct
+from ._conduit import DictConduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "DictConsumer",
+]
+
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="""[see superclass]""") +class DictConsumer(DictConduit, AsyncConsumer[DictProduct, RunResult]): + """ + A consumer of dictionary products. + + Combines all dictionaries into one or more lists of dictionaries. + + If the ingoing flow is sequential, the output will be a list of dictionaries. + + If the ingoing flow is concurrent, the output will be a list of lists of + dictionaries, where each list corresponds to one distinct path through the + flow. + """ + + #: If ``True``, include start and end timestamps for each step in the lineage + #: attributes; if ``False``, do not include timestamps. + timestamps: bool + + def __init__(self, *, name: str = "output", timestamps: bool = False) -> None: + """ + :param name: the name of this consumer + :param timestamps: if ``True``, include start and end timestamps for each step + in the lineage attributes; if ``False``, do not include timestamps + """ + super().__init__(name=name) + self.timestamps = timestamps + +
+[docs] + async def aconsume( + self, products: AsyncIterable[tuple[int, DictProduct]] + ) -> RunResult: + """[see superclass]""" + + lineage_by_group: dict[int, list[dict[str, dict[str, Any]]]] = defaultdict( + ListWithExpressionRepr + ) + + # Get the timestamps flag + timestamps = self.timestamps + + # Initialize summary statistics + run_start = time.perf_counter() + cumulative_time = 0.0 + + async for group, end_product in products: + # Get the lineage attributes for the end product, which is a dictionary + # of dictionaries. The outer dictionary maps product names to their + # attributes, and the inner dictionaries map attribute names to their + # values. + attributes = end_product.get_lineage_attributes() + + # Iterate over products and their attribute dicts + for product, product_attributes in zip( + cast(list[DictProduct], end_product.get_lineage()), attributes.values() + ): + + # Get the time stamps from the product + start_time = product.start_time + end_time = product.end_time + + # Update summary statistics + cumulative_time += end_time - start_time + + if timestamps: + # Add the time stamps to the attributes + product_attributes[DictProduct.KEY_START_TIME] = ( + start_time - run_start + ) + product_attributes[DictProduct.KEY_END_TIME] = end_time - run_start + + lineage_by_group[group].append(attributes) + + # Log a summary message + + run_duration = time.perf_counter() - run_start + summary_message = ( + f"Run took {run_duration :g} seconds, with {cumulative_time:g} seconds " + "of total wait time." + ) + if run_duration > 0: + speedup = cumulative_time / run_duration + summary_message += ( + f" Concurrent execution achieved a speedup factor of {speedup:g} " + f"over sequential execution." + ) + log.info(summary_message) + + # Return the lineage by group + n_groups = max(lineage_by_group.keys()) + 1 + + if n_groups == 1: + # If there is only one group, return the lineage for that group as a list + return RunResult(lineage_by_group[0]) + else: + # If there are multiple groups, return the lineage for each group in a + # separate list + return RunResult(*(lineage_by_group[group] for group in range(n_groups)))
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/functional/conduit/_producer.html b/docs/_modules/fluxus/functional/conduit/_producer.html new file mode 100644 index 0000000..71906ee --- /dev/null +++ b/docs/_modules/fluxus/functional/conduit/_producer.html @@ -0,0 +1,551 @@ + + + + + + + + + + fluxus.functional.conduit._producer — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.functional.conduit._producer

+"""
+Implementation of ``DictProducer``.
+"""
+
+from __future__ import annotations
+
+import logging
+import time
+from collections.abc import (
+    AsyncIterable,
+    AsyncIterator,
+    Awaitable,
+    Callable,
+    Iterable,
+    Mapping,
+)
+from typing import Any
+
+from pytools.api import inheritdoc
+from pytools.asyncio import iter_sync_to_async
+
+from ... import AsyncProducer
+from ..product import DictProduct
+from ._conduit import DictConduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "DictProducer",
+]
+
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class DictProducer(DictConduit, AsyncProducer[DictProduct]): + """ + A producer of dictionary products. + """ + + #: The dictionaries produced by this producer. + producer: Callable[ + [], + AsyncIterable[Mapping[str, Any]] + | Iterable[Mapping[str, Any]] + | Mapping[str, Any] + | Awaitable[Mapping[str, Any]], + ] + + def __init__( + self, + producer: Callable[ + [], + AsyncIterable[Mapping[str, Any]] + | Iterable[Mapping[str, Any]] + | Mapping[str, Any] + | Awaitable[Mapping[str, Any]], + ], + *, + name: str, + ) -> None: + """ + :param producer: a function that produces dictionaries + :param name: the name of this producer + """ + super().__init__(name=name) + self.producer = producer + +
+[docs] + async def aiter(self) -> AsyncIterator[DictProduct]: + """[see superclass]""" + products = self.producer() + + # Measure the start time of the step. We are interested in CPU time, not wall + # time, so we use time.perf_counter() instead of time.time(). + start_time = time.perf_counter() + + if isinstance(products, Mapping): + yield DictProduct( + name=self.name, + product_attributes=products, + start_time=start_time, + end_time=start_time, + ) + + elif isinstance(products, Awaitable): + attributes = await products + yield DictProduct( + name=self.name, + product_attributes=attributes, + start_time=start_time, + end_time=time.perf_counter(), + ) + + else: + if isinstance(products, Iterable): + products = iter_sync_to_async(products) # type: ignore[arg-type] + elif not isinstance(products, AsyncIterable): + raise TypeError( + f"Expected producer function to return a Mapping, an Iterable, or " + f"an AsyncIterable, but got: {products!r}" + ) + async for product in products: + # Measure the end time of the step. + end_time = time.perf_counter() + + yield DictProduct( + name=self.name, + product_attributes=product, + start_time=start_time, + end_time=end_time, + ) + + # Set the start time of the next iteration to the end time of this + # iteration. + start_time = end_time
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/functional/conduit/_step.html b/docs/_modules/fluxus/functional/conduit/_step.html new file mode 100644 index 0000000..89182e5 --- /dev/null +++ b/docs/_modules/fluxus/functional/conduit/_step.html @@ -0,0 +1,865 @@ + + + + + + + + + + fluxus.functional.conduit._step — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.functional.conduit._step

+"""
+Implementation of ``Step``.
+"""
+
+from __future__ import annotations
+
+import inspect
+import logging
+import time
+import typing
+from collections.abc import (
+    AsyncIterable,
+    AsyncIterator,
+    Awaitable,
+    Callable,
+    Iterable,
+    Mapping,
+)
+from typing import Any, cast
+
+from pytools.api import inheritdoc
+from pytools.asyncio import iter_sync_to_async
+from pytools.expression.atomic import Id
+from pytools.expression.composite import BinaryOperation, DictLiteral
+from pytools.expression.operator import BinaryOperator
+from pytools.typing import issubclass_generic
+
+from ... import AsyncTransformer
+from ..product import DictProduct
+from ._conduit import DictConduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "Step",
+]
+
+#
+# Constants
+#
+
+# A sentinel value that indicates that a required argument was not provided.
+_NOT_PROVIDED = object()
+
+# A sentinel value that indicates that an optional argument was not provided.
+_NOT_PROVIDED_OPTIONAL = object()
+
+_ARG_REQUIRED = True
+_ARG_OPTIONAL = False
+
+#
+# Step classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class Step(DictConduit, AsyncTransformer[DictProduct, DictProduct]): + """ + A step in a flow that applies a function to a dictionary. + + The ingoing dictionary is an attribute-value mapping. For each attribute that + matches an argument name of the function, the attribute value is passed to the + function. The function's attributes are therefore not allowed to be positional-only. + + Dictionary attributes that are not matched to an argument name of the function are + ignored. + + If the function allows arbitrary keyword arguments using ``**`` notation, all + attributes of the source product are passed to the function. + + The `Step` object may define additional fixed keyword arguments that are passed to + the function on each call. + + The function must return either + + - a single attribute-value mapping or dictionary + - an iterable of such dictionaries + - an async iterable of such dictionaries + + The name of the step and the names of the keyword arguments must be valid Python + identifiers. + + As an example, the following function could be used as a step: + + .. code-block:: python + + def add_one(x): + return dict(x=x + 1) + + The function ``add_one`` takes a single argument ``x`` and returns a dictionary + with the key ``x`` and the value ``x + 1``. The step could be defined as follows: + + .. code-block:: python + + step = Step("add_one", add_one) + + The step could then be combined with other steps using the :func:`.chain` and + :func:`.parallel` functions, or the ``>>`` and ``&`` operators (see the function + documentation for more information). + + The step, or a larger flow, can then be run with a given input, using function + :func:`.run`. For example: + + .. code-block:: python + + result = run(step, input=dict(x=1)) + """ + + #: The function that this step applies to the source product. + _function: Callable[ + ..., + Mapping[str, Any] + | Iterable[Mapping[str, Any]] + | AsyncIterable[Mapping[str, Any]] + | Awaitable[Mapping[str, Any]], + ] + + #: Additional keyword arguments to pass to the function. + kwargs: dict[str, Any] + + #: The names of the arguments that must or can be determined from the source + # product. + # Maps argument names to whether they are required (_ARG_REQUIRED) or optional + # (_ARG_OPTIONAL). + # None if arbitrary keyword arguments are allowed. 
+ _function_arguments: Mapping[str, bool] | None + + def __init__( + self, + _name: str, + _function: Callable[ + ..., + Mapping[str, Any] + | Iterable[Mapping[str, Any]] + | AsyncIterable[Mapping[str, Any]] + | Awaitable[Mapping[str, Any]], + ], + /, + **kwargs: Any, + ) -> None: + """ + :param _name: the name of the step + :param _function: the function that the step applies to the source product, and + returns a single dictionary, or a synchronous or asynchronous iterable of + dictionaries + :param kwargs: additional keyword arguments to pass to the function + :raises TypeError: if the signature of the function is invalid + :raises ValueError: if the name of the step or the names of the keyword + arguments are not valid identifiers + """ + super().__init__(name=_name) + + invalid_kwargs = [ + key for key in kwargs if not (isinstance(key, str) and key.isidentifier()) + ] + if invalid_kwargs: + raise ValueError( + "Names of keyword arguments must be valid identifiers, but got: " + + ", ".join(map(repr, invalid_kwargs)) + ) + + function_arguments = _validate_function( + step=_name, function=_function, kwargs=kwargs + ) + self._name = _name + self._function = _function + self.kwargs = kwargs + self._function_arguments = function_arguments + +
+[docs] + def get_repr_attributes(self) -> Mapping[str, Any]: + """[see superclass]""" + return {"name": self.name, **self.kwargs}
+ + + @property + def function(self) -> Callable[ + ..., + Mapping[str, Any] + | Iterable[Mapping[str, Any]] + | AsyncIterable[Mapping[str, Any]] + | Awaitable[Mapping[str, Any]], + ]: + """ + The function that this step applies to the source product. + """ + return self._function + +
+[docs] + async def atransform( + self, source_product: DictProduct + ) -> AsyncIterator[DictProduct]: + """ + Apply the function of this step to the dictionary managed by the source product. + + :param source_product: the source product containing the dictionary to be + passed to the function of this step + :return: an async iterator of the resulting product or products + """ + + kwargs = self.kwargs + + # Get the source's product's attributes that need to be passed to the function + # of this step. + source_product_attributes = source_product.attributes + + # Warn if the fixed keyword arguments of the step shadow attributes of the + # source. + shadowed_attributes = source_product_attributes.keys() & kwargs.keys() + if shadowed_attributes: + logging.warning( + f"Fixed keyword arguments of step {self.name!r} shadow attributes of " + f"the source product: " + + ", ".join( + f"{attr}={kwargs[attr]} shadows {attr}=" + f"{source_product_attributes[attr]}" + for attr in sorted(shadowed_attributes) + ) + ) + + # Input arguments are the union of the source product attributes and the fixed + # keyword arguments of the step; fixed keyword arguments take precedence over + # source product attributes. + input_args = { + **await self._get_source_product_args(source_product_attributes), + **kwargs, + } + + # Measure the start time of the step. We are interested in CPU time, not wall + # time, so we use time.perf_counter() instead of time.time(). + start_time = time.perf_counter() + + # Call the function of this step with the input arguments. This may return the + # actual result, an iterable of results, or an async iterable of results. + attribute_iterable = self._function(**input_args) + + if isinstance(attribute_iterable, Mapping): + attribute_iterable = iter_sync_to_async([attribute_iterable]) + elif isinstance(attribute_iterable, Awaitable): + attribute_iterable = _awaitable_to_async_iter(attribute_iterable) + elif isinstance(attribute_iterable, Iterable): + attribute_iterable = iter_sync_to_async( + cast(Iterable[Mapping[str, Any]], attribute_iterable) + ) + elif not isinstance(attribute_iterable, AsyncIterable): + raise TypeError( + f"Function {self._function.__name__}() of step {self.name!r} must " + f"return a dictionary, an iterable of dictionaries, or an async " + f"iterable of dictionaries, not {type(attribute_iterable)}" + ) + + async for attributes in attribute_iterable: + # Measure the end time of the step. + end_time = time.perf_counter() + + if not isinstance(attributes, Mapping): + raise TypeError( + f"Expected function {self._function.__name__}() of step " + f"{self.name!r} to return a Mapping or dict, but got: " + f"{attributes!r}" + ) + + log.debug( + f"Completed step {self.name!r} in {end_time - start_time:g} " + f"seconds:\n" + + str( + BinaryOperation( + BinaryOperator.ASSIGN, + Id(self._function)(**input_args), + DictLiteral(**attributes), + ) + ) + ) + + yield DictProduct( + name=self.name, + product_attributes=attributes, + precursor=source_product, + start_time=start_time, + end_time=end_time, + ) + + # Set the start time of the next iteration to the end time of this + # iteration. + start_time = end_time
+ + + async def _get_source_product_args( + self, source_product_attributes: Mapping[str, Any] + ) -> Mapping[str, Any]: + # Get the arguments to pass to the function of this step from the source + # product. + + function_argument_names: Mapping[str, bool] | None = self._function_arguments + if function_argument_names is None: + # The function accepts arbitrary keyword arguments, so we pass all + # attributes of the source product to the function. + source_product_args = source_product_attributes + else: + # We match the attributes of the source product to the free arguments of + # the function. + source_product_args = { + name: source_product_attributes.get( + name, + ( + _NOT_PROVIDED_OPTIONAL + if optional is _ARG_OPTIONAL + else _NOT_PROVIDED + ), + ) + for name, optional in function_argument_names.items() + } + + # Check if all free arguments could be matched + unmatched_arguments = [ + name + for name, value in source_product_args.items() + if value is _NOT_PROVIDED + ] + if unmatched_arguments: + raise ValueError( + f"Step {self.name!r} is missing input attributes: " + + ", ".join(unmatched_arguments) + ) + + # remove optional arguments that were not provided + source_product_args = { + name: value + for name, value in source_product_args.items() + if value is not _NOT_PROVIDED_OPTIONAL + } + return source_product_args
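The matching performed here mirrors ordinary keyword binding: product attributes are matched to parameter names, parameters with defaults may be left unmatched, and a missing required parameter raises a ``ValueError``. A rough, fluxus-free sketch of the same idea with a hypothetical step function:

.. code-block:: python

    def scale(value: float, factor: float = 1.0) -> dict[str, float]:
        # 'value' is required and must come from the source product;
        # 'factor' has a default and is therefore optional.
        return dict(value=value * factor)

    # Attributes of an incoming product (hypothetical)
    attributes = {"value": 3.0, "unit": "m"}

    # 'value' is matched by name, 'unit' is ignored, and 'factor' falls back to
    # its default -- the same matching Step performs before calling the function.
    matched = {k: v for k, v in attributes.items() if k in ("value", "factor")}
    print(scale(**matched))  # {'value': 3.0}

A fixed keyword argument supplied to the step (for example ``factor=2.0``) would take precedence over a source attribute of the same name, as implemented in :meth:`.atransform` above.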
+ + + +# +# Auxiliary functions +# + + +def _validate_function( + *, step: str, function: Callable[..., Any], kwargs: Mapping[str, Any] +) -> Mapping[str, bool] | None: + # Validate the N of the step function, if defined, and determine + # whether the return type is an iterator or an async iterator. + # + # Returns the names of arguments that must or can be determined from the + # source product, or None if arbitrary keyword arguments are allowed. + + # Get the function signature + signature = inspect.signature(function) + + # The free arguments that need to be (or can be) determined from the source product. + # Maps argument names to whether they are required (_ARG_REQUIRED) or optional + # (_ARG_OPTIONAL). + function_arguments: Mapping[str, bool] | None + + # Validate the parameters of the step function + parameters: list[inspect.Parameter] = list(signature.parameters.values()) + if any(parameter.kind == parameter.POSITIONAL_ONLY for parameter in parameters): + raise TypeError("Step function cannot have positional-only parameters.") + + if any(parameter.kind == parameter.VAR_KEYWORD for parameter in parameters): + # We allow arbitrary keyword arguments, so we don't need to check for missing + # named arguments. + function_arguments = None + else: + # If the function does not accept arbitrary keyword arguments, we need to + # ensure that there are named arguments for all fixed keyword arguments. + missing_named_arguments = kwargs.keys() - ( + parameter.name for parameter in parameters + ) + if missing_named_arguments: + raise TypeError( + f"Function {function.__name__} of step {step!r} is missing named " + "arguments for fixed keyword arguments: " + + ", ".join(missing_named_arguments) + ) + + function_arguments = { + parameter.name: ( + _ARG_REQUIRED if parameter.default == parameter.empty else _ARG_OPTIONAL + ) + for parameter in parameters + # We exclude *args from the free arguments … + if parameter.kind != parameter.VAR_POSITIONAL + # … as well as the names of fixed keyword arguments + and parameter.name not in kwargs + } + + # Get the return type of the step. This is either a Mapping, an iterator, or an + # async iterator. If the return type is not specified, we will still test the actual + # return type once the function is called. + return_annotation = signature.return_annotation + if return_annotation != signature.empty: + if isinstance(return_annotation, str): + # The return type is a forward reference, so we need to resolve it to a + # type object. + return_annotation = typing.get_type_hints(function).get("return", None) + if return_annotation is None: # pragma: no cover + # This should never happen + raise TypeError( + f"Return type of function {function.__name__} of step {step!r} " + f"is a forward reference that cannot be resolved: " + f"{return_annotation!r}" + ) + if not any( + issubclass_generic(return_annotation, tp) + for tp in ( + Mapping[str, Any], + Iterable[Mapping[str, Any]], + AsyncIterable[Mapping[str, Any]], + Awaitable[Mapping[str, Any]], + ) + ): + raise TypeError( + f"Return type of function {function.__name__} of step {step!r} must be " + "one of Mapping[str, Any], Iterable[Mapping[str, Any]], or " + f"AsyncIterable[Mapping[str, Any]], but got: {return_annotation}" + ) + + return function_arguments + + +async def _awaitable_to_async_iter( + x: Awaitable[Mapping[str, Any]] +) -> AsyncIterator[Mapping[str, Any]]: + """ + Convert an awaitable to an async iterator. 
+ + :param x: the awaitable to convert + :return: an async iterator that yields the result of the awaitable + """ + yield await x +
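Because attributes are always passed by keyword, functions with positional-only parameters are rejected when the step is constructed. A small sketch of both outcomes, assuming ``Step`` is importable as listed above (the public import path may differ):

.. code-block:: python

    from fluxus.functional.conduit import Step  # assumed import path

    def greet(name: str, punctuation: str = "!") -> dict[str, str]:
        return dict(greeting=f"Hello, {name}{punctuation}")

    # Accepted: keyword-capable parameters and a dict return annotation;
    # 'punctuation' becomes a fixed keyword argument of the step.
    ok = Step("greet", greet, punctuation="?")

    def shout(text: str, /) -> dict[str, str]:
        # positional-only parameter: cannot be matched by attribute name
        return dict(text=text.upper())

    try:
        Step("shout", shout)
    except TypeError as exc:
        print(exc)  # Step function cannot have positional-only parameters.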
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/functional/product/_product.html b/docs/_modules/fluxus/functional/product/_product.html new file mode 100644 index 0000000..a157b8e --- /dev/null +++ b/docs/_modules/fluxus/functional/product/_product.html @@ -0,0 +1,585 @@ + + + + + + + + + + fluxus.functional.product._product — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.functional.product._product

+"""
+Implementation of ``DictProduct``.
+"""
+
+from __future__ import annotations
+
+import logging
+from collections.abc import Iterator, Mapping
+from typing import Any
+
+from pytools.api import inheritdoc
+from pytools.repr import HasDictRepr
+
+from ...lineage import HasLineage
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "DictProduct",
+]
+
+
+
+[docs] +@inheritdoc(match="""[see superclass]""") +class DictProduct(HasLineage["DictProduct | None"], Mapping[str, Any], HasDictRepr): + """ + A flow product that consists of an attribute-value mapping. + + The product name is the name of the step that created this product. + + The attribute names must be valid Python identifiers. + """ + + #: The product attribute name for the start time of the step that created this + #: product. + KEY_START_TIME = "[start]" + + #: The product attribute name for the end time of the step that created this + #: product. + KEY_END_TIME = "[end]" + + #: The name of this product. + name: str + + #: The attributes that were changed or added by the step that created this product. + _product_attributes: Mapping[str, Any] + + #: The complete set of attributes of this product, including both input and output + #: attributes; output attributes take precedence over input attributes of the same + #: name. + attributes: Mapping[str, Any] + + #: The start CPU time of the step that created this product, in seconds since an + #: arbitrary starting point. + start_time: float + + #: The end CPU time of the step that created this product, in seconds since an + #: arbitrary starting point. + end_time: float + + #: The precursor of this product. + _precursor: DictProduct | None + + def __init__( + self, + name: str, + product_attributes: Mapping[str, Any], + *, + precursor: DictProduct | None = None, + start_time: float, + end_time: float, + ) -> None: + """ + :param name: the name of this product + :param product_attributes: the attributes that were changed or added by the + step that created this product, comprising both fixed attributes and + dynamically generated attributes + :param precursor: the precursor of this product (optional) + :param start_time: the start CPU time of the step that created this product, in + seconds since an arbitrary starting point + :param end_time: the end CPU time of the step that created this product, in + seconds since an arbitrary starting point + :raises TypeError: if the product attributes are not a mapping + :raises ValueError: if the attribute names are not valid Python identifiers + """ + # Validate that the attributes are a mapping with valid identifiers as keys + _validate_product_attributes(product_attributes, step_name=name) + + self.name = name + self.start_time = start_time + self.end_time = end_time + self._product_attributes = product_attributes + self._precursor = precursor + + # We calculate the complete set of attributes by combining the input and output, + # with product attributes (the output) taking precedence over input attributes + # of the same name. + self.attributes = ( + {**precursor.attributes, **product_attributes} + if precursor + else product_attributes + ) + + @property + def product_name(self) -> str: + return self.name + + @property + def product_attributes(self) -> Mapping[str, Any]: + """[see superclass]""" + return self._product_attributes + + @property + def precursor(self) -> DictProduct | None: + """[see superclass]""" + return self._precursor + + def __getitem__(self, __key: str) -> Any: + return self._product_attributes[__key] + + def __len__(self) -> int: + return len(self._product_attributes) + + def __iter__(self) -> Iterator[str]: + return iter(self._product_attributes)
+ + + +def _validate_product_attributes( + attributes: Mapping[str, Any], *, step_name: str +) -> None: + """ + Validate that the output of a step is a :class:`Mapping`, and that its attribute + names are valid Python identifiers. + + :param attributes: the step output to validate + :param step_name: the name of the step producing the mapping + :raises TypeError: if the step output is not a mapping + :raises ValueError: if the attribute names are invalid + """ + + if not isinstance(attributes, Mapping): + raise TypeError( + f"Expected step {step_name!r} to produce mappings, but got: {attributes!r}" + ) + + invalid_names = [ + name + for name in attributes + if not (isinstance(name, str) and name.isidentifier()) + ] + if invalid_names: + raise ValueError( + f"Attribute names in output of step {step_name!r} must be valid Python " + f"identifiers, but included invalid names: " + + ", ".join(map(repr, invalid_names)) + ) +
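The ``attributes`` property presents the merged view of a product's lineage: the precursor's attributes are carried over, and attributes written by the current step take precedence. A small sketch, assuming ``DictProduct`` is importable as listed above:

.. code-block:: python

    from fluxus.functional.product import DictProduct  # assumed import path

    source = DictProduct(
        "input",
        dict(greeting="Hello", language="en"),
        start_time=0.0,
        end_time=0.0,
    )
    derived = DictProduct(
        "upper",
        dict(greeting="HELLO", case="upper"),
        precursor=source,
        start_time=0.0,
        end_time=0.1,
    )

    print(dict(derived))       # only what the "upper" step added
    print(derived.attributes)  # {'greeting': 'HELLO', 'language': 'en', 'case': 'upper'}
    print(derived.precursor is source)  # True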
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/lineage/_label.html b/docs/_modules/fluxus/lineage/_label.html new file mode 100644 index 0000000..b3ecaed --- /dev/null +++ b/docs/_modules/fluxus/lineage/_label.html @@ -0,0 +1,787 @@ + + + + + + + + + + fluxus.lineage._label — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.lineage._label

+"""
+Implementation of ``LabelingProducer`` and ``LabelingTransformer``.
+
+The ``LabelingProducer`` and ``LabelingTransformer`` classes are designed to add
+labels to the products produced by a producer or transformer. These classes are
+subclasses of the ``Producer`` and ``Transformer`` classes, respectively. Their
+``label`` method returns a wrapper that delegates attribute access to the wrapped
+conduit unless the wrapper itself defines the attribute.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import AsyncIterator, Iterator, Mapping
+from typing import Any, Generic, TypeVar, cast
+
+from pytools.api import inheritdoc, subsdoc
+
+from .._producer import Producer
+from .._transformer import Transformer
+from ..core import SerialSource
+from ._lineage import HasLineage
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "LabelingProducer",
+    "LabelingTransformer",
+]
+
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+
+T_Labeler_ret = TypeVar("T_Labeler_ret", bound="_Labeler[Any]", covariant=True)
+T_LabelingProducer_ret = TypeVar(
+    "T_LabelingProducer_ret", bound="LabelingProducer[Any]"
+)
+T_LabelingTransformer_ret = TypeVar(
+    "T_LabelingTransformer_ret", bound="LabelingTransformer[Any, Any]"
+)
+T_Product_ret = TypeVar("T_Product_ret", bound=HasLineage[Any], covariant=True)
+T_Source = TypeVar("T_Source", bound=SerialSource[Any])
+T_SourceProduct_arg = TypeVar("T_SourceProduct_arg", contravariant=True)
+T_TransformedProduct_ret = TypeVar(
+    "T_TransformedProduct_ret", bound=HasLineage[Any], covariant=True
+)
+
+#
+# Constants
+#
+
+# Sentinel value for attribute not found
+_NOT_FOUND = object()
+
+
+#
+# Classes
+#
+
+
+@inheritdoc(match="""[see superclass]""")
+class _Labeler(SerialSource[T_Product_ret], Generic[T_Product_ret], metaclass=ABCMeta):
+    """
+    A mixin class, adding the ``label`` method to a producer or transformer to allow
+    adding labels to the products.
+    """
+
+    @abstractmethod
+    def label(self: T_Labeler_ret, **labels: Any) -> T_Labeler_ret:
+        """
+        Label the product of this source.
+
+        :param labels: the labels to add
+        :return: a new source instance that will add the given labels to all products
+        """
+
+    @subsdoc(
+        pattern=r"(\n\n\s*:return:)",
+        replacement=(
+            r"\n\nIncludes labels added to this conduit, prefixing their names "
+            r"with a '#' character.\1"
+        ),
+        using=SerialSource.get_repr_attributes,
+    )
+    def get_repr_attributes(self) -> Mapping[str, Any]:
+        """[see superclass]"""
+        return super().get_repr_attributes()
+
+
+
+[docs] +class LabelingProducer( + Producer[T_Product_ret], + _Labeler[T_Product_ret], + Generic[T_Product_ret], + metaclass=ABCMeta, +): + """ + A producer that can label their products. + + Implements the :meth:`.label` method, which defines labels that the producer will + add to all products. + + Products must implement the :class:`.HasLineage` interface. + + As an example, consider a producer of numbers: + + .. code-block:: python + + class Number(HasLineage): + # Implementation goes here + + class NumberProducer(Producer[Number], Labeler[A]): + \"""A producer\""" + + def iter(self) -> Iterator[int]: + for i in range(3): + yield Number(i) + + The producer can be labeled with an arbitrary number of labels: + + .. code-block:: python + + producer = NumberProducer().label(a="A", b="B") + + for product in producer.iter(): + print(product.product_labels) + + The output confirms that the labels have been added to all products: + + .. code-block:: python + + {'a': 'A', 'b': 'B'} + {'a': 'A', 'b': 'B'} + {'a': 'A', 'b': 'B'} + + """ + +
+[docs] + def label(self: T_LabelingProducer_ret, **labels: Any) -> T_LabelingProducer_ret: + """ + Label the product of this producer. + + :param labels: the labels to add + :return: a new version of this producer that will add the given labels to all + products + """ + + return cast( + # We cast the producer wrapper to the type of the delegate + T_LabelingProducer_ret, + _LabeledProducer(self, labels), + )
+
+ + + +
+[docs] +class LabelingTransformer( + Transformer[T_SourceProduct_arg, T_TransformedProduct_ret], + _Labeler[T_TransformedProduct_ret], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], + metaclass=ABCMeta, +): + """ + A transformer that can label its products. + + Implements the :meth:`.label` method, which defines labels that the transformer will + add to all products. + + Products must implement the :class:`.HasLineage` interface. + + As an example, consider a transformer that doubles numbers: + + .. code-block:: python + + class Number(HasLineage): + # Implementation goes here + + class NumberDoubler(Transformer[Number, Number], Labeler[A]): + \"""A transformer that doubles numbers\""" + + def transform(self, source_product: Number) -> Iterator[Number]: + yield Number(source_product.value * 2) + + The transformer can be labeled with an arbitrary number of labels: + + .. code-block:: python + + transformer = NumberDoubler().label(a="A", b="B") + + for product in transformer.run([Number(1), Number(2)]): + print(product.product_labels) + + The output confirms that the labels have been added to all products: + + .. code-block:: python + + {'a': 'A', 'b': 'B'} + {'a': 'A', 'b': 'B'} + """ + +
+[docs] + def label( + self: T_LabelingTransformer_ret, **labels: Any + ) -> T_LabelingTransformer_ret: + """ + Label the product of this transformer. + + :param labels: the labels to add + :return: a new transformer that will add the given labels to all products + """ + return cast( + # We cast the transformer wrapper to the type of the delegate + T_LabelingTransformer_ret, + _LabeledTransformer(self, labels), + )
+
+ + + +@inheritdoc(match="""[see superclass]""") +class _LabeledSource(_Labeler[Any], Generic[T_Source], metaclass=ABCMeta): + """ + A producer or transformer with added labels. Delegates attribute access to a wrapped + ``LabelingProducer`` or ``LabelingTransformer``, except for a small number of + methods that this wrapper class overrides to add labels to the products. + """ + + _delegate: T_Source + _labels: dict[str, Any] + + def __init__(self, delegate: T_Source, labels: dict[str, Any]) -> None: + """ + :param delegate: the producer to delegate to + :param labels: the labels to add to the products + """ + self._delegate = delegate + self._labels = labels + + def __getattribute__(self, name: str) -> Any: + # We won't delegate access to protected attributes + __getattribute__original = object.__getattribute__ + if name.startswith("_"): + return __getattribute__original(self, name) + # First, check if the attribute is in the local dictionary (excluding + # subclasses) + local_attributes = __getattribute__original(self, "__dict__") + value = local_attributes.get(name, _NOT_FOUND) + if value is _NOT_FOUND: + # If the attribute is not in the local dictionary, check if it is in the + # class dictionary + mro = __getattribute__original(self, "__class__").mro() + for cls in mro: + if name in cls.__dict__: + # If it is, use the default behavior to get the attribute + return __getattribute__original(self, name) + if cls is _LabeledSource: + # If we reach the _LabeledSource class, we stop searching + break + # If the attribute is not in the relevant class dictionaries, delegate to + # the wrapped object + return __getattribute__original(local_attributes.get("_delegate"), name) + else: + # Otherwise, return the value + return value + + def get_repr_attributes(self) -> Mapping[str, Any]: + """[see superclass]""" + return dict(self._delegate.get_repr_attributes()) | { + f"#{name}": value for name, value in self._labels.items() + } + + def label(self: T_Labeler_ret, **labels: Any) -> T_Labeler_ret: + """ + Label the product of this source. + + :param labels: the labels to add + :return: a new source instance that will add the given labels to all products + """ + # Combine the existing and new labels + cast(_LabeledSource[T_Source], self).__labels.update(labels) + # We return self to allow method chaining + return self + + +@inheritdoc(match="[see superclass]") +class _LabeledProducer( + LabelingProducer[T_Product_ret], + _LabeledSource[Producer[T_Product_ret]], + Generic[T_Product_ret], +): + """ + A producer that labels its products. + """ + + @property + def product_type(self) -> type[T_Product_ret]: + """[see superclass]""" + # Delegate to the wrapped object by invoking the property on the + # _Delegator class + return self._delegate.product_type + + def iter(self) -> Iterator[T_Product_ret]: + """[see superclass]""" + for product in self._delegate.iter(): + yield product.label(**self._labels) + + async def aiter(self) -> AsyncIterator[T_Product_ret]: + """[see superclass]""" + async for product in self._delegate.aiter(): + yield product.label(**self._labels) + + +@inheritdoc(match="[see superclass]") +class _LabeledTransformer( + LabelingTransformer[T_SourceProduct_arg, T_TransformedProduct_ret], + _LabeledSource[LabelingTransformer[T_SourceProduct_arg, T_TransformedProduct_ret]], + Generic[T_SourceProduct_arg, T_TransformedProduct_ret], +): + """ + A transformer that labels its products. 
+ """ + + @property + def input_type(self) -> type[T_SourceProduct_arg]: + """[see superclass]""" + # Delegate to the wrapped object by invoking the property on the + # _Delegator class + return self._delegate.input_type + + @property + def product_type(self) -> type[T_TransformedProduct_ret]: + """[see superclass]""" + # Delegate to the wrapped object by invoking the property on the + # _Delegator class + return self._delegate.product_type + + def transform( + self, source_product: T_SourceProduct_arg + ) -> Iterator[T_TransformedProduct_ret]: + """[see superclass]""" + for product in self._delegate.transform(source_product): + yield product.label(**self._labels) + + async def atransform( + self, source_product: T_SourceProduct_arg + ) -> AsyncIterator[T_TransformedProduct_ret]: + """[see superclass]""" + async for product in super().atransform(source_product): + yield product.label(**self._labels) +
+ +
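Once labelled, a conduit reports its labels alongside its regular repr attributes, with label names prefixed by ``#``. A rough sketch of the intended behaviour, reusing the hypothetical ``NumberProducer`` from the class docstring above:

.. code-block:: python

    producer = NumberProducer().label(experiment="baseline", run=1)

    # Regular repr attributes of the wrapped producer, followed by the labels
    # prefixed with '#'; the exact base attributes depend on the conduit.
    print(producer.get_repr_attributes())
    # e.g. {'#experiment': 'baseline', '#run': 1}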
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/lineage/_lineage.html b/docs/_modules/fluxus/lineage/_lineage.html new file mode 100644 index 0000000..188e2b7 --- /dev/null +++ b/docs/_modules/fluxus/lineage/_lineage.html @@ -0,0 +1,798 @@ + + + + + + + + + + fluxus.lineage._lineage — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.lineage._lineage

+"""
+Implementation of the product lineage interface.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import Iterator, Mapping
+from typing import Any, Generic, Self, TypeVar, final
+
+from pytools.api import get_init_params
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "HasLineage",
+    "LineageOrigin",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+
+T_Precursor_ret = TypeVar(
+    "T_Precursor_ret", bound="HasLineage[Any] | None", covariant=True
+)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +class HasLineage(Generic[T_Precursor_ret], metaclass=ABCMeta): + # noinspection GrazieInspection + """ + Mixin-class for a product of a flow :class:`.Conduit` that is derived from a + preceding product. + + To implement this in a user-defined class, inherit from this class and + implement the :attr:`.precursor` and :attr:`.product_attributes` properties: + + - :attr:`.precursor` + Returns the preceding product, or ``None`` if there is no preceding product. + The preceding product must also implement the :class:`HasLineage` interface. + - :attr:`.product_attributes` + Returns a dictionary of attributes of this product, excluding the precursor. + + Users can also label products with additional values using the :meth:`.label` + method. + + As an example, consider a class ``Route`` that is derived from a preceding class + ``Destination``. ``Destination`` has no precursor, so its :attr:`.precursor` + property returns ``None``. + + .. code-block:: python + + class Destination(LineageOrigin): + \"""A destination in a route.\""" + + def __init__(self, name: str) -> None: + self.name = name + + def product_attributes(self) -> dict[str, Any]: + \""" + Attributes of this product. + \""" + return dict(name=self.name) + + + class Route(HasLineage): + \"""A route to a given destination."" + + def __init__( + self, + start: str, + destination: Destination, + ) -> None: + self.start = start + self.destination = destination + # ... + + @property + def precursor(self) -> Destination: + \""" + The preceding product. + \""" + return self.destination + + @property + def product_attributes(self) -> dict[str, Any]: + \""" + Attributes of this product. + \""" + return dict(start=self.start) + + Now we can create a ``Route`` object representing a route with any number of stops + to a destination, adding custom labels: + + .. code-block:: python + + par = Destination("Paris") + lon_par = Route("London", paris).label(distance_km=340, mode="train") + ber_lon = Route("Berlin", london_paris).label(distance_km=940, mode="plane") + + The lineage of the ``ber_lon`` object is ``[par, lon_par, ber_lon]``. + + Calling ``ber_lon.get_lineage_attributes()`` will return the following dictionary: + + .. code-block:: python + + { + ("Destination", "name"): "Paris", + ("Route", "start"): "London", + ("Route", "distance_km"): 340, + ("Route", "mode"): "train", + ("Route#1", "start"): "Berlin", + ("Route#1", "distance_km"): 940, + ("Route#1", "mode"): "plane", + } + """ + + __labels: dict[str, Any] | None = None + + @property + @abstractmethod + def precursor(self) -> T_Precursor_ret: + """ + The preceding product. + """ + + @property + @abstractmethod + def product_name(self) -> str: + """ + A name representing this product; used for grouping attributes in + :meth:`.get_lineage_attributes`. + """ + + @property + def product_attributes(self) -> Mapping[str, Any]: + """ + Attributes of this product, excluding the precursor. + """ + # The precursor should not be included in the attributes + precursor = self.precursor + + # We get the init parameters, including the default values + params = get_init_params(self, ignore_default=False) + + if precursor is None: + # There is no precursor, so we return all init parameters + return params + + return { + name: value + for name, value in params.items() + # We skip the init parameter if it is the precursor + if value is not precursor + } + + @property + @final + def product_labels(self) -> Mapping[str, Any]: + """ + Labels of this product that were set using the :meth:`.label` method. 
+ """ + return self.__labels or {} + +
+[docs] + def label(self, **labels: Any) -> Self: + """ + Label this product with the given labels and return ``self``. + + This means that labelling can be easily inserted into the code anywhere + the product is used, e.g., after object creation, replacing + + .. code-block:: python + + product = Product() + + with + + .. code-block:: python + + product = Product().label(label1=value1).label(label2=value2) + + or, even better + + .. code-block:: python + + product = Product().label(label1=value1, label2=value2) + + If the labels are already stored in a dictionary, they can be passed as + keyword arguments using the ``**`` operator: + + .. code-block:: python + + labels = {"label1": value1, "label2": value2} + product = Product().label(**labels) + + + :param labels: the labels to add + :return: ``self`` + """ + + # Raise a ValueError if any label would override an existing attribute. + invalid_labels = labels.keys() & self.product_attributes.keys() & labels.keys() + if invalid_labels: + raise ValueError( + f"Label{'s' if len(invalid_labels) > 1 else ''} would override " + "existing attributes: " + ", ".join(invalid_labels) + ) + + existing_labels = self.__labels + if existing_labels is None: + # We do not have a labels dictionary yet, so we create one. + self.__labels = labels + else: + # We already have a labels dictionary, so we update it. + existing_labels.update(labels) + + # We return self to allow method chaining. + return self
+ + +
+[docs] + @final + def get_lineage(self) -> list[HasLineage[Any]]: + """ + Get the lineage of this product as a list, starting with originating product and + ending with this product. + + :return: the lineage of this product + """ + + def _iter_lineage(product: HasLineage[Any]) -> Iterator[HasLineage[Any]]: + precursor = product.precursor + if precursor is not None: + yield from _iter_lineage(precursor) + yield product + + return list(_iter_lineage(self))
+ + +
+[docs] + @final + def get_lineage_attributes(self) -> dict[str, dict[str, Any]]: + """ + Get the attributes and labels of this product's lineage. + + Creates a nested dictionary where the first level keys are product names and the + second level keys are attribute/label names. The second level values are + attribute/label values. + + Product names are unique names for each product in the lineage. This is the + class name of the product, followed by an ascending integer if the same class + appears multiple times in the lineage. For example, if the lineage is ``[A(…), + B(…), A(…), A(…), C(…), B(…)]``, the product names will be ``["A", "B", "A#1", + "A#2", "C", "B#1"]``. + + If a label has the same name as an attribute, the label's value will override + the attribute's value in the dictionary. For example, if a product ``A(…)`` has + an attribute ``"name"`` and a label ``"name"``, the label's value will be used + for the key ``{"A": {"name": …}}``. + + :return: the attributes and labels of the lineage as a nested dictionary + """ + + # The lineage for which we want to get the attributes. + lineage: list[HasLineage[Any]] = self.get_lineage() + + # Create a list of unique names for each product in the lineage. + # This is the class name of the product, followed by an ascending integer + # if the same class appears multiple times in the lineage. + product_names: list[str] = _product_names(lineage) + + # Create a nested dictionary with two levels: the product names and the + # attribute names. Exclude products with no attributes. + return { + product_name: attributes + for product_name, attributes in ( + ( + product_name, + { + # Override attributes with labels + **product.product_attributes, + **product.product_labels, + }, + ) + for product_name, product in zip(product_names, lineage) + ) + if attributes + }
+
+ + + +
+[docs] +class LineageOrigin(HasLineage[None], metaclass=ABCMeta): + """ + A product that has no precursor. + + This is a convenience class for products that are not derived from any other + product. + + To implement this in a user-defined class, inherit from this class and implement + the :attr:`.product_attributes` property, as described in :class:`HasLineage`. + """ + + @property + @final + def precursor(self) -> None: + """ + The preceding product. + """ + return None
+ + + +# +# Auxiliary functions +# + + +def _product_names(lineage: list[HasLineage[Any]]) -> list[str]: + """ + Create a list of unique names for each product in the lineage. + + This is the class name of the product, followed by an ascending integer if the same + class appears multiple times in the lineage. + + For example, if the lineage is ``[A, B, A, A, C, B]``, the names will be + ``["A", "B", "A#1", "A#2", "C", "B#1"]``. + + :param lineage: the lineage of products for which to create names + :return: the list of names + """ + + def _generate_name(item: HasLineage[Any]) -> str: + name = item.product_name + count = counts.get(name, 0) + counts[name] = count + 1 + if count: + return f"{name}#{count}" + else: + return name + + counts: dict[str, int] = {} + + return list(map(_generate_name, lineage)) + + +def _as_dict(mapping: Mapping[str, Any]) -> dict[str, Any]: + """ + Convert a mapping to a dictionary, unless it is already a dictionary. + + :param mapping: the mapping to convert + :return: the dictionary + """ + return mapping if isinstance(mapping, dict) else dict(mapping) +
+ +
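A compact way to see the lineage machinery end to end is a tiny product class that implements the two abstract properties. The ``Node`` class below is purely illustrative and not part of fluxus; the import path for :class:`HasLineage` is assumed:

.. code-block:: python

    from collections.abc import Mapping
    from typing import Any

    from fluxus.lineage import HasLineage  # assumed import path

    class Node(HasLineage["Node | None"]):
        """A minimal lineage-aware product, for illustration only."""

        def __init__(self, value: int, precursor: "Node | None" = None) -> None:
            self.value = value
            self._precursor = precursor

        @property
        def precursor(self) -> "Node | None":
            return self._precursor

        @property
        def product_name(self) -> str:
            return "Node"

        @property
        def product_attributes(self) -> Mapping[str, Any]:
            return dict(value=self.value)

    a = Node(1)
    b = Node(2, precursor=a).label(stage="second")

    print([n.value for n in b.get_lineage()])  # [1, 2], origin first
    print(b.get_lineage_attributes())
    # {'Node': {'value': 1}, 'Node#1': {'value': 2, 'stage': 'second'}}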
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/simple/_simple.html b/docs/_modules/fluxus/simple/_simple.html new file mode 100644 index 0000000..504eb70 --- /dev/null +++ b/docs/_modules/fluxus/simple/_simple.html @@ -0,0 +1,539 @@ + + + + + + + + + + fluxus.simple._simple — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.simple._simple

+"""
+Implementation of ``SimpleProducer`` and ``SimpleAsyncProducer``.
+"""
+
+from __future__ import annotations
+
+import logging
+from collections.abc import AsyncIterable, AsyncIterator, Iterable, Iterator
+from typing import Generic, TypeVar
+
+from pytools.api import inheritdoc, to_tuple
+from pytools.typing import isinstance_generic
+
+from .._passthrough import Passthrough
+from .._producer import AsyncProducer, Producer
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "SimpleAsyncProducer",
+    "SimpleProducer",
+]
+
+#
+# Type variables
+#
+# Naming convention used here:
+# _ret for covariant type variables used in return positions
+# _arg for contravariant type variables used in argument positions
+
+T_Product_ret = TypeVar("T_Product_ret", covariant=True)
+
+
+#
+# Constants
+#
+
+# The passthrough singleton instance.
+_PASSTHROUGH = Passthrough()
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class SimpleProducer(Producer[T_Product_ret], Generic[T_Product_ret]): + """ + A simple producer that iterates over a given list of products. + """ + + #: The products of this producer. + products: Iterable[T_Product_ret] + + def __init__(self, products: Iterable[T_Product_ret]) -> None: + """ + :param products: the products to iterate over + """ + if not isinstance(products, Iterable): + raise TypeError( + f"Products must be an iterable, not {type(products).__name__}" + ) + self.products = products = to_tuple(products) + + product_type = self.product_type + mismatched_products = [ + product + for product in products + if not isinstance_generic(product, product_type) + ] + if mismatched_products: + raise TypeError( + f"Arg products contains products that are not of expected type" + f"{product_type}: " + ", ".join(map(repr, mismatched_products)) + ) + +
+[docs] + def iter(self) -> Iterator[T_Product_ret]: + """[see superclass]""" + return iter(self.products)
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class SimpleAsyncProducer(AsyncProducer[T_Product_ret], Generic[T_Product_ret]): + """ + A simple asynchronous producer that iterates over a given list of products. + """ + + #: The products of this producer. + products: AsyncIterable[T_Product_ret] + + def __init__(self, products: AsyncIterable[T_Product_ret]) -> None: + """ + :param products: the products to iterate over; must be an async iterable + and will not be materialized by this producer + """ + self.products = products + +
+[docs] + def aiter(self) -> AsyncIterator[T_Product_ret]: + """[see superclass]""" + return aiter(self.products)
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/util/_repr.html b/docs/_modules/fluxus/util/_repr.html new file mode 100644 index 0000000..4b6b01a --- /dev/null +++ b/docs/_modules/fluxus/util/_repr.html @@ -0,0 +1,510 @@ + + + + + + + + + + fluxus.util._repr — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.util._repr

+"""
+Utility functions for simplifying attribute values in object representations.
+"""
+
+from __future__ import annotations
+
+import logging
+from collections.abc import Mapping, Sized
+from typing import Any
+
+from pytools.expression.atomic import Id
+from pytools.expression.base import BracketPair, Invocation
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "simplify_repr_attributes",
+]
+
+
+#
+# Classes
+#
+
+
+
+[docs] +def simplify_repr_attributes(attributes: Mapping[str, Any]) -> dict[str, Any]: + """ + Simplify the values of an attribute-value dictionary for representation. + + Returns a new dictionary with all attribute-value pairs where the value is + + - a number + - a boolean + - a string or ``bytes`` object, shortening it to a maximum length of 15 + characters + - an object implementing :func:`len`, returning its type and length as an + :class:`.Expression` object in the format ``"ClassName<n>"`` + + :param attributes: the attribute-value dictionary to simplify + :return: the simplified attribute-value dictionary + """ + return { + name: value + for name, value in ( + (name, _simplify_repr_value(value)) for name, value in attributes.items() + ) + if value is not None + }
+ + + +# +# Auxiliary functions +# +def _simplify_repr_value(value: Any) -> Any: + """ + Simplify a value for representation. + + :param value: the value to simplify + :return: the simplified value + """ + + def _str_repr(s: str) -> str: + # remove leading and trailing whitespace, and escape special characters + return repr(s.strip())[1:-1] + + if isinstance(value, (int, float, complex, bool)): + return value + elif isinstance(value, (str, bytes)): + if len(value) <= 15: + return value + else: + value = str(value.strip()) + return f"{_str_repr(value[:6])}...{_str_repr(value[-6:])}" + elif isinstance(value, Sized): + return Invocation( + Id(type(value)), + brackets=BracketPair.ANGLED, + args=(len(value),), + ) + return None +
+ +
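The simplification rules in practice: numbers and booleans pass through, short strings are kept, long strings are abbreviated to a head-and-tail form, sized containers are rendered as a type-and-length expression, and values that fit none of these categories are dropped. A small sketch, assuming the import path ``fluxus.util``:

.. code-block:: python

    from fluxus.util import simplify_repr_attributes  # assumed import path

    attributes = {
        "n": 42,                                # kept as-is
        "note": "short",                        # 15 characters or fewer: kept as-is
        "text": "a rather long piece of text",  # abbreviated to "a rath...f text"
        "items": [1, 2, 3],                     # rendered as an expression like list<3>
        "callback": print,                      # not representable: dropped
    }

    print(simplify_repr_attributes(attributes))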
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/viz/_flow.html b/docs/_modules/fluxus/viz/_flow.html new file mode 100644 index 0000000..6ee83b2 --- /dev/null +++ b/docs/_modules/fluxus/viz/_flow.html @@ -0,0 +1,606 @@ + + + + + + + + + + fluxus.viz._flow — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.viz._flow

+"""
+Implementation of drawers for conduits and flows.
+"""
+
+from __future__ import annotations
+
+import logging
+from collections.abc import Iterable
+from io import BytesIO
+from typing import Any, Generic, TypeVar
+
+from pytools.api import inheritdoc
+from pytools.viz import ColoredStyle, Drawer, TextStyle
+from pytools.viz.color import ColorScheme
+
+from ..core import Conduit
+from ._graph import FlowGraph
+from .base import FlowStyle
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "FlowDrawer",
+    "FlowGraphStyle",
+    "FlowTextStyle",
+]
+
+#
+# Type variables
+#
+
+
+T_ColorScheme = TypeVar("T_ColorScheme", bound=ColorScheme)
+
+
+#
+# Classes
+#
+
+
+
+[docs] +@inheritdoc(match="[see superclass]") +class FlowTextStyle(FlowStyle, TextStyle): + """ + A style for rendering flows as text. + """ + +
+[docs] + @classmethod + def get_default_style_name(cls) -> str: + return "text"
+ + +
+[docs] + def render_flow(self, flow: Conduit[Any]) -> None: + """[see superclass]""" + print(flow.to_expression(compact=True), file=self.out)
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class FlowGraphStyle(FlowStyle, ColoredStyle[T_ColorScheme], Generic[T_ColorScheme]): + """ + A style for rendering flows as graphs. + + The graph is rendered using the ``graphviz`` package, which must be installed + for this style to work. + + If no filename is given, displays the graph using the ``IPython.display`` + package which, when used in a Jupyter notebook, will display the graph + inline. + + If a filename is given, writes the graph to a file. + """ + + #: A filename or file-like object to write the graph to (optional) + file: str | BytesIO | None + + #: The format of the graph file (optional, defaults to "png") + format: str | None + + def __init__( + self, + file: str | BytesIO | None = None, + *, + format: str | None = None, + colors: T_ColorScheme | None = None, + ) -> None: + """ + :param file: a filename or file-like object to write the graph to (optional) + :param format: the format of the graph file (optional, defaults to "png") + :param colors: the color scheme to use (optional) + """ + super().__init__(colors=colors) + self.file = file + self.format = format or "png" + +
+[docs] + @classmethod + def get_default_style_name(cls) -> str: + return "graph"
+ + +
+[docs] + def render_flow(self, flow: Conduit[Any]) -> None: # pragma: no cover + """[see superclass]""" + import graphviz + + # get the color scheme + color_scheme = self.colors + + graph: graphviz.Source = graphviz.Source( + FlowGraph.from_conduit(flow).to_dot( + font="Monaco, Consolas, monospace", + fontcolor=color_scheme.contrast_color(color_scheme.accent_1), + fontsize=10, + foreground=color_scheme.foreground, + background=color_scheme.background, + fill=color_scheme.accent_1, + stroke=color_scheme.accent_2, + ) + ) + # set the foreground color of the graph + + if self.file is None: + from IPython.display import display + + display(graph) + elif isinstance(self.file, str): + graph.render(self.file, format=self.format, cleanup=True) + else: + self.file.write(graph.pipe(format=self.format))
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class FlowDrawer(Drawer[Conduit[Any], FlowStyle]): + """ + A drawer for flows. + + Available styles: + + - :class:`FlowTextStyle` or "text" + - :class:`FlowGraphStyle` or one of "graph", "graph_dark" + """ + +
+[docs] + @classmethod + def get_style_classes(cls) -> Iterable[type[FlowStyle]]: + """[see superclass]""" + return [FlowTextStyle, FlowGraphStyle]
+ + +
+[docs] + @classmethod + def get_default_style(cls) -> FlowStyle: + """[see superclass]""" + return FlowGraphStyle()
+ + +
+[docs] + def _draw(self, data: Conduit[Any]) -> None: + self.style.render_flow(data)
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/viz/_graph.html b/docs/_modules/fluxus/viz/_graph.html new file mode 100644 index 0000000..2fef409 --- /dev/null +++ b/docs/_modules/fluxus/viz/_graph.html @@ -0,0 +1,819 @@ + + + + + + + + + + fluxus.viz._graph — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Source code for fluxus.viz._graph

+"""
+Implementation of flow graphs.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import Iterable, Iterator
+from typing import Any, Literal, Never
+
+from pytools.api import inheritdoc
+from pytools.viz.color import RgbColor, RgbaColor
+
+from .. import Consumer, Producer
+from ..core import Conduit, SerialConduit
+from ..core.producer import BaseProducer
+from ..core.transformer import BaseTransformer
+from ..util import simplify_repr_attributes
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "FlowGraph",
+]
+
+
+#
+# Classes
+#
+
+
+
+[docs] +class FlowGraph: + """ + A graph representation of a :class:`.Flow`, or a partial flow represented + by any type of :class:`.Conduit`. + """ + + #: The set of all single, unconnected conduits in the graph. + single_conduits: set[SerialConduit[Any]] + + #: The set of all connections between conduits in the graph. + connections: set[tuple[SerialConduit[Any], SerialConduit[Any]]] + + def __init__( + self, + *, + single_conduits: set[SerialConduit[Any]] | None = None, + connections: set[tuple[SerialConduit[Any], SerialConduit[Any]]] | None = None, + ) -> None: + """ + :param single_conduits: the set of all single, unconnected conduits in the graph + (optional) + :param connections: the set of all connections between conduits in the graph + (optional) + """ + self.single_conduits = single_conduits or set() + self.connections = connections or set() + +
+[docs] + @classmethod + def from_conduit(cls, conduit: Conduit[Any]) -> FlowGraph: + """ + Create a flow graph from a conduit. + + :param conduit: the conduit to create the flow graph from + :return: the flow graph + """ + # complete the flow if it is not complete + + # we have no leading producer, so we add one + if isinstance(conduit, (BaseTransformer, Consumer)): + # noinspection PyTypeChecker + conduit = _StartNode() >> conduit + + # we have no trailing consumer, so we add one + if isinstance(conduit, BaseProducer): + conduit = conduit >> _EndNode() + + connections: set[tuple[SerialConduit[Any], SerialConduit[Any]]] = set( + conduit.get_connections(ingoing=[]) + ) + + single_conduits: set[SerialConduit[Any]] = { + conduit + for conduit in conduit.iter_concurrent_conduits() + if conduit.is_atomic + } + + return FlowGraph(connections=connections, single_conduits=single_conduits)
+ + +
+[docs] + def to_dot( + self, + *, + width: float | None = 7.5, + font: str | None = None, + fontsize: float | None = None, + fontcolor: RgbColor | RgbaColor | None = None, + background: RgbColor | RgbaColor | None = None, + foreground: RgbColor | RgbaColor | None = None, + fill: RgbColor | RgbaColor | None = None, + stroke: RgbColor | RgbaColor | None = None, + ) -> str: + """ + Convert this flow graph to a DOT graph representation. + + :param width: the width of the graph in inches, or ``None`` for unconstrained + width (defaults to 7.5 inches, fitting letter or A4 pages in portrait mode) + :param font: the font of the graph (optional) + :param fontsize: the font size (optional) + :param fontcolor: the text color of the graph (defaults to the foreground color) + :param background: the background color of the graph (optional) + :param foreground: the foreground color of the graph (optional) + :param fill: the fill color of the nodes (optional) + :param stroke: the stroke color of the nodes and edges (optional) + :return: the DOT graph representation + """ + + # get all connections + connections = self.connections + # get the sets of source nodes and processor nodes + nodes_source = {producer for producer, _ in connections} + nodes_processor = {processor for _, processor in connections} + # all nodes are the union of the two sets, plus the single conduits + nodes = nodes_source | nodes_processor | self.single_conduits + + def _node_id(_node: Conduit[Any]) -> str: + return f"{type(_node).__name__}_{id(_node)}" + + globals = dict( + rankdir="LR", + labeljust="l", + ) + if width is not None: + globals["width"] = f"{width:g}" + node_defaults = dict( + shape="box", + style="rounded", + ) + edge_defaults = dict() + + if background: + globals["bgcolor"] = background.hex + if foreground: + node_defaults["color"] = node_defaults["fontcolor"] = edge_defaults[ + "color" + ] = foreground.hex + if fontcolor: + node_defaults["fontcolor"] = fontcolor.hex + if fontsize: + node_defaults["fontsize"] = f"{fontsize:g}" + if fill: + node_defaults["style"] += ",filled" + node_defaults["fillcolor"] = fill.hex + if stroke: + node_defaults["color"] = edge_defaults["color"] = stroke.hex + if font: + node_defaults["fontname"] = font + + digraph = _DotGraph( + "Flow", + globals=globals, + node_defaults=node_defaults, + edge_defaults=edge_defaults, + ) + + for node in nodes: + node_attrs = {} + node_name = node.name + if isinstance(node, _SpecialNode): + # Special nodes at the start and end of the flow + node_attrs["label"] = f"{node.name}\\n" + node_attrs["shape"] = node.shape + node_attrs["style"] = "dashed" + if stroke: + # same color as the lines + node_attrs["fontcolor"] = node_defaults["color"] + else: + conduit_attributes = simplify_repr_attributes( + {k: v for k, v in node.get_repr_attributes().items() if k != "name"} + ) + if conduit_attributes: + node_attrs["shape"] = "record" + attributes = r"\l".join( + f"{k}={v}" for k, v in conduit_attributes.items() + ) + node_attrs["label"] = f"{node_name}|{attributes}\\l" + else: + node_attrs["label"] = node_name + + if isinstance(node, (BaseProducer, Consumer)): + # Producers and consumers don't get filled, just rounded + node_attrs["style"] = "rounded" + + digraph.add_node(_node_id(node), **node_attrs) + + for source, processor in connections: + if isinstance(source, _SpecialNode) or isinstance(processor, _SpecialNode): + edge_attrs = {"style": "dashed"} + else: + edge_attrs = {} + digraph.add_edge(_node_id(source), _node_id(processor), **edge_attrs) + + return 
str(digraph)
+
+ + + +# +# Auxiliary classes and functions +# + + +class _DotGraph: + """ + A DOT graph representation. + """ + + #: The name of the graph. + name: str + + #: The type of the graph, either "graph" or "digraph". + graph_type: Literal["graph", "digraph"] + + #: The global attributes of the graph. + globals: dict[str, str] + + #: The default attributes for nodes. + node_defaults: dict[str, str] + + #: The default attributes for edges. + edge_defaults: dict[str, str] + + #: The nodes of the graph, each with a dictionary of attributes. + nodes: dict[str, dict[str, str]] + + #: The edges of the graph, each with a dictionary of attributes. + edges: dict[tuple[str, str], dict[str, str]] + + def __init__( + self, + name: str, + *, + graph_type: Literal["graph", "digraph"] = "digraph", + globals: dict[str, str] | None = None, + node_defaults: dict[str, str] | None = None, + edge_defaults: dict[str, str] | None = None, + ) -> None: + """ + :param name: the name of the graph + :param graph_type: the type of the graph, either "graph" or "digraph" + (default: "digraph") + :param globals: global attributes of the graph (optional) + :param node_defaults: default attributes for nodes (optional) + :param edge_defaults: default attributes for edges (optional) + """ + self.name = name + self.graph_type = graph_type + self.globals = globals or {} + self.node_defaults = node_defaults or {} + self.edge_defaults = edge_defaults or {} + + self.nodes = {} + self.edges = {} + + def add_node(self, name: str, **attrs: str) -> None: + """ + Add a node to the graph. + + :param name: the name of the node + :param attrs: the attributes of the node, as additional keyword arguments + (optional) + """ + self.nodes[name] = attrs + + def add_edge(self, source: str, target: str, **attrs: str) -> None: + """ + Add an edge to the graph. + + :param source: the source node of the edge + :param target: the target node of the edge + :param attrs: the attributes of the edge, as additional keyword arguments + (optional) + """ + self.edges[(source, target)] = attrs + + def __str__(self) -> str: + + def _escape_string(s: str) -> str: + if s and all(c.isalnum() or c in "_-" for c in s): + return s + else: + s.replace('"', '\\"') + return f'"{s}"' + + def _attr_string(_attrs: dict[str, str]) -> str: + return ",".join( + f"{_attr}={_escape_string(_value)}" for _attr, _value in _attrs.items() + ) + + dot: str = f'{self.graph_type} "{self.name}" {{' + for attr, value in self.globals.items(): + dot += f"\n {attr}={_escape_string(value)};" # noqa: E702 + if self.node_defaults: + dot += f"\n node [{_attr_string(self.node_defaults)}];" # noqa: E702 + if self.edge_defaults: + dot += f"\n edge [{_attr_string(self.edge_defaults)}];" # noqa: E702 + for node, attrs in self.nodes.items(): + dot += f"\n {_escape_string(node)}" + if attrs: + dot += f" [{_attr_string(attrs)}];" # noqa: E702 + for (source, target), attrs in self.edges.items(): + dot += f"\n {_escape_string(source)} -> {_escape_string(target)}" + if attrs: + dot += f"[{_attr_string(attrs)}];" # noqa: E702 + + dot += "\n}" + return dot + + +class _SpecialNode(Conduit[Never], metaclass=ABCMeta): + """ + A special node that is not part of the flow graph. + """ + + @property + @abstractmethod + def shape(self) -> str: + """ + The shape of the node. + """ + + +@inheritdoc(match="[see superclass]") +class _StartNode(_SpecialNode, Producer[Never]): + """ + An input producer that produces a single product. 
+    """
+
+    @property
+    def name(self) -> str:
+        """[see superclass]"""
+        # A greater than symbol evokes 'start'
+        return ">"
+
+    @property
+    def shape(self) -> str:
+        """[see superclass]"""
+        return "circle"
+
+    def iter(self) -> Iterator[Never]:  # pragma: no cover
+        """
+        Yield nothing.
+        """
+        yield from ()
+
+
+@inheritdoc(match="[see superclass]")
+class _EndNode(_SpecialNode, Consumer[Any, Never]):
+    """
+    An output consumer that consumes a single product.
+    """
+
+    @property
+    def name(self) -> str:
+        """[see superclass]"""
+        # Two vertical bars make this node visually distinct from other nodes;
+        # the symbol evokes 'pause' or 'stop'
+        return "||"
+
+    @property
+    def shape(self) -> str:
+        """[see superclass]"""
+        return "doublecircle"
+
+    def consume(self, products: Iterable[Any]) -> Never:  # pragma: no cover
+        """
+        Consume anything, return nothing.
+
+        :param products: the products to consume
+        :return: never returns
+        :raises NotImplementedError: always
+        """
+        raise NotImplementedError("Output consumer cannot consume any products")
+
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/viz/_timeline.html b/docs/_modules/fluxus/viz/_timeline.html new file mode 100644 index 0000000..00a83a3 --- /dev/null +++ b/docs/_modules/fluxus/viz/_timeline.html @@ -0,0 +1,699 @@ + + + + + + + + + + fluxus.viz._timeline — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.viz._timeline

+"""
+Implementation of the Timeline drawer.
+"""
+
+from __future__ import annotations
+
+import logging
+import math
+from collections.abc import Iterable
+from typing import Any
+
+from pytools.api import inheritdoc
+from pytools.text import format_table
+from pytools.viz import Drawer, MatplotStyle, TextStyle
+from pytools.viz.util import FittedText
+
+from ..functional import RunResult
+from ..functional.product import DictProduct
+from .base import TimelineStyle
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "TimelineDrawer",
+    "TimelineTextStyle",
+    "TimelineMatplotStyle",
+]
+
+
+
+[docs] +@inheritdoc(match="""[see superclass]""") +class TimelineDrawer(Drawer[RunResult, TimelineStyle]): + """ + A drawer for rendering timelines. + """ + +
+[docs] + @classmethod + def get_style_classes(cls) -> Iterable[type[TimelineStyle]]: + """[see superclass]""" + return [TimelineTextStyle, TimelineMatplotStyle]
+ + +
+[docs] + @classmethod + def get_default_style(cls) -> TimelineStyle: + """[see superclass]""" + return TimelineTextStyle()
+ + +
+[docs]
+    def _draw(self, data: RunResult) -> None:
+        outputs: list[
+            # a list of individual flow outputs
+            list[
+                # a list of steps in a single flow output
+                tuple[
+                    # a single step in a flow output
+                    int,  # the path index
+                    str,  # the step name
+                    float,  # the start time
+                    float,  # the end time
+                ]
+            ]
+        ] = [
+            [
+                (
+                    # the path index
+                    path_index,
+                    # the step name
+                    step,
+                    # the start time
+                    step_result[DictProduct.KEY_START_TIME],
+                    # the end time
+                    step_result[DictProduct.KEY_END_TIME],
+                )
+                for step, step_result in output.items()
+                # include only steps that report both a start and an end time
+                if DictProduct.KEY_START_TIME in step_result
+                and DictProduct.KEY_END_TIME in step_result
+            ]
+            for path_index, output in (
+                (path_index, output)
+                for path_index, path_outputs in enumerate(data.get_outputs_per_path())
+                for output in path_outputs
+            )
+        ]
+
+        style = self.style
+
+        # the index of the flow output
+        output_index: int
+        # a list of steps leading up to a single output
+        execution: list[tuple[int, str, float, float]]
+
+        for output_index, execution in enumerate(outputs):
+            style.render_timeline(output_index, execution)
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class TimelineTextStyle(TimelineStyle, TextStyle): + """ + A style for rendering timelines as plain text. + """ + + #: The header of the timeline table + TABLE_HEADER = ["Output", "Path", "Step", "Start", "End"] + + #: The format of the columns in the timeline table + COLUMN_FORMATS = ["d", "d", "s", "g", "g"] + + #: The alignment of the columns in the timeline table + COLUMN_ALIGNMENT = [">", ">", "<", ">", ">"] + + #: The rows of the timeline table + _table_rows: list[tuple[int, int, str, float, float]] + +
+[docs] + def start_drawing(self, *, title: str, **kwargs: Any) -> None: + """[see superclass]""" + super().start_drawing(title=title, **kwargs) + self._table_rows = []
+ + +
+[docs] + def render_timeline( + self, output_index: int, timeline: Iterable[tuple[int, str, float, float]] + ) -> None: + """[see superclass]""" + self._table_rows.extend( + (output_index, path_index, step, start, end) + for path_index, step, start, end in timeline + )
+ + +
+[docs] + def finalize_drawing(self, **kwargs: Any) -> None: + """[see superclass]""" + self.out.write( + format_table( + headings=self.TABLE_HEADER, + data=self._table_rows, + formats=self.COLUMN_FORMATS, + alignment=self.COLUMN_ALIGNMENT, + ) + ) + del self._table_rows + super().finalize_drawing(**kwargs)
+
+ + + +
+[docs] +@inheritdoc(match="[see superclass]") +class TimelineMatplotStyle(TimelineStyle, MatplotStyle): # pragma: no cover + """ + A style for rendering timelines as a horizontal bar chart in a Matplotlib plot. + """ + + # The minimum time in the timeline + _min_time: float + # The maximum time in the timeline + _max_time: float + # The maximum output index + _max_output_index: int + +
+[docs] + def start_drawing(self, *, title: str, **kwargs: Any) -> None: + """[see superclass]""" + super().start_drawing(title=title, **kwargs) + self._min_time = math.inf + self._max_time = -math.inf + self._max_output_index = 0
+ + +
+[docs] + def render_timeline( + self, output_index: int, timeline: Iterable[tuple[int, str, float, float]] + ) -> None: + """[see superclass]""" + + start_time_all: tuple[float, ...] + end_time_all: tuple[float, ...] + start_time_all, end_time_all, step_all = zip( + *( + (start_time, end_time, step) + for _, step, start_time, end_time in timeline + ) + ) + + self._min_time = min(self._min_time, *start_time_all) + self._max_time = max(self._max_time, *end_time_all) + self._max_output_index = max(self._max_output_index, output_index) + + colors = self.colors + ax = self.ax + + # Calculate the coordinates of the bars + n = len(start_time_all) + y = -output_index + x_all = start_time_all + width_all = [ + end_time - start_time + for start_time, end_time in zip(start_time_all, end_time_all) + ] + color_all = [ + colors.accent_1 if i % 2 == 0 else colors.accent_2 for i in range(n) + ] + + # Draw the bars + ax.barh( + y=y, + width=width_all, + left=x_all, + height=0.8, + # fill color is accent 1 for even steps, accent 2 for odd steps + color=color_all, + # black edge color, so that the bars are visually separated + edgecolor=colors.background, + ) + + # Draw the text + for x, width, color, step in zip(x_all, width_all, color_all, step_all): + ax.add_artist( + FittedText( + x=x + width / 2, + y=y, + width=width, + height=0.8, + text=step, + ha="center", + va="center", + color=colors.contrast_color(color), + ) + )
+ + +
+[docs] + def finalize_drawing(self, **kwargs: Any) -> None: + """[see superclass]""" + + ax = self.ax + + # Set the x axis limits to the minimum and maximum times + if self._min_time != math.inf and self._max_time != -math.inf: + ax.set_xlim(self._min_time, self._max_time) + + # Set axis labels + ax.set_xlabel("Seconds") + ax.set_ylabel("Output") + + # Set the y-axis ticks to the output indices, in descending order + ax.set_yticks(list(range(-self._max_output_index, 1))) + ax.set_yticklabels(list(map(str, reversed(range(self._max_output_index + 1))))) + + super().finalize_drawing(**kwargs)
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/fluxus/viz/base/_style.html b/docs/_modules/fluxus/viz/base/_style.html new file mode 100644 index 0000000..52beccc --- /dev/null +++ b/docs/_modules/fluxus/viz/base/_style.html @@ -0,0 +1,497 @@ + + + + + + + + + + fluxus.viz.base._style — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + +
+
+ +
+
+ +
+ +
+ + + + +
+ +
+ + +
+
+ + + + + +
+ +

Source code for fluxus.viz.base._style

+"""
+Implementation of visualization base classes.
+"""
+
+from __future__ import annotations
+
+import logging
+from abc import ABCMeta, abstractmethod
+from collections.abc import Iterable
+from typing import Any
+
+from pytools.viz import DrawingStyle
+
+from ...core import Conduit
+
+log = logging.getLogger(__name__)
+
+__all__ = [
+    "FlowStyle",
+    "TimelineStyle",
+]
+
+
+#
+# Classes
+#
+
+
+
+[docs] +class FlowStyle(DrawingStyle, metaclass=ABCMeta): + """ + A style for rendering flows. + """ + +
+[docs] + @abstractmethod + def render_flow(self, flow: Conduit[Any]) -> None: + """ + Render the given flow. + + :param flow: the flow to render + """
+
+ + + +
+[docs] +class TimelineStyle(DrawingStyle, metaclass=ABCMeta): + """ + A style for rendering timelines. + """ + +
+[docs]
+    @abstractmethod
+    def render_timeline(
+        self, output_index: int, timeline: Iterable[tuple[int, str, float, float]]
+    ) -> None:
+        """
+        Render the timeline of steps leading up to a single flow output.
+
+        :param output_index: the index of the output
+        :param timeline: the timeline to render, as an iterable of quadruples
+            specifying the path index, the step name, the start time, and the end time
+        """
+
+ +
+ +
+ + + + + +
+ + + + +
+ + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_modules/index.html b/docs/_modules/index.html new file mode 100644 index 0000000..a041b80 --- /dev/null +++ b/docs/_modules/index.html @@ -0,0 +1,451 @@ + + + + + + + + + + Overview: module code — fluxus documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+ + + + + + + + + + +
+
+
+
+
+ +
+ +
+ + + + + +
+
+ + + + + +
+ + + + + + + + + + + +
+ +
+ + + + + +
+
+
+ + + + + + + + \ No newline at end of file diff --git a/docs/_sources/_generated/home.rst.txt b/docs/_sources/_generated/home.rst.txt new file mode 100644 index 0000000..964e820 --- /dev/null +++ b/docs/_sources/_generated/home.rst.txt @@ -0,0 +1,274 @@ +.. _home: + +Home +==== + +**FLUXUS** is a Python framework designed by `BCG X `_ to +streamline the development of complex data processing pipelines (called *flows*), +enabling users to quickly and efficiently build, test, and deploy data workflows, +making complex operations more manageable. + +Introducing Flows +----------------- + +A flow in *fluxus* represents a Directed Acyclic Graph (DAG) where each node performs +a specific operation on the data. These nodes, called *conduits*, are the building +blocks of a flow, and the data elements that move through the flow are referred to as +*products*. The conduits are connected to ensure that *products* are processed and +transferred correctly from one stage to another. + +Within a *fluxus* flow, there are three main types of conduits: + +- **Producers**: These conduits generate or gather raw data from various sources such as + databases, APIs, or sensors. They are the entry points of the flow, feeding initial + *products* into the system. +- **Transformers**: These conduits take the *products* from producers and transform + them. This can involve filtering, aggregating, enriching, or changing the data to fit + the required output or format. +- **Consumers**: Consumers represent the endpoints of the flow. Each flow has exactly + one consumer, which handles the final processed *products*. The consumer may store the + data, display it in a user interface, or send it to another system. + + +A Simple Example +---------------- + +Consider a simple flow that takes a greeting message, converts it to different cases +(uppercase, lowercase), and then annotates each message with the case change that +has been applied. The flow looks like this: + +.. image:: /_images/flow-hello-world.svg + :alt: "Hello World" flow diagram + :width: 600px + + +With *fluxus*, we can define this flow as follows: + +.. code-block:: python + + from fluxus.functional import step, passthrough, run + + input_data = [ + dict(greeting="Hello, World!"), + dict(greeting="Bonjour!"), + ] + + def lower(greeting: str) -> dict[str, str]: + # Convert the greeting to lowercase and keep track of the case change + return dict( + greeting=greeting.lower(), + case="lower", + ) + + def upper(greeting: str) -> dict[str, str]: + # Convert the greeting to uppercase and keep track of the case change + return dict( + greeting=greeting.upper(), + tone="upper", + ) + + def annotate(greeting: str, case: str = "original") -> dict[str, str]: + # Annotate the greeting with the case change; default to "original" + return dict(greeting=f"{greeting!r} ({case})") + + flow = ( + step("input", input_data) # initial producer step + >> ( # 3 parallel steps: upper, lower, and passthrough + step("lower", lower) + & step("upper", upper) + & passthrough() # passthrough the original input data + ) + >> step("annotate", annotate) # annotate all outputs + ) + + # Draw the flow diagram + flow.draw() + +Note the ``passthrough()`` step in the flow. This step is a special type of conduit that +simply passes the input data along without modification. This is useful when you want to +run multiple transformations in parallel but still want to preserve the original data +for further processing. 
+
+You may have noted that the above code does not define a final consumer step. This is
+because the ``run`` function automatically adds a consumer step to the end of the flow
+to collect the final output. Custom consumers come into play when you start building
+more customised flows using the object-oriented API instead of the simpler functional
+API we are using here.
+
+We run the flow with
+
+.. code-block:: python
+
+    result = run(flow)
+
+This gives us the following output in :code:`result`:
+
+.. code-block:: python
+
+    RunResult(
+        [
+            {
+                'input': {'greeting': 'Hello, World!'},
+                'lower': {'greeting': 'hello, world!', 'case': 'lower'},
+                'annotate': {'greeting': "'hello, world!' (lower)"}
+            },
+            {
+                'input': {'greeting': 'Bonjour!'},
+                'lower': {'greeting': 'bonjour!', 'case': 'lower'},
+                'annotate': {'greeting': "'bonjour!' (lower)"}
+            }
+        ],
+        [
+            {
+                'input': {'greeting': 'Hello, World!'},
+                'upper': {'greeting': 'HELLO, WORLD!', 'tone': 'upper'},
+                'annotate': {'greeting': "'HELLO, WORLD!' (original)"}
+            },
+            {
+                'input': {'greeting': 'Bonjour!'},
+                'upper': {'greeting': 'BONJOUR!', 'tone': 'upper'},
+                'annotate': {'greeting': "'BONJOUR!' (original)"}
+            }
+        ],
+        [
+            {
+                'input': {'greeting': 'Hello, World!'},
+                'annotate': {'greeting': "'Hello, World!' (original)"}
+            },
+            {
+                'input': {'greeting': 'Bonjour!'},
+                'annotate': {'greeting': "'Bonjour!' (original)"}
+            }
+        ]
+    )
+
+
+Here's what happened: Each of the two input data items is passed along three parallel
+paths. Each path applies different transformations to the data. The flow then collects
+the results of these transformations into a single result object, the
+:code:`RunResult`.
+
+Note that the result contains six outputs—one for each of the two input data items along
+each of the three paths through the flow. Also note that the results are grouped as
+separate lists, one for each path; a short code sketch at the end of this section shows
+one way to iterate over these groups.
+
+The run result not only gives us the final product of the ``annotate`` step but also the
+inputs and intermediate products of the ``lower`` and ``upper`` steps. We refer to this
+extended view of the flow results as the *lineage* of the flow.
+
+For a more thorough introduction to FLUXUS, please visit our `User Guide <#>`_ and
+`Examples <#>`_!
+
+
+Why *fluxus*?
+-------------
+
+The complexity of data processing tasks demands tools that streamline operations and
+ensure efficiency. *fluxus* addresses these needs by offering a structured approach to
+creating flows that handle various data sources and processing requirements. Key
+motivations for using *fluxus* include:
+
+- **Organisation and Structure**: *fluxus* offers a clear, structured approach to data
+  processing, breaking down complex operations into manageable steps.
+- **Maintainability**: Its modular design allows individual components to be developed,
+  tested, and debugged independently, simplifying maintenance and updates.
+- **Reusability**: Components in *fluxus* can be reused across different projects,
+  reducing development time and effort.
+- **Efficiency**: By supporting concurrent processing, *fluxus* ensures optimal use of
+  system resources, speeding up data processing tasks.
+- **Ease of Use**: *fluxus* provides a functional API that abstracts away the
+  complexities of data processing, making it accessible to developers of all levels.
+  More experienced users can also leverage the advanced features of its underlying
+  object-oriented implementation for customisation and optimisation (see
+  `Advanced Features <#>`_ for more details).
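+
+Returning to the grouped run results from the example above, the sketch below shows one
+way to iterate over them programmatically. It is a minimal sketch, not a definitive
+recipe: it assumes that ``RunResult.get_outputs_per_path()``, the accessor used
+internally by the timeline drawer in ``fluxus.viz``, is available as a public method,
+and that each output maps step names to the dictionaries produced by those steps, as in
+the printed ``RunResult`` above.
+
+.. code-block:: python
+
+    result = run(flow)
+
+    # get_outputs_per_path() is assumed here; it is the accessor the fluxus.viz
+    # timeline drawer uses to group outputs by their path through the flow
+    for path_index, path_outputs in enumerate(result.get_outputs_per_path()):
+        for output in path_outputs:
+            # each output maps step names to the dict returned by that step,
+            # so the final annotated greeting sits under the "annotate" step
+            print(path_index, output["annotate"]["greeting"])
+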
+
+
+
+Concurrent Processing in *fluxus*
+---------------------------------
+
+A standout feature of *fluxus* is its support for concurrent processing, allowing
+multiple operations to run simultaneously. This is essential for:
+
+- **Performance**: Significantly reducing data processing time by executing multiple
+  data streams or tasks in parallel.
+- **Resource Utilisation**: Maximising the use of system resources by distributing the
+  processing load across multiple processes or threads.
+
+*fluxus* leverages Python techniques such as threading and asynchronous programming to
+achieve concurrent processing.
+
+By harnessing the capabilities of *fluxus*, developers can build efficient, scalable,
+and maintainable data processing systems that meet the demands of contemporary
+applications.
+
+Getting started
+===============
+
+- See the `FLUXUS Documentation <#>`_ for a comprehensive User Guide, Examples,
+  API reference, and more.
+- See `Contributing `_ or visit our detailed `Contributor Guide <#>`_
+  for information on contributing.
+- We have an `FAQ <#>`_ for common questions. For anything else, please reach out to
+  ARTKIT@bcg.com.
+
+
+User Installation
+-----------------
+
+Install using ``pip``:
+
+.. code-block:: bash
+
+    pip install fluxus
+
+or ``conda``:
+
+.. code-block:: bash
+
+    conda install -c bcgx fluxus
+
+
+Optional dependencies
+^^^^^^^^^^^^^^^^^^^^^
+
+To enable visualizations of flow diagrams, install `GraphViz `_
+and ensure it is in your system's PATH variable:
+
+- For MacOS and Linux users, the instructions provided on `GraphViz Downloads `_ automatically add GraphViz to your PATH.
+- Windows users may need to manually add GraphViz to their PATH (see `Simplified Windows installation procedure `_).
+- Run ``dot -V`` in Terminal or Command Prompt to verify installation.
+
+
+Environment Setup
+-----------------
+
+Virtual environment
+^^^^^^^^^^^^^^^^^^^
+
+We recommend working in a dedicated environment, e.g., using ``venv``:
+
+.. code-block:: bash
+
+    python -m venv fluxus
+    source fluxus/bin/activate
+
+or ``conda``:
+
+.. code-block:: bash
+
+    conda env create -f environment.yml
+    conda activate fluxus
+
+
+Contributing
+------------
+
+Contributions to *fluxus* are welcome and appreciated! Please see the `Contributing `_ section for information.
+
+
+License
+-------
+
+This project is under the Apache License 2.0, allowing free use, modification, and distribution with added protections against patent litigation.
+See the `LICENSE `_ file for more details or visit `Apache 2.0 `_.
diff --git a/docs/_sources/_generated/release_notes.rst.txt b/docs/_sources/_generated/release_notes.rst.txt
new file mode 100644
index 0000000..42d5762
--- /dev/null
+++ b/docs/_sources/_generated/release_notes.rst.txt
@@ -0,0 +1,12 @@
+.. _release-notes:
+
+Release Notes
+=============
+
+*fluxus* 1.0
+------------
+
+*fluxus* 1.0.0
+~~~~~~~~~~~~~~
+
+- Initial release of *fluxus*.
\ No newline at end of file
diff --git a/docs/_sources/api_landing.rst.txt b/docs/_sources/api_landing.rst.txt
new file mode 100644
index 0000000..93a23f1
--- /dev/null
+++ b/docs/_sources/api_landing.rst.txt
@@ -0,0 +1 @@
+Please see the :ref:`release notes ` for recent API updates and bug fixes.
\ No newline at end of file
diff --git a/docs/_sources/apidoc/fluxus.rst.txt b/docs/_sources/apidoc/fluxus.rst.txt
new file mode 100644
index 0000000..5ad6f19
--- /dev/null
+++ b/docs/_sources/apidoc/fluxus.rst.txt
@@ -0,0 +1,79 @@
+.. _api:
+
+API Reference
+=============
+
+.. 
toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus + :no-imported-members: + + + .. include:: ../api_landing.rst + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: fluxus + :template: custom-class-template.rst + :nosignatures: + + AsyncConsumer + AsyncProducer + AsyncTransformer + Consumer + Flow + Passthrough + Producer + Transformer + + + + + + + + + + + + + + ========== + Exceptions + ========== + + .. autosummary:: + :toctree: fluxus + + FlowWarning + + + + + + +========== +Submodules +========== +.. autosummary:: + :toctree: fluxus + :template: custom-module-template.rst + :recursive: + + fluxus.core + fluxus.functional + fluxus.lineage + fluxus.simple + fluxus.util + fluxus.viz + diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.AtomicConduit.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.AtomicConduit.rst.txt new file mode 100644 index 0000000..569f522 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.AtomicConduit.rst.txt @@ -0,0 +1,45 @@ +fluxus.core.AtomicConduit +========================= + +.. currentmodule:: fluxus.core + +.. autoclass:: AtomicConduit + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~AtomicConduit.aiter_concurrent_conduits + ~AtomicConduit.draw + ~AtomicConduit.get_connections + ~AtomicConduit.get_final_conduits + ~AtomicConduit.get_repr_attributes + ~AtomicConduit.iter_concurrent_conduits + ~AtomicConduit.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~AtomicConduit.chained_conduits + ~AtomicConduit.final_conduit + ~AtomicConduit.is_atomic + ~AtomicConduit.is_chained + ~AtomicConduit.is_concurrent + ~AtomicConduit.n_concurrent_conduits + ~AtomicConduit.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.ConcurrentConduit.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.ConcurrentConduit.rst.txt new file mode 100644 index 0000000..5f8a764 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.ConcurrentConduit.rst.txt @@ -0,0 +1,43 @@ +fluxus.core.ConcurrentConduit +============================= + +.. currentmodule:: fluxus.core + +.. autoclass:: ConcurrentConduit + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~ConcurrentConduit.aiter_concurrent_conduits + ~ConcurrentConduit.draw + ~ConcurrentConduit.get_connections + ~ConcurrentConduit.get_final_conduits + ~ConcurrentConduit.iter_concurrent_conduits + ~ConcurrentConduit.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~ConcurrentConduit.final_conduit + ~ConcurrentConduit.is_atomic + ~ConcurrentConduit.is_chained + ~ConcurrentConduit.is_concurrent + ~ConcurrentConduit.n_concurrent_conduits + ~ConcurrentConduit.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.Conduit.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.Conduit.rst.txt new file mode 100644 index 0000000..f11ae64 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.Conduit.rst.txt @@ -0,0 +1,43 @@ +fluxus.core.Conduit +=================== + +.. currentmodule:: fluxus.core + +.. autoclass:: Conduit + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. 
rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Conduit.aiter_concurrent_conduits + ~Conduit.draw + ~Conduit.get_connections + ~Conduit.get_final_conduits + ~Conduit.iter_concurrent_conduits + ~Conduit.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Conduit.final_conduit + ~Conduit.is_atomic + ~Conduit.is_chained + ~Conduit.is_concurrent + ~Conduit.n_concurrent_conduits + ~Conduit.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.Processor.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.Processor.rst.txt new file mode 100644 index 0000000..046ef1c --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.Processor.rst.txt @@ -0,0 +1,47 @@ +fluxus.core.Processor +===================== + +.. currentmodule:: fluxus.core + +.. autoclass:: Processor + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Processor.aiter_concurrent_conduits + ~Processor.aprocess + ~Processor.draw + ~Processor.get_connections + ~Processor.get_final_conduits + ~Processor.is_valid_source + ~Processor.iter_concurrent_conduits + ~Processor.process + ~Processor.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Processor.final_conduit + ~Processor.input_type + ~Processor.is_atomic + ~Processor.is_chained + ~Processor.is_concurrent + ~Processor.n_concurrent_conduits + ~Processor.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialConduit.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialConduit.rst.txt new file mode 100644 index 0000000..eb75ec6 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialConduit.rst.txt @@ -0,0 +1,45 @@ +fluxus.core.SerialConduit +========================= + +.. currentmodule:: fluxus.core + +.. autoclass:: SerialConduit + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SerialConduit.aiter_concurrent_conduits + ~SerialConduit.draw + ~SerialConduit.get_connections + ~SerialConduit.get_final_conduits + ~SerialConduit.get_repr_attributes + ~SerialConduit.iter_concurrent_conduits + ~SerialConduit.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SerialConduit.chained_conduits + ~SerialConduit.final_conduit + ~SerialConduit.is_atomic + ~SerialConduit.is_chained + ~SerialConduit.is_concurrent + ~SerialConduit.n_concurrent_conduits + ~SerialConduit.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialProcessor.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialProcessor.rst.txt new file mode 100644 index 0000000..2dfccec --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialProcessor.rst.txt @@ -0,0 +1,49 @@ +fluxus.core.SerialProcessor +=========================== + +.. currentmodule:: fluxus.core + +.. autoclass:: SerialProcessor + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. 
autosummary:: + :nosignatures: + + ~SerialProcessor.aiter_concurrent_conduits + ~SerialProcessor.aprocess + ~SerialProcessor.draw + ~SerialProcessor.get_connections + ~SerialProcessor.get_final_conduits + ~SerialProcessor.get_repr_attributes + ~SerialProcessor.is_valid_source + ~SerialProcessor.iter_concurrent_conduits + ~SerialProcessor.process + ~SerialProcessor.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SerialProcessor.chained_conduits + ~SerialProcessor.final_conduit + ~SerialProcessor.input_type + ~SerialProcessor.is_atomic + ~SerialProcessor.is_chained + ~SerialProcessor.is_concurrent + ~SerialProcessor.n_concurrent_conduits + ~SerialProcessor.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialSource.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialSource.rst.txt new file mode 100644 index 0000000..08dfd1c --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.SerialSource.rst.txt @@ -0,0 +1,46 @@ +fluxus.core.SerialSource +======================== + +.. currentmodule:: fluxus.core + +.. autoclass:: SerialSource + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SerialSource.aiter_concurrent_conduits + ~SerialSource.draw + ~SerialSource.get_connections + ~SerialSource.get_final_conduits + ~SerialSource.get_repr_attributes + ~SerialSource.iter_concurrent_conduits + ~SerialSource.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SerialSource.chained_conduits + ~SerialSource.final_conduit + ~SerialSource.is_atomic + ~SerialSource.is_chained + ~SerialSource.is_concurrent + ~SerialSource.n_concurrent_conduits + ~SerialSource.name + ~SerialSource.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.Source.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.Source.rst.txt new file mode 100644 index 0000000..38ce55e --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.Source.rst.txt @@ -0,0 +1,44 @@ +fluxus.core.Source +================== + +.. currentmodule:: fluxus.core + +.. autoclass:: Source + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Source.aiter_concurrent_conduits + ~Source.draw + ~Source.get_connections + ~Source.get_final_conduits + ~Source.iter_concurrent_conduits + ~Source.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Source.final_conduit + ~Source.is_atomic + ~Source.is_chained + ~Source.is_concurrent + ~Source.n_concurrent_conduits + ~Source.name + ~Source.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.producer.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.producer.rst.txt new file mode 100644 index 0000000..979e02f --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.producer.rst.txt @@ -0,0 +1,46 @@ +fluxus.core.producer +==================== + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.core.producer + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. 
autosummary:: + :toctree: producer + :template: custom-class-template.rst + :nosignatures: + + BaseProducer + ConcurrentProducer + SerialProducer + SimpleConcurrentProducer + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/core/fluxus.core.transformer.rst.txt b/docs/_sources/apidoc/fluxus/core/fluxus.core.transformer.rst.txt new file mode 100644 index 0000000..8c00a63 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/fluxus.core.transformer.rst.txt @@ -0,0 +1,46 @@ +fluxus.core.transformer +======================= + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.core.transformer + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: transformer + :template: custom-class-template.rst + :nosignatures: + + BaseTransformer + ConcurrentTransformer + SerialTransformer + SimpleConcurrentTransformer + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.BaseProducer.rst.txt b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.BaseProducer.rst.txt new file mode 100644 index 0000000..d3b423d --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.BaseProducer.rst.txt @@ -0,0 +1,46 @@ +fluxus.core.producer.BaseProducer +================================= + +.. currentmodule:: fluxus.core.producer + +.. autoclass:: BaseProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~BaseProducer.aiter + ~BaseProducer.aiter_concurrent_conduits + ~BaseProducer.draw + ~BaseProducer.get_connections + ~BaseProducer.get_final_conduits + ~BaseProducer.iter + ~BaseProducer.iter_concurrent_conduits + ~BaseProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~BaseProducer.final_conduit + ~BaseProducer.is_atomic + ~BaseProducer.is_chained + ~BaseProducer.is_concurrent + ~BaseProducer.n_concurrent_conduits + ~BaseProducer.name + ~BaseProducer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.ConcurrentProducer.rst.txt b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.ConcurrentProducer.rst.txt new file mode 100644 index 0000000..51ae3a3 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.ConcurrentProducer.rst.txt @@ -0,0 +1,46 @@ +fluxus.core.producer.ConcurrentProducer +======================================= + +.. currentmodule:: fluxus.core.producer + +.. autoclass:: ConcurrentProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~ConcurrentProducer.aiter + ~ConcurrentProducer.aiter_concurrent_conduits + ~ConcurrentProducer.draw + ~ConcurrentProducer.get_connections + ~ConcurrentProducer.get_final_conduits + ~ConcurrentProducer.iter + ~ConcurrentProducer.iter_concurrent_conduits + ~ConcurrentProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~ConcurrentProducer.final_conduit + ~ConcurrentProducer.is_atomic + ~ConcurrentProducer.is_chained + ~ConcurrentProducer.is_concurrent + ~ConcurrentProducer.n_concurrent_conduits + ~ConcurrentProducer.name + ~ConcurrentProducer.product_type + + + + .. 
rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.SerialProducer.rst.txt b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.SerialProducer.rst.txt new file mode 100644 index 0000000..771fa34 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.SerialProducer.rst.txt @@ -0,0 +1,48 @@ +fluxus.core.producer.SerialProducer +=================================== + +.. currentmodule:: fluxus.core.producer + +.. autoclass:: SerialProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SerialProducer.aiter + ~SerialProducer.aiter_concurrent_conduits + ~SerialProducer.draw + ~SerialProducer.get_connections + ~SerialProducer.get_final_conduits + ~SerialProducer.get_repr_attributes + ~SerialProducer.iter + ~SerialProducer.iter_concurrent_conduits + ~SerialProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SerialProducer.chained_conduits + ~SerialProducer.final_conduit + ~SerialProducer.is_atomic + ~SerialProducer.is_chained + ~SerialProducer.is_concurrent + ~SerialProducer.n_concurrent_conduits + ~SerialProducer.name + ~SerialProducer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.SimpleConcurrentProducer.rst.txt b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.SimpleConcurrentProducer.rst.txt new file mode 100644 index 0000000..d74ed7c --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/producer/fluxus.core.producer.SimpleConcurrentProducer.rst.txt @@ -0,0 +1,47 @@ +fluxus.core.producer.SimpleConcurrentProducer +============================================= + +.. currentmodule:: fluxus.core.producer + +.. autoclass:: SimpleConcurrentProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SimpleConcurrentProducer.aiter + ~SimpleConcurrentProducer.aiter_concurrent_conduits + ~SimpleConcurrentProducer.draw + ~SimpleConcurrentProducer.get_connections + ~SimpleConcurrentProducer.get_final_conduits + ~SimpleConcurrentProducer.iter + ~SimpleConcurrentProducer.iter_concurrent_conduits + ~SimpleConcurrentProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SimpleConcurrentProducer.final_conduit + ~SimpleConcurrentProducer.is_atomic + ~SimpleConcurrentProducer.is_chained + ~SimpleConcurrentProducer.is_concurrent + ~SimpleConcurrentProducer.n_concurrent_conduits + ~SimpleConcurrentProducer.name + ~SimpleConcurrentProducer.product_type + ~SimpleConcurrentProducer.producers + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.BaseTransformer.rst.txt b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.BaseTransformer.rst.txt new file mode 100644 index 0000000..448b2d0 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.BaseTransformer.rst.txt @@ -0,0 +1,48 @@ +fluxus.core.transformer.BaseTransformer +======================================= + +.. currentmodule:: fluxus.core.transformer + +.. autoclass:: BaseTransformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. 
rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~BaseTransformer.aiter_concurrent_conduits + ~BaseTransformer.aprocess + ~BaseTransformer.draw + ~BaseTransformer.get_connections + ~BaseTransformer.get_final_conduits + ~BaseTransformer.is_valid_source + ~BaseTransformer.iter_concurrent_conduits + ~BaseTransformer.process + ~BaseTransformer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~BaseTransformer.final_conduit + ~BaseTransformer.input_type + ~BaseTransformer.is_atomic + ~BaseTransformer.is_chained + ~BaseTransformer.is_concurrent + ~BaseTransformer.n_concurrent_conduits + ~BaseTransformer.name + ~BaseTransformer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.ConcurrentTransformer.rst.txt b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.ConcurrentTransformer.rst.txt new file mode 100644 index 0000000..e957d30 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.ConcurrentTransformer.rst.txt @@ -0,0 +1,48 @@ +fluxus.core.transformer.ConcurrentTransformer +============================================= + +.. currentmodule:: fluxus.core.transformer + +.. autoclass:: ConcurrentTransformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~ConcurrentTransformer.aiter_concurrent_conduits + ~ConcurrentTransformer.aprocess + ~ConcurrentTransformer.draw + ~ConcurrentTransformer.get_connections + ~ConcurrentTransformer.get_final_conduits + ~ConcurrentTransformer.is_valid_source + ~ConcurrentTransformer.iter_concurrent_conduits + ~ConcurrentTransformer.process + ~ConcurrentTransformer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~ConcurrentTransformer.final_conduit + ~ConcurrentTransformer.input_type + ~ConcurrentTransformer.is_atomic + ~ConcurrentTransformer.is_chained + ~ConcurrentTransformer.is_concurrent + ~ConcurrentTransformer.n_concurrent_conduits + ~ConcurrentTransformer.name + ~ConcurrentTransformer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.SerialTransformer.rst.txt b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.SerialTransformer.rst.txt new file mode 100644 index 0000000..600b704 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.SerialTransformer.rst.txt @@ -0,0 +1,54 @@ +fluxus.core.transformer.SerialTransformer +========================================= + +.. currentmodule:: fluxus.core.transformer + +.. autoclass:: SerialTransformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SerialTransformer.aiter + ~SerialTransformer.aiter_concurrent_conduits + ~SerialTransformer.aprocess + ~SerialTransformer.atransform + ~SerialTransformer.draw + ~SerialTransformer.get_connections + ~SerialTransformer.get_final_conduits + ~SerialTransformer.get_repr_attributes + ~SerialTransformer.is_valid_source + ~SerialTransformer.iter + ~SerialTransformer.iter_concurrent_conduits + ~SerialTransformer.process + ~SerialTransformer.to_expression + ~SerialTransformer.transform + + + + + + .. rubric:: Attribute summary + + .. 
autosummary:: + + ~SerialTransformer.chained_conduits + ~SerialTransformer.final_conduit + ~SerialTransformer.input_type + ~SerialTransformer.is_atomic + ~SerialTransformer.is_chained + ~SerialTransformer.is_concurrent + ~SerialTransformer.n_concurrent_conduits + ~SerialTransformer.name + ~SerialTransformer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.SimpleConcurrentTransformer.rst.txt b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.SimpleConcurrentTransformer.rst.txt new file mode 100644 index 0000000..cb81beb --- /dev/null +++ b/docs/_sources/apidoc/fluxus/core/transformer/fluxus.core.transformer.SimpleConcurrentTransformer.rst.txt @@ -0,0 +1,49 @@ +fluxus.core.transformer.SimpleConcurrentTransformer +=================================================== + +.. currentmodule:: fluxus.core.transformer + +.. autoclass:: SimpleConcurrentTransformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SimpleConcurrentTransformer.aiter_concurrent_conduits + ~SimpleConcurrentTransformer.aprocess + ~SimpleConcurrentTransformer.draw + ~SimpleConcurrentTransformer.get_connections + ~SimpleConcurrentTransformer.get_final_conduits + ~SimpleConcurrentTransformer.is_valid_source + ~SimpleConcurrentTransformer.iter_concurrent_conduits + ~SimpleConcurrentTransformer.process + ~SimpleConcurrentTransformer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SimpleConcurrentTransformer.final_conduit + ~SimpleConcurrentTransformer.input_type + ~SimpleConcurrentTransformer.is_atomic + ~SimpleConcurrentTransformer.is_chained + ~SimpleConcurrentTransformer.is_concurrent + ~SimpleConcurrentTransformer.n_concurrent_conduits + ~SimpleConcurrentTransformer.name + ~SimpleConcurrentTransformer.product_type + ~SimpleConcurrentTransformer.transformers + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.AsyncConsumer.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.AsyncConsumer.rst.txt new file mode 100644 index 0000000..6da6ec5 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.AsyncConsumer.rst.txt @@ -0,0 +1,51 @@ +fluxus.AsyncConsumer +==================== + +.. currentmodule:: fluxus + +.. autoclass:: AsyncConsumer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~AsyncConsumer.aconsume + ~AsyncConsumer.aiter_concurrent_conduits + ~AsyncConsumer.aprocess + ~AsyncConsumer.consume + ~AsyncConsumer.draw + ~AsyncConsumer.get_connections + ~AsyncConsumer.get_final_conduits + ~AsyncConsumer.get_repr_attributes + ~AsyncConsumer.is_valid_source + ~AsyncConsumer.iter_concurrent_conduits + ~AsyncConsumer.process + ~AsyncConsumer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~AsyncConsumer.chained_conduits + ~AsyncConsumer.final_conduit + ~AsyncConsumer.input_type + ~AsyncConsumer.is_atomic + ~AsyncConsumer.is_chained + ~AsyncConsumer.is_concurrent + ~AsyncConsumer.n_concurrent_conduits + ~AsyncConsumer.name + + + + .. 
rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.AsyncProducer.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.AsyncProducer.rst.txt new file mode 100644 index 0000000..c9614bc --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.AsyncProducer.rst.txt @@ -0,0 +1,48 @@ +fluxus.AsyncProducer +==================== + +.. currentmodule:: fluxus + +.. autoclass:: AsyncProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~AsyncProducer.aiter + ~AsyncProducer.aiter_concurrent_conduits + ~AsyncProducer.draw + ~AsyncProducer.get_connections + ~AsyncProducer.get_final_conduits + ~AsyncProducer.get_repr_attributes + ~AsyncProducer.iter + ~AsyncProducer.iter_concurrent_conduits + ~AsyncProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~AsyncProducer.chained_conduits + ~AsyncProducer.final_conduit + ~AsyncProducer.is_atomic + ~AsyncProducer.is_chained + ~AsyncProducer.is_concurrent + ~AsyncProducer.n_concurrent_conduits + ~AsyncProducer.name + ~AsyncProducer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.AsyncTransformer.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.AsyncTransformer.rst.txt new file mode 100644 index 0000000..7e4f9a7 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.AsyncTransformer.rst.txt @@ -0,0 +1,54 @@ +fluxus.AsyncTransformer +======================= + +.. currentmodule:: fluxus + +.. autoclass:: AsyncTransformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~AsyncTransformer.aiter + ~AsyncTransformer.aiter_concurrent_conduits + ~AsyncTransformer.aprocess + ~AsyncTransformer.atransform + ~AsyncTransformer.draw + ~AsyncTransformer.get_connections + ~AsyncTransformer.get_final_conduits + ~AsyncTransformer.get_repr_attributes + ~AsyncTransformer.is_valid_source + ~AsyncTransformer.iter + ~AsyncTransformer.iter_concurrent_conduits + ~AsyncTransformer.process + ~AsyncTransformer.to_expression + ~AsyncTransformer.transform + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~AsyncTransformer.chained_conduits + ~AsyncTransformer.final_conduit + ~AsyncTransformer.input_type + ~AsyncTransformer.is_atomic + ~AsyncTransformer.is_chained + ~AsyncTransformer.is_concurrent + ~AsyncTransformer.n_concurrent_conduits + ~AsyncTransformer.name + ~AsyncTransformer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.Consumer.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.Consumer.rst.txt new file mode 100644 index 0000000..bd9a167 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.Consumer.rst.txt @@ -0,0 +1,51 @@ +fluxus.Consumer +=============== + +.. currentmodule:: fluxus + +.. autoclass:: Consumer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Consumer.aconsume + ~Consumer.aiter_concurrent_conduits + ~Consumer.aprocess + ~Consumer.consume + ~Consumer.draw + ~Consumer.get_connections + ~Consumer.get_final_conduits + ~Consumer.get_repr_attributes + ~Consumer.is_valid_source + ~Consumer.iter_concurrent_conduits + ~Consumer.process + ~Consumer.to_expression + + + + + + .. rubric:: Attribute summary + + .. 
autosummary:: + + ~Consumer.chained_conduits + ~Consumer.final_conduit + ~Consumer.input_type + ~Consumer.is_atomic + ~Consumer.is_chained + ~Consumer.is_concurrent + ~Consumer.n_concurrent_conduits + ~Consumer.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.Flow.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.Flow.rst.txt new file mode 100644 index 0000000..efe5770 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.Flow.rst.txt @@ -0,0 +1,45 @@ +fluxus.Flow +=========== + +.. currentmodule:: fluxus + +.. autoclass:: Flow + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Flow.aiter_concurrent_conduits + ~Flow.arun + ~Flow.draw + ~Flow.get_connections + ~Flow.get_final_conduits + ~Flow.iter_concurrent_conduits + ~Flow.run + ~Flow.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Flow.final_conduit + ~Flow.is_atomic + ~Flow.is_chained + ~Flow.is_concurrent + ~Flow.n_concurrent_conduits + ~Flow.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.FlowWarning.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.FlowWarning.rst.txt new file mode 100644 index 0000000..230cbee --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.FlowWarning.rst.txt @@ -0,0 +1,6 @@ +fluxus.FlowWarning +================== + +.. currentmodule:: fluxus + +.. autoexception:: FlowWarning \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.Passthrough.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.Passthrough.rst.txt new file mode 100644 index 0000000..6963e8d --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.Passthrough.rst.txt @@ -0,0 +1,45 @@ +fluxus.Passthrough +================== + +.. currentmodule:: fluxus + +.. autoclass:: Passthrough + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Passthrough.aiter_concurrent_conduits + ~Passthrough.draw + ~Passthrough.get_connections + ~Passthrough.get_final_conduits + ~Passthrough.get_repr_attributes + ~Passthrough.iter_concurrent_conduits + ~Passthrough.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Passthrough.chained_conduits + ~Passthrough.final_conduit + ~Passthrough.is_atomic + ~Passthrough.is_chained + ~Passthrough.is_concurrent + ~Passthrough.n_concurrent_conduits + ~Passthrough.name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.Producer.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.Producer.rst.txt new file mode 100644 index 0000000..02fd9fc --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.Producer.rst.txt @@ -0,0 +1,48 @@ +fluxus.Producer +=============== + +.. currentmodule:: fluxus + +.. autoclass:: Producer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Producer.aiter + ~Producer.aiter_concurrent_conduits + ~Producer.draw + ~Producer.get_connections + ~Producer.get_final_conduits + ~Producer.get_repr_attributes + ~Producer.iter + ~Producer.iter_concurrent_conduits + ~Producer.to_expression + + + + + + .. rubric:: Attribute summary + + .. 
autosummary:: + + ~Producer.chained_conduits + ~Producer.final_conduit + ~Producer.is_atomic + ~Producer.is_chained + ~Producer.is_concurrent + ~Producer.n_concurrent_conduits + ~Producer.name + ~Producer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.Transformer.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.Transformer.rst.txt new file mode 100644 index 0000000..b095237 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.Transformer.rst.txt @@ -0,0 +1,54 @@ +fluxus.Transformer +================== + +.. currentmodule:: fluxus + +.. autoclass:: Transformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Transformer.aiter + ~Transformer.aiter_concurrent_conduits + ~Transformer.aprocess + ~Transformer.atransform + ~Transformer.draw + ~Transformer.get_connections + ~Transformer.get_final_conduits + ~Transformer.get_repr_attributes + ~Transformer.is_valid_source + ~Transformer.iter + ~Transformer.iter_concurrent_conduits + ~Transformer.process + ~Transformer.to_expression + ~Transformer.transform + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Transformer.chained_conduits + ~Transformer.final_conduit + ~Transformer.input_type + ~Transformer.is_atomic + ~Transformer.is_chained + ~Transformer.is_concurrent + ~Transformer.n_concurrent_conduits + ~Transformer.name + ~Transformer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/fluxus.core.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.core.rst.txt new file mode 100644 index 0000000..d552bad --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.core.rst.txt @@ -0,0 +1,62 @@ +fluxus.core +=========== + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.core + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: core + :template: custom-class-template.rst + :nosignatures: + + AtomicConduit + ConcurrentConduit + Conduit + Processor + SerialConduit + SerialProcessor + SerialSource + Source + + + + + + + + + + + + + + + + + + +========== +Submodules +========== +.. autosummary:: + :toctree: core + :template: custom-module-template.rst + :recursive: + + fluxus.core.producer + fluxus.core.transformer + diff --git a/docs/_sources/apidoc/fluxus/fluxus.functional.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.functional.rst.txt new file mode 100644 index 0000000..d6d0a3b --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.functional.rst.txt @@ -0,0 +1,69 @@ +fluxus.functional +================= + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.functional + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: functional + :template: custom-class-template.rst + :nosignatures: + + RunResult + + + + + + ========= + Functions + ========= + + .. autosummary:: + :toctree: functional + :nosignatures: + + chain + parallel + passthrough + run + step + + + + + + + + + + + + + + +========== +Submodules +========== +.. 
autosummary:: + :toctree: functional + :template: custom-module-template.rst + :recursive: + + fluxus.functional.conduit + fluxus.functional.product + diff --git a/docs/_sources/apidoc/fluxus/fluxus.lineage.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.lineage.rst.txt new file mode 100644 index 0000000..0f919e8 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.lineage.rst.txt @@ -0,0 +1,46 @@ +fluxus.lineage +============== + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.lineage + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: lineage + :template: custom-class-template.rst + :nosignatures: + + HasLineage + LabelingProducer + LabelingTransformer + LineageOrigin + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/fluxus.simple.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.simple.rst.txt new file mode 100644 index 0000000..de3312c --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.simple.rst.txt @@ -0,0 +1,44 @@ +fluxus.simple +============= + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.simple + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: simple + :template: custom-class-template.rst + :nosignatures: + + SimpleAsyncProducer + SimpleProducer + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/fluxus.util.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.util.rst.txt new file mode 100644 index 0000000..b72b16b --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.util.rst.txt @@ -0,0 +1,42 @@ +fluxus.util +=========== + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.util + :no-imported-members: + + + + + + + + + + ========= + Functions + ========= + + .. autosummary:: + :toctree: util + :nosignatures: + + simplify_repr_attributes + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/fluxus.viz.rst.txt b/docs/_sources/apidoc/fluxus/fluxus.viz.rst.txt new file mode 100644 index 0000000..9901543 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/fluxus.viz.rst.txt @@ -0,0 +1,60 @@ +fluxus.viz +========== + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.viz + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: viz + :template: custom-class-template.rst + :nosignatures: + + FlowDrawer + FlowGraph + FlowGraphStyle + FlowTextStyle + TimelineDrawer + TimelineMatplotStyle + TimelineTextStyle + + + + + + + + + + + + + + + + + + +========== +Submodules +========== +.. autosummary:: + :toctree: viz + :template: custom-module-template.rst + :recursive: + + fluxus.viz.base + diff --git a/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.DictConsumer.rst.txt b/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.DictConsumer.rst.txt new file mode 100644 index 0000000..aff0844 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.DictConsumer.rst.txt @@ -0,0 +1,52 @@ +fluxus.functional.conduit.DictConsumer +====================================== + +.. currentmodule:: fluxus.functional.conduit + +.. autoclass:: DictConsumer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. 
autosummary:: + :nosignatures: + + ~DictConsumer.aconsume + ~DictConsumer.aiter_concurrent_conduits + ~DictConsumer.aprocess + ~DictConsumer.consume + ~DictConsumer.draw + ~DictConsumer.get_connections + ~DictConsumer.get_final_conduits + ~DictConsumer.get_repr_attributes + ~DictConsumer.is_valid_source + ~DictConsumer.iter_concurrent_conduits + ~DictConsumer.process + ~DictConsumer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~DictConsumer.chained_conduits + ~DictConsumer.final_conduit + ~DictConsumer.input_type + ~DictConsumer.is_atomic + ~DictConsumer.is_chained + ~DictConsumer.is_concurrent + ~DictConsumer.n_concurrent_conduits + ~DictConsumer.name + ~DictConsumer.timestamps + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.DictProducer.rst.txt b/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.DictProducer.rst.txt new file mode 100644 index 0000000..32776f8 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.DictProducer.rst.txt @@ -0,0 +1,49 @@ +fluxus.functional.conduit.DictProducer +====================================== + +.. currentmodule:: fluxus.functional.conduit + +.. autoclass:: DictProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~DictProducer.aiter + ~DictProducer.aiter_concurrent_conduits + ~DictProducer.draw + ~DictProducer.get_connections + ~DictProducer.get_final_conduits + ~DictProducer.get_repr_attributes + ~DictProducer.iter + ~DictProducer.iter_concurrent_conduits + ~DictProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~DictProducer.chained_conduits + ~DictProducer.final_conduit + ~DictProducer.is_atomic + ~DictProducer.is_chained + ~DictProducer.is_concurrent + ~DictProducer.n_concurrent_conduits + ~DictProducer.name + ~DictProducer.product_type + ~DictProducer.producer + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.Step.rst.txt b/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.Step.rst.txt new file mode 100644 index 0000000..ee2f7db --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/conduit/fluxus.functional.conduit.Step.rst.txt @@ -0,0 +1,56 @@ +fluxus.functional.conduit.Step +============================== + +.. currentmodule:: fluxus.functional.conduit + +.. autoclass:: Step + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~Step.aiter + ~Step.aiter_concurrent_conduits + ~Step.aprocess + ~Step.atransform + ~Step.draw + ~Step.get_connections + ~Step.get_final_conduits + ~Step.get_repr_attributes + ~Step.is_valid_source + ~Step.iter + ~Step.iter_concurrent_conduits + ~Step.process + ~Step.to_expression + ~Step.transform + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~Step.chained_conduits + ~Step.final_conduit + ~Step.function + ~Step.input_type + ~Step.is_atomic + ~Step.is_chained + ~Step.is_concurrent + ~Step.n_concurrent_conduits + ~Step.name + ~Step.product_type + ~Step.kwargs + + + + .. 
rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.RunResult.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.RunResult.rst.txt new file mode 100644 index 0000000..5eeb56f --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.RunResult.rst.txt @@ -0,0 +1,31 @@ +fluxus.functional.RunResult +=========================== + +.. currentmodule:: fluxus.functional + +.. autoclass:: RunResult + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~RunResult.draw_timeline + ~RunResult.get_outputs + ~RunResult.get_outputs_per_path + ~RunResult.to_expression + ~RunResult.to_frame + + + + + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.chain.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.chain.rst.txt new file mode 100644 index 0000000..b6a454d --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.chain.rst.txt @@ -0,0 +1,6 @@ +fluxus.functional.chain +======================= + +.. currentmodule:: fluxus.functional + +.. autofunction:: chain \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.conduit.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.conduit.rst.txt new file mode 100644 index 0000000..205d32f --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.conduit.rst.txt @@ -0,0 +1,45 @@ +fluxus.functional.conduit +========================= + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.functional.conduit + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: conduit + :template: custom-class-template.rst + :nosignatures: + + DictConsumer + DictProducer + Step + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.parallel.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.parallel.rst.txt new file mode 100644 index 0000000..3eaf565 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.parallel.rst.txt @@ -0,0 +1,6 @@ +fluxus.functional.parallel +========================== + +.. currentmodule:: fluxus.functional + +.. autofunction:: parallel \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.passthrough.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.passthrough.rst.txt new file mode 100644 index 0000000..7421080 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.passthrough.rst.txt @@ -0,0 +1,6 @@ +fluxus.functional.passthrough +============================= + +.. currentmodule:: fluxus.functional + +.. autofunction:: passthrough \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.product.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.product.rst.txt new file mode 100644 index 0000000..a0138d4 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.product.rst.txt @@ -0,0 +1,43 @@ +fluxus.functional.product +========================= + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.functional.product + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. 
autosummary:: + :toctree: product + :template: custom-class-template.rst + :nosignatures: + + DictProduct + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.run.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.run.rst.txt new file mode 100644 index 0000000..867bb01 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.run.rst.txt @@ -0,0 +1,6 @@ +fluxus.functional.run +===================== + +.. currentmodule:: fluxus.functional + +.. autofunction:: run \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/fluxus.functional.step.rst.txt b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.step.rst.txt new file mode 100644 index 0000000..9fa8cc5 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/fluxus.functional.step.rst.txt @@ -0,0 +1,6 @@ +fluxus.functional.step +====================== + +.. currentmodule:: fluxus.functional + +.. autofunction:: step \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/functional/product/fluxus.functional.product.DictProduct.rst.txt b/docs/_sources/apidoc/fluxus/functional/product/fluxus.functional.product.DictProduct.rst.txt new file mode 100644 index 0000000..e0c09f9 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/functional/product/fluxus.functional.product.DictProduct.rst.txt @@ -0,0 +1,51 @@ +fluxus.functional.product.DictProduct +===================================== + +.. currentmodule:: fluxus.functional.product + +.. autoclass:: DictProduct + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~DictProduct.from_dict + ~DictProduct.get + ~DictProduct.get_lineage + ~DictProduct.get_lineage_attributes + ~DictProduct.items + ~DictProduct.keys + ~DictProduct.label + ~DictProduct.to_dict + ~DictProduct.to_expression + ~DictProduct.values + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~DictProduct.KEY_END_TIME + ~DictProduct.KEY_START_TIME + ~DictProduct.precursor + ~DictProduct.product_attributes + ~DictProduct.product_labels + ~DictProduct.product_name + ~DictProduct.name + ~DictProduct.attributes + ~DictProduct.start_time + ~DictProduct.end_time + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.HasLineage.rst.txt b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.HasLineage.rst.txt new file mode 100644 index 0000000..cbda64a --- /dev/null +++ b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.HasLineage.rst.txt @@ -0,0 +1,38 @@ +fluxus.lineage.HasLineage +========================= + +.. currentmodule:: fluxus.lineage + +.. autoclass:: HasLineage + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~HasLineage.get_lineage + ~HasLineage.get_lineage_attributes + ~HasLineage.label + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~HasLineage.precursor + ~HasLineage.product_attributes + ~HasLineage.product_labels + ~HasLineage.product_name + + + + .. 
rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LabelingProducer.rst.txt b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LabelingProducer.rst.txt new file mode 100644 index 0000000..cb82c91 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LabelingProducer.rst.txt @@ -0,0 +1,49 @@ +fluxus.lineage.LabelingProducer +=============================== + +.. currentmodule:: fluxus.lineage + +.. autoclass:: LabelingProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~LabelingProducer.aiter + ~LabelingProducer.aiter_concurrent_conduits + ~LabelingProducer.draw + ~LabelingProducer.get_connections + ~LabelingProducer.get_final_conduits + ~LabelingProducer.get_repr_attributes + ~LabelingProducer.iter + ~LabelingProducer.iter_concurrent_conduits + ~LabelingProducer.label + ~LabelingProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~LabelingProducer.chained_conduits + ~LabelingProducer.final_conduit + ~LabelingProducer.is_atomic + ~LabelingProducer.is_chained + ~LabelingProducer.is_concurrent + ~LabelingProducer.n_concurrent_conduits + ~LabelingProducer.name + ~LabelingProducer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LabelingTransformer.rst.txt b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LabelingTransformer.rst.txt new file mode 100644 index 0000000..3f5ffab --- /dev/null +++ b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LabelingTransformer.rst.txt @@ -0,0 +1,55 @@ +fluxus.lineage.LabelingTransformer +================================== + +.. currentmodule:: fluxus.lineage + +.. autoclass:: LabelingTransformer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~LabelingTransformer.aiter + ~LabelingTransformer.aiter_concurrent_conduits + ~LabelingTransformer.aprocess + ~LabelingTransformer.atransform + ~LabelingTransformer.draw + ~LabelingTransformer.get_connections + ~LabelingTransformer.get_final_conduits + ~LabelingTransformer.get_repr_attributes + ~LabelingTransformer.is_valid_source + ~LabelingTransformer.iter + ~LabelingTransformer.iter_concurrent_conduits + ~LabelingTransformer.label + ~LabelingTransformer.process + ~LabelingTransformer.to_expression + ~LabelingTransformer.transform + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~LabelingTransformer.chained_conduits + ~LabelingTransformer.final_conduit + ~LabelingTransformer.input_type + ~LabelingTransformer.is_atomic + ~LabelingTransformer.is_chained + ~LabelingTransformer.is_concurrent + ~LabelingTransformer.n_concurrent_conduits + ~LabelingTransformer.name + ~LabelingTransformer.product_type + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LineageOrigin.rst.txt b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LineageOrigin.rst.txt new file mode 100644 index 0000000..02235b4 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/lineage/fluxus.lineage.LineageOrigin.rst.txt @@ -0,0 +1,38 @@ +fluxus.lineage.LineageOrigin +============================ + +.. currentmodule:: fluxus.lineage + +.. 
autoclass:: LineageOrigin + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~LineageOrigin.get_lineage + ~LineageOrigin.get_lineage_attributes + ~LineageOrigin.label + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~LineageOrigin.precursor + ~LineageOrigin.product_attributes + ~LineageOrigin.product_labels + ~LineageOrigin.product_name + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/simple/fluxus.simple.SimpleAsyncProducer.rst.txt b/docs/_sources/apidoc/fluxus/simple/fluxus.simple.SimpleAsyncProducer.rst.txt new file mode 100644 index 0000000..1995ab7 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/simple/fluxus.simple.SimpleAsyncProducer.rst.txt @@ -0,0 +1,49 @@ +fluxus.simple.SimpleAsyncProducer +================================= + +.. currentmodule:: fluxus.simple + +.. autoclass:: SimpleAsyncProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SimpleAsyncProducer.aiter + ~SimpleAsyncProducer.aiter_concurrent_conduits + ~SimpleAsyncProducer.draw + ~SimpleAsyncProducer.get_connections + ~SimpleAsyncProducer.get_final_conduits + ~SimpleAsyncProducer.get_repr_attributes + ~SimpleAsyncProducer.iter + ~SimpleAsyncProducer.iter_concurrent_conduits + ~SimpleAsyncProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SimpleAsyncProducer.chained_conduits + ~SimpleAsyncProducer.final_conduit + ~SimpleAsyncProducer.is_atomic + ~SimpleAsyncProducer.is_chained + ~SimpleAsyncProducer.is_concurrent + ~SimpleAsyncProducer.n_concurrent_conduits + ~SimpleAsyncProducer.name + ~SimpleAsyncProducer.product_type + ~SimpleAsyncProducer.products + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/simple/fluxus.simple.SimpleProducer.rst.txt b/docs/_sources/apidoc/fluxus/simple/fluxus.simple.SimpleProducer.rst.txt new file mode 100644 index 0000000..1735f0b --- /dev/null +++ b/docs/_sources/apidoc/fluxus/simple/fluxus.simple.SimpleProducer.rst.txt @@ -0,0 +1,49 @@ +fluxus.simple.SimpleProducer +============================ + +.. currentmodule:: fluxus.simple + +.. autoclass:: SimpleProducer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~SimpleProducer.aiter + ~SimpleProducer.aiter_concurrent_conduits + ~SimpleProducer.draw + ~SimpleProducer.get_connections + ~SimpleProducer.get_final_conduits + ~SimpleProducer.get_repr_attributes + ~SimpleProducer.iter + ~SimpleProducer.iter_concurrent_conduits + ~SimpleProducer.to_expression + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~SimpleProducer.chained_conduits + ~SimpleProducer.final_conduit + ~SimpleProducer.is_atomic + ~SimpleProducer.is_chained + ~SimpleProducer.is_concurrent + ~SimpleProducer.n_concurrent_conduits + ~SimpleProducer.name + ~SimpleProducer.product_type + ~SimpleProducer.products + + + + .. 
rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/util/fluxus.util.simplify_repr_attributes.rst.txt b/docs/_sources/apidoc/fluxus/util/fluxus.util.simplify_repr_attributes.rst.txt new file mode 100644 index 0000000..5140627 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/util/fluxus.util.simplify_repr_attributes.rst.txt @@ -0,0 +1,6 @@ +fluxus.util.simplify\_repr\_attributes +====================================== + +.. currentmodule:: fluxus.util + +.. autofunction:: simplify_repr_attributes \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/base/fluxus.viz.base.FlowStyle.rst.txt b/docs/_sources/apidoc/fluxus/viz/base/fluxus.viz.base.FlowStyle.rst.txt new file mode 100644 index 0000000..5907f01 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/base/fluxus.viz.base.FlowStyle.rst.txt @@ -0,0 +1,31 @@ +fluxus.viz.base.FlowStyle +========================= + +.. currentmodule:: fluxus.viz.base + +.. autoclass:: FlowStyle + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~FlowStyle.finalize_drawing + ~FlowStyle.get_default_style_name + ~FlowStyle.get_named_styles + ~FlowStyle.render_flow + ~FlowStyle.start_drawing + + + + + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/base/fluxus.viz.base.TimelineStyle.rst.txt b/docs/_sources/apidoc/fluxus/viz/base/fluxus.viz.base.TimelineStyle.rst.txt new file mode 100644 index 0000000..6316feb --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/base/fluxus.viz.base.TimelineStyle.rst.txt @@ -0,0 +1,31 @@ +fluxus.viz.base.TimelineStyle +============================= + +.. currentmodule:: fluxus.viz.base + +.. autoclass:: TimelineStyle + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~TimelineStyle.finalize_drawing + ~TimelineStyle.get_default_style_name + ~TimelineStyle.get_named_styles + ~TimelineStyle.render_timeline + ~TimelineStyle.start_drawing + + + + + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowDrawer.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowDrawer.rst.txt new file mode 100644 index 0000000..c8323c6 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowDrawer.rst.txt @@ -0,0 +1,38 @@ +fluxus.viz.FlowDrawer +===================== + +.. currentmodule:: fluxus.viz + +.. autoclass:: FlowDrawer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~FlowDrawer.draw + ~FlowDrawer.get_default_style + ~FlowDrawer.get_named_styles + ~FlowDrawer.get_style + ~FlowDrawer.get_style_classes + ~FlowDrawer.get_style_kwargs + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~FlowDrawer.style + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowGraph.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowGraph.rst.txt new file mode 100644 index 0000000..a32e9a2 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowGraph.rst.txt @@ -0,0 +1,35 @@ +fluxus.viz.FlowGraph +==================== + +.. currentmodule:: fluxus.viz + +.. 
autoclass:: FlowGraph + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~FlowGraph.from_conduit + ~FlowGraph.to_dot + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~FlowGraph.single_conduits + ~FlowGraph.connections + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowGraphStyle.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowGraphStyle.rst.txt new file mode 100644 index 0000000..92385d7 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowGraphStyle.rst.txt @@ -0,0 +1,39 @@ +fluxus.viz.FlowGraphStyle +========================= + +.. currentmodule:: fluxus.viz + +.. autoclass:: FlowGraphStyle + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~FlowGraphStyle.finalize_drawing + ~FlowGraphStyle.get_default_style_name + ~FlowGraphStyle.get_named_styles + ~FlowGraphStyle.render_flow + ~FlowGraphStyle.start_drawing + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~FlowGraphStyle.colors + ~FlowGraphStyle.file + ~FlowGraphStyle.format + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowTextStyle.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowTextStyle.rst.txt new file mode 100644 index 0000000..73cdb9b --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.FlowTextStyle.rst.txt @@ -0,0 +1,38 @@ +fluxus.viz.FlowTextStyle +======================== + +.. currentmodule:: fluxus.viz + +.. autoclass:: FlowTextStyle + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~FlowTextStyle.finalize_drawing + ~FlowTextStyle.get_default_style_name + ~FlowTextStyle.get_named_styles + ~FlowTextStyle.render_flow + ~FlowTextStyle.start_drawing + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~FlowTextStyle.out + ~FlowTextStyle.width + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineDrawer.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineDrawer.rst.txt new file mode 100644 index 0000000..e147232 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineDrawer.rst.txt @@ -0,0 +1,38 @@ +fluxus.viz.TimelineDrawer +========================= + +.. currentmodule:: fluxus.viz + +.. autoclass:: TimelineDrawer + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~TimelineDrawer.draw + ~TimelineDrawer.get_default_style + ~TimelineDrawer.get_named_styles + ~TimelineDrawer.get_style + ~TimelineDrawer.get_style_classes + ~TimelineDrawer.get_style_kwargs + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~TimelineDrawer.style + + + + .. 
rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineMatplotStyle.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineMatplotStyle.rst.txt new file mode 100644 index 0000000..4243599 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineMatplotStyle.rst.txt @@ -0,0 +1,40 @@ +fluxus.viz.TimelineMatplotStyle +=============================== + +.. currentmodule:: fluxus.viz + +.. autoclass:: TimelineMatplotStyle + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~TimelineMatplotStyle.apply_color_scheme + ~TimelineMatplotStyle.finalize_drawing + ~TimelineMatplotStyle.get_default_style_name + ~TimelineMatplotStyle.get_named_styles + ~TimelineMatplotStyle.get_renderer + ~TimelineMatplotStyle.render_timeline + ~TimelineMatplotStyle.start_drawing + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~TimelineMatplotStyle.ax + ~TimelineMatplotStyle.colors + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineTextStyle.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineTextStyle.rst.txt new file mode 100644 index 0000000..8e11ccc --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.TimelineTextStyle.rst.txt @@ -0,0 +1,41 @@ +fluxus.viz.TimelineTextStyle +============================ + +.. currentmodule:: fluxus.viz + +.. autoclass:: TimelineTextStyle + :members: + :no-show-inheritance: + :inherited-members: + :special-members: __call__ + + + + .. rubric:: Method summary + + .. autosummary:: + :nosignatures: + + ~TimelineTextStyle.finalize_drawing + ~TimelineTextStyle.get_default_style_name + ~TimelineTextStyle.get_named_styles + ~TimelineTextStyle.render_timeline + ~TimelineTextStyle.start_drawing + + + + + + .. rubric:: Attribute summary + + .. autosummary:: + + ~TimelineTextStyle.COLUMN_ALIGNMENT + ~TimelineTextStyle.COLUMN_FORMATS + ~TimelineTextStyle.TABLE_HEADER + ~TimelineTextStyle.out + ~TimelineTextStyle.width + + + + .. rubric:: Definitions \ No newline at end of file diff --git a/docs/_sources/apidoc/fluxus/viz/fluxus.viz.base.rst.txt b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.base.rst.txt new file mode 100644 index 0000000..f0139d7 --- /dev/null +++ b/docs/_sources/apidoc/fluxus/viz/fluxus.viz.base.rst.txt @@ -0,0 +1,44 @@ +fluxus.viz.base +=============== + +.. toctree:: + :maxdepth: 1 + :hidden: + + self + +.. automodule:: fluxus.viz.base + :no-imported-members: + + + + + + ======= + Classes + ======= + + .. autosummary:: + :toctree: base + :template: custom-class-template.rst + :nosignatures: + + FlowStyle + TimelineStyle + + + + + + + + + + + + + + + + + diff --git a/docs/_sources/contributor_guide/index.rst.txt b/docs/_sources/contributor_guide/index.rst.txt new file mode 100644 index 0000000..dec6f54 --- /dev/null +++ b/docs/_sources/contributor_guide/index.rst.txt @@ -0,0 +1,27 @@ +.. _contributor-guide-index: + +.. TODO: Link to the ARTKIT contributor guide + +Contributing +============ + +Ways to Contribute +------------------ + +There are many ways to contribute, including: + +- Create issues for bugs or feature requests +- Address an open issue +- Add or improve unit tests +- Create tutorials +- Improve documentation + +We especially encourage contributions that enhance our documentation. 
+ +For major contributions, please reach out to the ARTKIT team in advance +(ARTKIT@bcg.com). + +Please refer to the +`ARTKIT contributor guide `_ for more +detailed information on how to contribute to this project. All instructions that +refer to the ARTKIT repository generally also apply to this *fluxus* repository. diff --git a/docs/_sources/faq.rst.txt b/docs/_sources/faq.rst.txt new file mode 100644 index 0000000..fb452b5 --- /dev/null +++ b/docs/_sources/faq.rst.txt @@ -0,0 +1,22 @@ +.. _faq: + +FAQ +=== + +.. contents:: + :local: + :depth: 2 + +About the project +----------------- + +What is FLUXUS for? +~~~~~~~~~~~~~~~~~~~ + +… + +Who developed FLUXUS, and why? +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +… + diff --git a/docs/_sources/index.rst.txt b/docs/_sources/index.rst.txt new file mode 100644 index 0000000..b77cfb9 --- /dev/null +++ b/docs/_sources/index.rst.txt @@ -0,0 +1,14 @@ +.. _artkit-home: + +Table of Contents +----------------- + +.. toctree:: + :maxdepth: 2 + + Home <_generated/home> + User Guide + API Reference + Contributor Guide + FAQ + Release Notes <_generated/release_notes> diff --git a/docs/_sources/user_guide/index.rst.txt b/docs/_sources/user_guide/index.rst.txt new file mode 100644 index 0000000..8bf55b7 --- /dev/null +++ b/docs/_sources/user_guide/index.rst.txt @@ -0,0 +1,37 @@ +.. _user-guide-index: + +.. toctree:: + :maxdepth: 2 + :caption: Introduction to FLUXUS + :hidden: + + introduction_to_fluxus/building_a_flow + + +User Guide +========== + +This section provides guidance for FLUXUS users, with more content to be added in the +future. + +- **Introduction to FLUXUS**: + Overview of core concepts and mechanics of using FLUXUS + + +Contributing to Our User Guide +------------------------------ + +We are committed to continuously improving our User Guide, ensuring it meets the needs +of our users and accurately reflects the breadth and depth of use cases for FLUXUS. + +To this end, we enthusiastically encourage contributions from our community. +Whether you've identified an opportunity for enhancement, noticed a crucial piece +missing, or have an addition that could benefit others, we invite you to share your +expertise. + +Should you come across any content gaps or areas for significant improvement, please do +not hesitate to open an issue on our `issue tracker `_. +Your contributions help us make our User Guide a more valuable resource for everyone. + +Before submitting contributions to the User Guide, be sure to review our +:ref:`Contributor Guide `. diff --git a/docs/_sources/user_guide/introduction_to_fluxus/building_a_flow.ipynb.txt b/docs/_sources/user_guide/introduction_to_fluxus/building_a_flow.ipynb.txt new file mode 100644 index 0000000..26bef1f --- /dev/null +++ b/docs/_sources/user_guide/introduction_to_fluxus/building_a_flow.ipynb.txt @@ -0,0 +1,82 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "563768a5-cf56-4ccd-9db7-cdfe2a117654", + "metadata": {}, + "source": "# Building Your First FLUXUS Pipeline" + }, + { + "cell_type": "markdown", + "id": "1cc6cf47-6d2a-433d-a0ca-9abf4b7063d2", + "metadata": {}, + "source": [ + "## Introduction" + ] + }, + { + "cell_type": "markdown", + "id": "e83b06ba", + "metadata": {}, + "source": [ + "This notebook introduces the basic building blocks for developing flows with FLUXUS. 
You will learn how to:\n", + "\n", + "- Build and run flows using the functional API\n", + "- Create custom flow *conduits* using the underlying object-oriented API" + ] + }, + { + "metadata": {}, + "cell_type": "markdown", + "source": "", + "id": "81866155cb418a38" + }, + { + "cell_type": "markdown", + "id": "82774e01-0015-4d0f-9c35-de08e193391d", + "metadata": {}, + "source": [ + "## Functional API\n", + "\n", + "First, we will import the " + ] + }, + { + "metadata": {}, + "cell_type": "code", + "outputs": [], + "execution_count": null, + "source": "", + "id": "23d1a6c6ac64d1f" + }, + { + "cell_type": "markdown", + "id": "c8e8e689", + "metadata": {}, + "source": [ + "## Concluding remarks" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.12.2" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/docs/_static/basic.css b/docs/_static/basic.css new file mode 100644 index 0000000..2af6139 --- /dev/null +++ b/docs/_static/basic.css @@ -0,0 +1,925 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2024 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 270px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + margin: 2px 0 0 30px; + 
text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + +div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +a:visited { + color: #551A8B; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + +nav.contents, +aside.topic, 
+div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp .k, .sig.cpp .kt { + color: 
#0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; + word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + +td.linenos pre { + border: 
0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: -1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/docs/_static/bcgx_logo.png b/docs/_static/bcgx_logo.png new file mode 100644 index 0000000..8a2316c Binary files /dev/null and b/docs/_static/bcgx_logo.png differ diff --git a/docs/_static/css/bcgx.css b/docs/_static/css/bcgx.css new file mode 100644 index 0000000..0cadecc --- /dev/null +++ b/docs/_static/css/bcgx.css @@ -0,0 +1,117 @@ +h1, h2 { + color:#29BA74; +} + +a { + color: #3333ff; +} + +a:hover { + color: #30C1D7; +} + +.bd-page-width { + max-width: 95%; /* default is 88rem */ +} + +.bd-header.navbar { + padding-top: 5px; + padding-bottom: 5px; +} + +code { + color: #295E7E; + background-color: #E0E0E0; + padding-left: 1pt; + padding-right: 1pt; +} + +.bd-content .longtable.table { + display: table; +} + +.longtable.table { + width: 100%; +} + +.longtable.table td, .longtable.table th { + padding: .25rem; + vertical-align: top; + border-top: 1px solid #DEE2E6; + border-bottom: 1px solid #DEE2E6; +} + +.longtable.table tr:nth-child(odd) { + background-color: #F1F5FA; +} + +p.rubric { + border-bottom: none; + border-top: 1px #295E7E solid; + padding-top: 8px; +} + +dl { + padding-bottom: 8px; + padding-top: 8px; + margin-bottom: unset; + margin-top: unset; +} + +dl.py + dl.py { + border-top: 1px #DEE2E6 solid; +} + +.navbar-brand img { + max-width: 175px; + height:auto; + 
width:auto; +} + +.navbar-nav > .active > .nav-link { + color:#29BA74 !important; +} + +.bd-sidebar .nav > .active:hover > a, .bd-sidebar .nav > .active > a, +.bd-sidebar .nav > li > ul > .active:hover > a, .bd-sidebar .nav > li > ul > .active > a { + color:#29BA74; +} + +.toc-entry > .nav-link.active { + color: #29BA74; + border-left-color: #29BA74; +} + +img.team-pic { + height: 200px; + padding-top: 5pt; + padding-bottom: 5pt; + padding-left: 5pt; + padding-right: 5pt; +} + +/* Primary Sidebar Appearance */ +.bd-links__title { + display: none; +} + +.caption-text { + font-size: 22px; +} + +.nav.bd-sidenav .reference.internal { + font-size: 16px; +} + +/* Secondary Sidebar Appearance */ +#pst-page-navigation-heading-2.onthispage { + font-family: 'Arial', sans-serif; + font-size: 18px; + color: #000000; + padding-bottom: 10px; +} + +.page-toc .nav.section-nav .nav-item .nav-link { + color: #29BA74; + font-size: 15px; +} \ No newline at end of file diff --git a/docs/_static/doctools.js b/docs/_static/doctools.js new file mode 100644 index 0000000..4d67807 --- /dev/null +++ b/docs/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2024 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. + */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? 
singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git a/docs/_static/documentation_options.js b/docs/_static/documentation_options.js new file mode 100644 index 0000000..7e4c114 --- /dev/null +++ b/docs/_static/documentation_options.js @@ -0,0 +1,13 @@ +const DOCUMENTATION_OPTIONS = { + VERSION: '', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'html', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/docs/_static/file.png b/docs/_static/file.png new file mode 100644 index 0000000..a858a41 Binary files /dev/null and b/docs/_static/file.png differ diff --git a/docs/_static/js/bcgx.js b/docs/_static/js/bcgx.js new file mode 100644 index 0000000..79ce19f --- /dev/null +++ b/docs/_static/js/bcgx.js @@ -0,0 +1,57 @@ +$(document).ready(function() { + 
$('a.reference.external').attr('target', '_blank'); + DOCUMENTATION_OPTIONS.VERSION = DOCS_VERSIONS.current; + buildVersionSelector(); +}); + + +const buildVersionSelector = function() { + + const versionDropdown = $('