Python: Add the Python process framework (#9363)
### Motivation and Context
An initial PR to add the foundational pieces of the Python Process
framework. Its design mirrors the .NET implementation: step types are
added to a process builder, and when a step is later run, it is first
instantiated and then provided with the proper state.
### Description
Adds the initial process framework components.
- Closes #9354

**TODO**
- More unit tests will be added to increase code coverage; several files
currently have no (or low) coverage.
- More samples will be added, either to this PR or to a subsequent PR.
### Contribution Checklist
- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone :smile:
2024-10-24 13:37:45 -04:00
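The deferred-instantiation pattern described above can be sketched in a few lines. This is a minimal, self-contained illustration, not the actual semantic-kernel API; all class and method names here (`ProcessBuilder`, `GreetStep`, `invoke`) are hypothetical:

```python
class GreetStep:
    """A toy step: holds state and produces an output when invoked."""

    def __init__(self):
        self.state: dict = {}

    def invoke(self) -> str:
        return f"hello, {self.state.get('name', 'world')}"


class ProcessBuilder:
    """Registers step *types*; instances are only created at run time."""

    def __init__(self):
        self._step_types: list[type] = []

    def add_step(self, step_type: type) -> type:
        # Only the type is registered; no instance exists yet.
        self._step_types.append(step_type)
        return step_type

    def run(self, state: dict) -> list[str]:
        outputs = []
        for step_type in self._step_types:
            step = step_type()  # instantiated only now, at run time
            step.state = state.get(step_type.__name__, {})
            outputs.append(step.invoke())
        return outputs


builder = ProcessBuilder()
builder.add_step(GreetStep)
results = builder.run({"GreetStep": {"name": "SK"}})
```

The key point is that `add_step` stores only the type, so the builder can be declared once and the step can be re-instantiated with fresh state on every run.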
# Copyright (c) Microsoft. All rights reserved.
Python: Allow for factory callbacks in the process framework (#10451)
### Motivation and Context
In the current Python process framework, runtimes like Dapr provide no
easy way to pass complex (unserializable) dependencies to a step --
this includes things like a ChatCompletion service or an agent of type
ChatCompletionAgent.
Similar to how the kernel dependency is propagated to the step_actor or
process_actor, we're introducing the ability to specify a factory
callback that is invoked as the step is instantiated. The factory is
used, if specified, via an optional kwarg when adding a step to the
process builder:
```python
myBStep = process.add_step(step_type=BStep, factory_function=bstep_factory)
```
The `bstep_factory` looks like this (along with its corresponding step):
```python
async def bstep_factory():
    """Creates a BStep instance with ephemeral references like ChatCompletionAgent."""
    kernel = Kernel()
    kernel.add_service(AzureChatCompletion())
    agent = ChatCompletionAgent(kernel=kernel, name="echo", instructions="repeat the input back")

    step_instance = BStep()
    step_instance.agent = agent
    return step_instance


class BStep(KernelProcessStep):
    """A sample BStep that optionally holds a ChatCompletionAgent.

    By design, the agent is ephemeral (not stored in state).
    """

    # Ephemeral references won't be persisted to Dapr
    # because we do not place them in a step state model.
    # We'll set this in the factory function:
    agent: ChatCompletionAgent | None = None

    @kernel_function(name="do_it")
    async def do_it(self, context: KernelProcessStepContext):
        print("##### BStep ran (do_it).")
        await asyncio.sleep(2)

        if self.agent:
            history = ChatHistory()
            history.add_user_message("Hello from BStep!")
            async for msg in self.agent.invoke(history):
                print(f"BStep got agent response: {msg.content}")

        await context.emit_event(process_event="BStepDone", data="I did B")
```
Although a factory isn't strictly necessary with the local runtime, the
factory callback also works there, if desired.
### Description
Adds the ability to specify a factory callback for a step in the process
framework.
- Adjusts the Dapr FastAPI demo sample to show how one can include a
dependency like a `ChatCompletionAgent` and wire it up via the factory
callback for `BStep`. Although the agent's output isn't strictly needed,
it demonstrates handling these kinds of dependencies when running under
Dapr.
- Adds unit tests
- Closes #10409
### Contribution Checklist
- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone :smile:
2025-02-11 10:39:08 +09:00
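The runtime-side half of this feature can be sketched as follows. This is a hypothetical illustration of the "prefer the factory over direct construction" idea, not the actual step_actor code; `instantiate_step` and the stub classes are invented for the example:

```python
import asyncio
import inspect


async def instantiate_step(step_type, factory=None):
    # Prefer the registered factory; fall back to direct construction.
    if factory is not None:
        instance = factory()
        # Factories may be plain functions or coroutine functions; await if needed.
        if inspect.isawaitable(instance):
            instance = await instance
        return instance
    return step_type()


class BStep:
    agent = None  # stand-in for an unserializable dependency


async def bstep_factory():
    step = BStep()
    step.agent = "stub-agent"  # stand-in for a ChatCompletionAgent
    return step


step = asyncio.run(instantiate_step(BStep, bstep_factory))
```

Accepting both sync and async factories keeps the kwarg flexible: a simple step can use a plain function, while a step that needs to build services can use a coroutine.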
from collections.abc import Callable
from typing import TYPE_CHECKING, Any
from typing import TYPE_CHECKING, Any
from pydantic import Field

from semantic_kernel.processes.kernel_process.kernel_process_edge import KernelProcessEdge
from semantic_kernel.processes.kernel_process.kernel_process_state import KernelProcessState
from semantic_kernel.processes.kernel_process.kernel_process_step_info import KernelProcessStepInfo
from semantic_kernel.processes.process_state_metadata_utils import kernel_process_to_process_state_metadata
Python: Introduce feature decorator to allow for experimental and release candidate decorator usage (#10691)
### Motivation and Context
This change is required to improve the flexibility and maintainability
of our feature annotation system. Previously, separate decorators (e.g.,
`experimental_function` and `experimental_class`) were used to mark
experimental features, resulting in code duplication and limiting our
ability to handle additional feature stages. As our SDK evolves, we need
a unified approach that can support multiple stages—such as
experimental, release candidate, and future states—while also allowing
version information for release candidate features to be centrally
managed.
### Description
This PR refactors our feature decorators by introducing a unified
`stage` decorator that updates the docstring and attaches metadata for
both functions and classes. Two convenience decorators, `experimental`
and `release_candidate`, are built on top of `stage`:
- The `experimental` decorator marks features as experimental and sets
an `is_experimental` attribute.
- The `release_candidate` decorator supports multiple usage patterns
(with or without parentheses and with an optional version parameter) to
mark features as release candidate and sets an `is_release_candidate`
attribute.
This unified approach reduces duplication, simplifies the codebase, and
lays the groundwork for easily extending feature stages in the future.
This decorator supports the following usage patterns:
- `@experimental` (for both classes and functions)
- `@release_candidate` (no parentheses)
- `@release_candidate()` (empty parentheses)
- `@release_candidate("1.21.3-rc1")` (positional version)
- `@release_candidate(version="1.21.3-rc1")` (keyword version)
### Contribution Checklist
- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone :smile:
2025-02-27 09:17:54 +09:00
from semantic_kernel.utils.feature_stage_decorator import experimental

if TYPE_CHECKING:
    from semantic_kernel.processes.kernel_process.kernel_process_edge import KernelProcessEdge
    from semantic_kernel.processes.kernel_process.kernel_process_step_state_metadata import KernelProcessStateMetadata

@experimental
class KernelProcess(KernelProcessStepInfo):
    """A kernel process."""

    steps: list[KernelProcessStepInfo] = Field(default_factory=list)
    factories: dict[str, Callable] = Field(default_factory=dict)

    def __init__(
        self,
        state: KernelProcessState,
        steps: list[KernelProcessStepInfo],
        edges: dict[str, list["KernelProcessEdge"]] | None = None,
        factories: dict[str, Callable] | None = None,
    ):
Python: Allow for factory callbacks in the process framework (#10451)
### Motivation and Context
In the current Python process framework, in runtimes like Dapr, there is
no easy way to pass complex (unserializable) dependencies to a step --
this includes things like a ChatCompletion service or an agent of type
ChatCompletionAgent.
Similar to how the kernel dependency was propagated to the step_actor or
process_actor, we're introducing the ability to specify a factory
callback that will be called as the step is instantiated. The factory is
created, if specified via the optional kwarg when adding a step to the
process builder like:
```python
myBStep = process.add_step(step_type=BStep, factory_function=bstep_factory)
```
The `bstep_factory` looks like (along with its corresponding step)
```python
async def bstep_factory():
"""Creates a BStep instance with ephemeral references like ChatCompletionAgent."""
kernel = Kernel()
kernel.add_service(AzureChatCompletion())
agent = ChatCompletionAgent(kernel=kernel, name="echo", instructions="repeat the input back")
step_instance = BStep()
step_instance.agent = agent
return step_instance
class BStep(KernelProcessStep):
"""A sample BStep that optionally holds a ChatCompletionAgent.
By design, the agent is ephemeral (not stored in state).
"""
# Ephemeral references won't be persisted to Dapr
# because we do not place them in a step state model.
# We'll set this in the factory function:
agent: ChatCompletionAgent | None = None
@kernel_function(name="do_it")
async def do_it(self, context: KernelProcessStepContext):
print("##### BStep ran (do_it).")
await asyncio.sleep(2)
if self.agent:
history = ChatHistory()
history.add_user_message("Hello from BStep!")
async for msg in self.agent.invoke(history):
print(f"BStep got agent response: {msg.content}")
await context.emit_event(process_event="BStepDone", data="I did B")
```
Although this isn't explicitly necessary with the local runtime, the
factory callback will also work, if desired.
<!-- Thank you for your contribution to the semantic-kernel repo!
Please help reviewers and future users, providing the following
information:
1. Why is this change required?
2. What problem does it solve?
3. What scenario does it contribute to?
4. If it fixes an open issue, please link to the issue here.
-->
### Description
Adds the ability to specify a factory callback for a step in the process
framework.
- Adjusts the Dapr FastAPI demo sample to show how one can include a
dependency like a `ChatCompletionAgent` and use the factory callback for
`BStep`. Although the output from the agent isn't needed, it
demonstrates the capability to handle these types of dependencies while
running Dapr.
- Adds unit tests
- Closes #10409
<!-- Describe your changes, the overall approach, the underlying design.
These notes will help understanding how your code works. Thanks! -->
### Contribution Checklist
<!-- Before submitting this PR, please make sure: -->
- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone :smile:
2025-02-11 10:39:08 +09:00
"""Initialize the kernel process.
|
|
|
|
|
|
|
|
|
|
Args:
|
|
|
|
|
state: The state of the process.
|
|
|
|
|
steps: The steps of the process.
|
|
|
|
|
edges: The edges of the process. Defaults to None.
|
|
|
|
|
factories: The factories of the process. This allows for the creation of
|
|
|
|
|
steps that require complex dependencies that cannot be JSON serialized or deserialized.
|
|
|
|
|
"""
        if not state:
            raise ValueError("state cannot be None")
        if not steps:
            raise ValueError("steps cannot be None")
        if not state.name:
            raise ValueError("state.name cannot be None")
        process_steps = []
        process_steps.extend(steps)

        args: dict[str, Any] = {
            "steps": process_steps,
            "inner_step_type": KernelProcess,
            "state": state,
            "output_edges": edges or {},
        }
        if factories:
            args["factories"] = factories
        super().__init__(**args)

    def to_process_state_metadata(self) -> "KernelProcessStateMetadata":
        """Converts a kernel process to process state metadata."""
        return kernel_process_to_process_state_metadata(self)