OPENAI_API_KEY=""
OPENAI_CHAT_MODEL_ID=""
OPENAI_TEXT_MODEL_ID=""
OPENAI_EMBEDDING_MODEL_ID=""
OPENAI_ORG_ID=""
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=""
AZURE_OPENAI_TEXT_DEPLOYMENT_NAME=""
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME=""
AZURE_OPENAI_ENDPOINT=""
AZURE_OPENAI_API_KEY=""
AZURE_OPENAI_API_VERSION=""
AZURE_AI_SEARCH_ENDPOINT=""
AZURE_AI_SEARCH_SERVICE=""
AZURE_AI_SEARCH_API_KEY=""
AZURE_AI_SEARCH_INDEX_NAME=""
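The variables above are typically read at application startup, either from real environment variables or from a `.env` file like this one. A minimal, dependency-free sketch of loading a `.env`-style file into the process environment (the `load_env_file`/`apply_env` helper names and the default path are illustrative assumptions, not any specific library's API):

```python
import os


def load_env_file(path: str = ".env") -> dict[str, str]:
    """Parse KEY="value" lines from a .env-style file; skip blanks and # comments."""
    values: dict[str, str] = {}
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, raw = line.partition("=")
            values[key.strip()] = raw.strip().strip('"')
    return values


def apply_env(values: dict[str, str]) -> None:
    """Export parsed values without overriding variables already set in the shell."""
    for key, value in values.items():
        os.environ.setdefault(key, value)
```

Because `apply_env` uses `setdefault`, values already exported in the shell win over the file, mirroring the usual precedence of environment variables over a `.env` fall-back.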
MONGODB_ATLAS_CONNECTION_STRING=""
PINECONE_API_KEY=""
PINECONE_ENVIRONMENT=""
POSTGRES_CONNECTION_STRING=""
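`POSTGRES_CONNECTION_STRING`, like the MongoDB and Redis entries, is a URL-style DSN rather than separate host/user/password variables. A short sketch of splitting one into its parts with the standard library; the connection string shown is a made-up example value, and real credentials belong in the environment, not in code:

```python
from urllib.parse import urlparse

# Hypothetical example value for illustration only.
conn = "postgresql://sk_user:s3cret@db.example.com:5432/sk_memory"
parts = urlparse(conn)

host, port = parts.hostname, parts.port   # "db.example.com", 5432
database = parts.path.lstrip("/")         # "sk_memory"
user, password = parts.username, parts.password
```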
WEAVIATE_URL=""
WEAVIATE_API_KEY=""
GOOGLE_SEARCH_ENGINE_ID=""
BRAVE_API_KEY=""
REDIS_CONNECTION_STRING=""
AZCOSMOS_API=""
AZCOSMOS_CONNSTR=""
AZCOSMOS_DATABASE_NAME=""
AZCOSMOS_CONTAINER_NAME=""
ASTRADB_APP_TOKEN=""
ASTRADB_ID=""
ASTRADB_REGION=""
ASTRADB_KEYSPACE=""
ACA_POOL_MANAGEMENT_ENDPOINT=""
BOOKING_SAMPLE_CLIENT_ID=""
BOOKING_SAMPLE_TENANT_ID=""
BOOKING_SAMPLE_CLIENT_SECRET=""
BOOKING_SAMPLE_BUSINESS_ID=""
BOOKING_SAMPLE_SERVICE_ID=""
CREW_AI_ENDPOINT=""
CREW_AI_TOKEN=""
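Only a subset of these variables is needed for any given connector, and an empty value usually surfaces later as a confusing auth error. A small sketch of failing fast on missing configuration at startup; which keys count as required depends on the services you actually use, so the names passed in below are an illustrative assumption:

```python
import os


def require_env(*names: str) -> dict[str, str]:
    """Return the requested variables, raising one error that lists every missing name."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise EnvironmentError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    return {name: os.environ[name] for name in names}
```

For example, a chat-only app might call `require_env("OPENAI_API_KEY", "OPENAI_CHAT_MODEL_ID")` once at startup and pass the returned values to its client setup.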