Commit

Merge branch 'main' into PORT-12403-create-cloud-resources-using-ia-c

kodjomiles authored Jan 22, 2025
2 parents be90577 + 6d9cfe7 commit bfc9f3b
Showing 73 changed files with 7,499 additions and 588 deletions.
119 changes: 119 additions & 0 deletions .github/workflows/kapa-weekly-report.yml
@@ -0,0 +1,119 @@
name: Weekly Kapa Usage Report

on:
  schedule:
    # Runs at 7:00 AM UTC every Monday
    - cron: '0 7 * * 1'
  workflow_dispatch: # Allows manual triggering

jobs:
  generate-and-send-report:
    runs-on: ubuntu-latest
    steps:
      - name: Get Current Date
        id: date
        run: |
          echo "start_date=$(date -d '7 days ago' -u +%Y-%m-%dT00:00:00Z)" >> $GITHUB_ENV
          echo "end_date=$(date -u +%Y-%m-%dT23:59:59Z)" >> $GITHUB_ENV

      - name: Fetch Kapa Analytics
        id: fetch-analytics
        run: |
          # Make the API call and store response directly to file
          curl -v -s -X GET \
            "https://api.kapa.ai/query/v1/projects/e64464bc-19b5-4cd2-9779-2930e2ca0b81/analytics/activity/?start_date_time=${{ env.start_date }}&end_date_time=${{ env.end_date }}&aggregation_period=WEEK" \
            -H "X-API-KEY: ${{ secrets.KAPA_API_KEY }}" \
            -o kapa_response.json

          # Debug: Print the first part of the response
          echo "API Response (first 500 chars):"
          head -c 500 kapa_response.json

          # Validate JSON response
          if ! jq empty kapa_response.json; then
            echo "Error: Invalid JSON response from API"
            echo "Full response:"
            cat kapa_response.json
            exit 1
          fi

          # Verify the response has the expected structure
          if ! jq -e '.aggregate_statistics' kapa_response.json > /dev/null; then
            echo "Error: Response missing aggregate_statistics"
            echo "Response structure:"
            jq '.' kapa_response.json
            exit 1
          fi

          if ! jq -e '.time_series' kapa_response.json > /dev/null; then
            echo "Error: Response missing time_series"
            echo "Response structure:"
            jq '.' kapa_response.json
            exit 1
          fi

          # Create formatted message sections
          period_text="Period: $(date -d '${{ env.start_date }}' '+%B %d') - $(date -d '${{ env.end_date }}' '+%B %d, %Y')"

          # Calculate uncertain percentage
          total_questions=$(jq -r '.aggregate_statistics.total_questions' kapa_response.json)
          uncertain_questions=$(jq -r '.aggregate_statistics.total_questions_uncertain' kapa_response.json)
          if [ "$total_questions" -gt 0 ]; then
            uncertain_percentage=$(echo "scale=1; $uncertain_questions * 100 / $total_questions" | bc)
          else
            uncertain_percentage=0
          fi

          key_metrics="*Key Metrics:*\n• Total Questions: $(jq -r '.aggregate_statistics.total_questions' kapa_response.json)\n• Unique Users: $(jq -r '.aggregate_statistics.total_unique_users' kapa_response.json)\n• Uncertain Responses: $uncertain_questions ($uncertain_percentage%)"

          quality_metrics="*Quality Metrics:*\n• Upvotes: $(jq -r '.aggregate_statistics.total_upvotes' kapa_response.json)\n• Downvotes: $(jq -r '.aggregate_statistics.total_downvotes' kapa_response.json)"

          # Calculate success rate
          total_votes=$(jq -r '.aggregate_statistics.total_upvotes + .aggregate_statistics.total_downvotes' kapa_response.json)
          if [ "$total_votes" -gt 0 ]; then
            success_rate=$(echo "scale=1; $(jq -r '.aggregate_statistics.total_upvotes' kapa_response.json) * 100 / $total_votes" | bc)
            success_text="\n*Success Rate:* ${success_rate}%"
          else
            success_text=""
          fi

          # Get counts for each integration
          # Community channel count
          community_count=$(jq -r '
            .time_series |
            map(.count_by_integration[]) |
            map(select(.type == "SLACK_CHANNEL" and (.description | test("community"; "i")))) |
            map(.count) |
            add // 0
          ' kapa_response.json)

          # Docs AI count
          docs_ai_count=$(jq -r '
            .time_series |
            map(.count_by_integration[]) |
            map(select(.description == "Docs AI")) |
            map(.count) |
            add // 0
          ' kapa_response.json)

          # Ask-ai internal Slack count
          internal_slack_count=$(jq -r '
            .time_series |
            map(.count_by_integration[]) |
            map(select(.description == "Ask-ai internal Slack")) |
            map(.count) |
            add // 0
          ' kapa_response.json)

          # Combine all integrations in the desired order
          integration_text="\n\n*Questions by Integration:*"
          integration_text+="\n• Docs widget: $docs_ai_count"
          integration_text+="\n• Community Slack channel: $community_count"
          integration_text+="\n• Internal Slack channel: $internal_slack_count"

          # Send to Slack using curl
          curl -X POST -H 'Content-type: application/json' \
            --data "{
              \"text\": \"📊 *Weekly Kapa.ai Usage Report*\n${period_text}\n\n${key_metrics}\n\n${quality_metrics}${success_text}${integration_text}\"
            }" \
            ${{ secrets.SLACK_WEBHOOK_URL }}
52 changes: 51 additions & 1 deletion docs/actions-and-automations/define-automations/examples.md
@@ -250,4 +250,54 @@ The following example uses a [`Send Slack message`](/actions-and-automations/set
- Note the `condition` block that checks if the status of the action run has changed from `IN_PROGRESS` to `FAILURE`. Only this specific change will trigger the automation.
- The `invocationMethod` specifies a webhook that sends a message to a Slack channel.
- The message includes details about the failed deployment, such as the service name, image, and environment.
- The message also includes a link to the action run page in Port.

---

## Approve a self-service action based on an input value

When configuring [manual approval](/actions-and-automations/create-self-service-experiences/set-self-service-actions-rbac/#configure-manual-approval-for-actions) for a self-service action, in some cases you may want to automatically approve/decline the action if a certain input value is provided.

For example, the following automation will automatically approve a deployment if the `type` input is set to `Testing`:

```json showLineNumbers
{
  "identifier": "approve_deployment_based_on_input",
  "title": "Automatically approve testing deployments",
  "description": "Automatically approve testing deployments",
  "trigger": {
    "type": "automation",
    "event": {
      "type": "RUN_CREATED",
      "actionIdentifier": "deploy_service"
    },
    "condition": {
      "type": "JQ",
      "expressions": [
        ".diff.after.properties.type == \"Testing\""
      ],
      "combinator": "and"
    }
  },
  "invocationMethod": {
    "type": "WEBHOOK",
    "url": "https://api.getport.io/v1/actions/runs/{{.event.diff.after.id}}/approval",
    "agent": false,
    "synchronized": true,
    "method": "PATCH",
    "headers": {},
    "body": {
      "status": "APPROVE",
      "description": "Approved"
    }
  },
  "publish": true
}
```

### Explanation

- This automation is triggered whenever a new run is created for the `deploy_service` action.
- The `condition` block checks if the `type` input is set to `Testing`, and will only trigger the automation if this is the case.
- The backend of the automation makes a direct API call to approve the relevant run (an equivalent manual `curl` call is sketched below).
- Note that if the `condition` is not met, the automation will not be triggered.
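
For reference, the call that the automation's backend makes can also be performed manually against Port's API. The following is a minimal sketch, assuming the EU API base URL and a standard bearer token; the run ID and token below are placeholders.

```bash
# Manually approve an action run - equivalent to the automation's invocationMethod above.
# <RUN_ID> and <PORT_API_TOKEN> are placeholders; replace them with a real run ID and access token.
curl -X PATCH \
  "https://api.getport.io/v1/actions/runs/<RUN_ID>/approval" \
  -H "Authorization: Bearer <PORT_API_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
    "status": "APPROVE",
    "description": "Approved"
  }'
```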
@@ -22,7 +22,7 @@ This page will introduce the agent and guide you through the installation and co

## Prerequisites

- Connection credentials to Kafka are required. To obtain them, contact us via the intercom bubble in the bottom-right corner, or via our [community Slack](https://www.getport.io/community).
- Connection credentials to Kafka are required. To obtain them, contact us using Intercom/Slack/mail to [support@getport.io](mailto:support@getport.io).
- [Helm](https://helm.sh) must be installed in order to install the relevant chart.
- In order to trigger a GitLab Pipeline, you need to have a [GitLab trigger token](https://docs.gitlab.com/ee/ci/triggers/).

@@ -24,7 +24,7 @@ The steps shown in the image above are as follows:
## Further steps

- See the [Triggering example](#Triggering-example) for Circle CI.
- Contact us through Intercom to set up a Kafka topic for your organization.
- Contact us using Intercom/Slack/mail to [support@getport.io](mailto:support@getport.io) to set up a Kafka topic for your organization.
- [Install the Port execution agent to trigger the Circle CI pipeline](#Installation).

## Triggering example
@@ -9,7 +9,7 @@ In this guide, we will show how to deploy a new `AWS Lambda function`, that will
## Prerequisites

:::note
To follow this example, please contact us via Intercom to receive a dedicated Kafka topic.
To follow this example, please contact us using Intercom/Slack/mail to [support@getport.io](mailto:support@getport.io) to receive a dedicated Kafka topic.
:::

- AWS CLI installed and configured to your desired AWS account;
@@ -13,7 +13,7 @@ Our Port agent is open source - see it [**here**](https://github.com/port-labs/p
:::

:::note
To use the execution agent, please contact us via Intercom to receive a dedicated Kafka topic.
To use the execution agent, please contact us using Intercom/Slack/mail to [support@getport.io](mailto:support@getport.io) to receive a dedicated Kafka topic.
:::

The data flow when using the Port execution agent is as follows:
@@ -30,7 +30,7 @@ token** or **team token**.
## Further steps

- See the [Triggering example](#Triggering-example) for Terraform Cloud.
- Contact us through Intercom to set up a Kafka topic for your organization.
- Contact us using Intercom/Slack/mail to [support@getport.io](mailto:support@getport.io) to set up a Kafka topic for your organization.
- [Install the Port execution agent to trigger the Terraform Cloud Run](#Installation).

## Triggering example
71 changes: 71 additions & 0 deletions docs/api-reference/search-a-blueprints-entities.api.mdx

Large diffs are not rendered by default.

5 changes: 5 additions & 0 deletions docs/api-reference/search-entities.api.mdx
@@ -34,6 +34,11 @@ import Heading from "@theme/Heading";

</MethodEndpoint>

:::tip New search route available
The [search a blueprint's entities](/api-reference/search-a-blueprints-entities) route offers more options, such as paginated results and the ability to filter results by relations/scorecards.

We recommend using it instead of this route.
:::


This route allows you to search for entities in your software catalog based on a given set of rules.<br/><br/>To learn more about entities, check out the [documentation](https://docs.port.io/build-your-software-catalog/sync-data-to-catalog/#entities).<br/><br/>For more details about Port's search mechanism, rules, and operators - see the [search & query documentation](https://docs.port.io/search-and-query/).
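
As a rough illustration of the rule format this route accepts, here is a minimal sketch of a search request. It assumes the EU API base URL, a bearer token, and the endpoint path shown above; the blueprint and title values are placeholders, so adjust them to your catalog.

```bash
# Illustrative sketch only - base URL, token and the exact endpoint path are assumptions to verify.
# Searches for entities of the "service" blueprint whose title contains "payment".
curl -X POST "https://api.getport.io/v1/entities/search" \
  -H "Authorization: Bearer <PORT_API_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
    "combinator": "and",
    "rules": [
      { "property": "$blueprint", "operator": "=", "value": "service" },
      { "property": "$title", "operator": "contains", "value": "payment" }
    ]
  }'
```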
54 changes: 54 additions & 0 deletions docs/api-reference/security.mdx
@@ -0,0 +1,54 @@
---
id: security
title: "Security"
description: ""
sidebar_label: Security
sidebar_position: 2
hide_title: true
---

import EnterpriseNote from "/docs/generalTemplates/_enterprise_feature_notice.md";

# Security

This page includes security information about the Port API and how your infrastructure interacts with it.

## Address Allowlisting

Port's REST API is served through a network of Application Load Balancers (ALBs), and as such is not served from a fixed list of IP addresses.

If your internal network places strict limitations on the addresses to which outbound requests can be made, you will need to add the following addresses to your network's allowlist:

- For Port's EU tenant:
- [https://api.getport.io](https://api.getport.io)
- [https://ingest.getport.io](https://ingest.getport.io)
- For Port's US tenant:
- [https://api.us.getport.io](https://api.us.getport.io)
- [https://ingest.us.getport.io](https://ingest.us.getport.io)
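
Once these addresses are allowlisted, a quick check from inside your network can confirm that outbound HTTPS traffic reaches Port. This is a minimal sketch for the EU tenant, assuming `curl` is available on the host; use the US addresses instead if that is your tenant.

```bash
# Confirm that outbound HTTPS requests to Port's API and ingest endpoints are allowed (EU tenant).
# Any HTTP status code (even 401 or 404) means the connection itself went through.
curl -sS -o /dev/null -w "api.getport.io    -> HTTP %{http_code}\n" https://api.getport.io
curl -sS -o /dev/null -w "ingest.getport.io -> HTTP %{http_code}\n" https://ingest.getport.io
```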

## AWS PrivateLink

<EnterpriseNote />

Port supports AWS PrivateLink to provide secure connectivity between your AWS [VPC](https://docs.aws.amazon.com/vpc/latest/userguide/what-is-amazon-vpc.html) and Port's API.

AWS [PrivateLink](https://aws.amazon.com/privatelink/) is an AWS service that provides private connectivity between resources in different AWS VPCs.

With AWS PrivateLink you can make requests to Port's API from your AWS VPC, while ensuring that the traffic remains within the AWS data center, and without exposing your data to the internet.

### Setup

To set up an AWS PrivateLink connection between your VPC and Port, follow [this AWS guide](https://docs.aws.amazon.com/vpc/latest/privatelink/create-endpoint-service.html#connect-to-endpoint-service).

In step 5 of the guide, you are required to provide the **Service name**, which is the PrivateLink service address provided by Port. The Port API PrivateLink service names are:

| Service | Public Address | PrivateLink Region | PrivateLink Service Name |
| ------------- | ------------------------------------------------------------ | ------------------ | --------------------------------------------------------- |
| Port API EU | [https://api.getport.io](https://api.getport.io) | `eu-west-1` | `com.amazonaws.vpce.eu-west-1.vpce-svc-02addcefd47049d3f` |
| Ingest API EU | [https://ingest.getport.io](https://ingest.getport.io) | `eu-west-1` | `com.amazonaws.vpce.eu-west-1.vpce-svc-01c8de843e5776402` |
| Port API US | [https://api.us.getport.io](https://api.us.getport.io) | `us-east-1` | `com.amazonaws.vpce.us-east-1.vpce-svc-047de27e65a0392a7` |
| Ingest API US | [https://ingest.us.getport.io](https://ingest.us.getport.io) | `us-east-1` | `com.amazonaws.vpce.us-east-1.vpce-svc-052d7ea18ebda9652` |

**Note:** If your AWS resources are in a different region than the ones Port's PrivateLink services are hosted in, refer to step 6 of the guide to set up a cross-region connection.
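
If you prefer the AWS CLI over the console flow described in the guide, the following is a minimal sketch of creating the interface endpoint for Port's EU API service. The VPC, subnet, and security group IDs are placeholders, and depending on your network you may need additional options (for example, private DNS settings).

```bash
# Create an interface VPC endpoint to Port's EU API PrivateLink service.
# vpc-..., subnet-... and sg-... values are placeholders - replace them with your own resource IDs.
aws ec2 create-vpc-endpoint \
  --region eu-west-1 \
  --vpc-id vpc-0123456789abcdef0 \
  --vpc-endpoint-type Interface \
  --service-name com.amazonaws.vpce.eu-west-1.vpce-svc-02addcefd47049d3f \
  --subnet-ids subnet-0123456789abcdef0 \
  --security-group-ids sg-0123456789abcdef0
```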

After following the guide, please contact Port's support team using Intercom/Slack/mail to [support@getport.io](mailto:support@getport.io) and we will finalize the setup.
10 changes: 10 additions & 0 deletions docs/api-reference/sidebar.ts
@@ -10,6 +10,10 @@ const sidebar: SidebarsConfig = {
      type: "doc",
      id: "api-reference/rate-limits",
    },
+   {
+     type: "doc",
+     id: "api-reference/security",
+   },
    {
      type: "html",
      value: '<hr style="margin: 0.8rem">',
@@ -150,6 +154,12 @@
      label: "Delete all entities of a blueprint",
      className: "api-method delete",
    },
+   {
+     type: "doc",
+     id: "api-reference/search-a-blueprints-entities",
+     label: "Search a blueprint's entities",
+     className: "api-method post",
+   },
    {
      type: "doc",
      id: "api-reference/search-entities",
@@ -29,12 +29,15 @@ This integration allows you to:
- [`Monitor`](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-details)
- [`Service`](https://docs.datadoghq.com/api/latest/service-definition/#get-all-service-definitions)
- [`SLO`](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-all-slos)
- [`SLO History`](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-history)
- [`Service Metric`](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-points)
- [`SLO History`](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-history)*
- [`Service Metric`](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-points)*
- [`User`](https://docs.datadoghq.com/api/latest/users/#list-all-users)
- [`Team`](https://docs.datadoghq.com/api/latest/teams/#get-all-teams)
<br />

*_SLO History and Service Metric resources are not collected out of the box. Follow the examples [here](https://docs.port.io/build-your-software-catalog/sync-data-to-catalog/apm-alerting/datadog/examples) to configure blueprints and resource mappings._

<br />

## Setup

@@ -9,14 +9,12 @@ resources:
  port:
    entity:
      mappings:
-       identifier: .Identifier
-       title: .Properties.StackId | split("/")[1]
+       identifier: .StackId
+       title: .StackName
        blueprint: '"cloudformationStack"'
        properties:
-         link: >-
-           'https://console.aws.amazon.com/go/view?arn=' +
-           .Properties.StackId
-         arn: .Properties.StackId
+         link: '"https://console.aws.amazon.com/go/view?arn=" + .StackId'
+         arn: .StackId
          kind: .__Kind
          region: .__Region
        relations:
@@ -27,6 +27,7 @@ import DynamoDBBlueprint from './storage/\_dynamodb.mdx'
import ElasticacheBlueprint from './storage/\_elasticache.mdx'
import RDSBlueprint from './storage/\_rds.mdx'
import StorageAppConfig from './storage/\_port_app_config.mdx'
import UnsupportedResources from './unsupported/\_resources.mdx'

# Mapping Extra Resources

@@ -36,15 +37,29 @@ This page will help you understand what kind of AWS resources are supported by t

## Is the resource supported by the AWS Integration?

The AWS Integration is relying on AWS's Cloud Control API. That means:
The AWS Integration relies on AWS's Cloud Control API. That means:

1. Does the type of resource I want to ingest listed [here](https://docs.aws.amazon.com/cloudcontrolapi/latest/userguide/supported-resources.html)?
1. Is the type of resource I want to ingest listed [here](https://docs.aws.amazon.com/cloudcontrolapi/latest/userguide/supported-resources.html) and supported by the list method?
- **If so**: Great! It's supported.
- **If not**: Please contact us or contribute by [adding support](https://ocean.getport.io/develop-an-integration/) to [the integration](https://github.com/port-labs/ocean/tree/main/integrations/aws) yourself.

:::info Resource limitation
In Cloud Control, some resources require an input in order to be queried. Currently, the integration does not support passing these inputs, which means those resources are currently not supported.
:::
For the full list of supported resources, refer to [AWS Cloud Control API Supported Resources](https://docs.aws.amazon.com/cloudcontrolapi/latest/userguide/supported-resources.html).
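
A quick way to check whether a specific resource type supports the `list` method is to try it directly with the AWS CLI's Cloud Control commands. This is a rough sketch that assumes a configured AWS CLI v2 with credentials for the target account; the resource type below is only an example.

```bash
# Check whether Cloud Control can list a given resource type (here: S3 buckets).
# Types that lack a list handler, or that require extra inputs, fail with an explanatory error.
aws cloudcontrol list-resources \
  --type-name "AWS::S3::Bucket" \
  --max-items 1
```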

## Resources supported by Cloud Control API but not supported in AWS Integration

The AWS Integration relies on AWS's Cloud Control API. While many resources are supported, some require additional inputs to be queried, which the integration currently does not support.
Below is a list of the resources that are unsupported due to this limitation.

### List of unsupported resources

<UnsupportedResources/>

### What can you do?

- **Contact us**: If you require support for any of these resources, please reach out to our team for assistance.
- **Submit a feature request**: Contribute to our integration's improvement by [submitting a feature request](https://roadmap.getport.io/).
- **Contribute directly**: Developers are encouraged to [contribute](https://ocean.getport.io/develop-an-integration/) by adding support for these resources [here](https://github.com/port-labs/ocean/tree/main/integrations/aws).


## Configuration
