diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 114dd8df44ab..b1219bd72fb5 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -6,14 +6,14 @@ repos:
name: isort (python)
- repo: https://github.com/psf/black
- rev: 24.4.2
+ rev: 24.8.0
hooks:
- id: black
language_version: python3
exclude: migrations
- repo: https://github.com/pycqa/flake8
- rev: 7.1.0
+ rev: 7.1.1
hooks:
- id: flake8
name: flake8
@@ -31,6 +31,8 @@ repos:
hooks:
- id: prettier
exclude: ^(frontend/|CHANGELOG.md|.github/docker_build_comment_template.md)
+ additional_dependencies:
+ - prettier@3.3.3 # SEE: https://github.com/pre-commit/pre-commit/issues/3133
- repo: https://github.com/python-poetry/poetry
rev: 1.8.0
diff --git a/.release-please-manifest.json b/.release-please-manifest.json
index b900f8a162a6..04bca1732ea1 100644
--- a/.release-please-manifest.json
+++ b/.release-please-manifest.json
@@ -1,3 +1,3 @@
{
- ".": "2.134.0"
+ ".": "2.136.0"
}
\ No newline at end of file
diff --git a/CHANGELOG.md b/CHANGELOG.md
index affdbff3ed21..3669c6f0b6c5 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,56 @@
# Changelog
+## [2.136.0](https://github.com/Flagsmith/flagsmith/compare/v2.135.1...v2.136.0) (2024-08-13)
+
+
+### Features
+
+* Add automatic tagging for github integration ([#4028](https://github.com/Flagsmith/flagsmith/issues/4028)) ([7920e8e](https://github.com/Flagsmith/flagsmith/commit/7920e8e22e15fc2f91dbd56582679b1f3064e4a9))
+* Add tags for GitHub integration FE ([#4035](https://github.com/Flagsmith/flagsmith/issues/4035)) ([3c46a31](https://github.com/Flagsmith/flagsmith/commit/3c46a31f6060f7a12206fa27409073da946130d8))
+* Support Aptible deployments ([#4340](https://github.com/Flagsmith/flagsmith/issues/4340)) ([3b47ae0](https://github.com/Flagsmith/flagsmith/commit/3b47ae07848c1330210d18bd8bb4194fa5d9262e))
+* Use environment feature state instead of fetching feature states ([#4188](https://github.com/Flagsmith/flagsmith/issues/4188)) ([b1d49a6](https://github.com/Flagsmith/flagsmith/commit/b1d49a63b5d7c1fe319185586f486829626840cd))
+
+
+### Bug Fixes
+
+* ensure that usage notification logic is independent of other organisations notifications ([#4480](https://github.com/Flagsmith/flagsmith/issues/4480)) ([6660af5](https://github.com/Flagsmith/flagsmith/commit/6660af56047e8d7083486b2dab20df44d0fc9303))
+* Remove warning about non-unique health namespace ([#4479](https://github.com/Flagsmith/flagsmith/issues/4479)) ([6ef7a74](https://github.com/Flagsmith/flagsmith/commit/6ef7a742f0f56aef2335da380770dc7f307d53c5))
+
+
+### Infrastructure (Flagsmith SaaS Only)
+
+* reduce task retention days to 7 ([#4484](https://github.com/Flagsmith/flagsmith/issues/4484)) ([4349149](https://github.com/Flagsmith/flagsmith/commit/4349149700ad92e824c115c93f1912c7009c9aa2))
+
+## [2.135.1](https://github.com/Flagsmith/flagsmith/compare/v2.135.0...v2.135.1) (2024-08-12)
+
+
+### Infrastructure (Flagsmith SaaS Only)
+
+* bump feature evaluation cache to 300 ([#4471](https://github.com/Flagsmith/flagsmith/issues/4471)) ([abbf24b](https://github.com/Flagsmith/flagsmith/commit/abbf24bf987e8f74cb2ecf3ec3d82456d9892654))
+
+## [2.135.0](https://github.com/Flagsmith/flagsmith/compare/v2.134.1...v2.135.0) (2024-08-09)
+
+
+### Features
+
+* **app_analytics:** Add cache for feature evaluation ([#4418](https://github.com/Flagsmith/flagsmith/issues/4418)) ([2dfbf99](https://github.com/Flagsmith/flagsmith/commit/2dfbf99cdc8d8529aa487a4e471df46c0dbc6878))
+* Support blank identifiers, assume transient ([#4449](https://github.com/Flagsmith/flagsmith/issues/4449)) ([0014a5b](https://github.com/Flagsmith/flagsmith/commit/0014a5b4312d1ee7d7dd7b914434f26408ee18b7))
+
+
+### Bug Fixes
+
+* Identity overrides are not deleted when deleting Edge identities ([#4460](https://github.com/Flagsmith/flagsmith/issues/4460)) ([2ab73ed](https://github.com/Flagsmith/flagsmith/commit/2ab73edc7352bec8324eb808ba70d6508fe5eed6))
+* show correct SAML Frontend URL on edit ([#4462](https://github.com/Flagsmith/flagsmith/issues/4462)) ([13ad7ef](https://github.com/Flagsmith/flagsmith/commit/13ad7ef7e6613bdd640cdfca7ce99a892b3893be))
+
+## [2.134.1](https://github.com/Flagsmith/flagsmith/compare/v2.134.0...v2.134.1) (2024-08-07)
+
+
+### Bug Fixes
+
+* don't allow bypassing `ALLOW_REGISTRATION_WITHOUT_INVITE` behaviour ([#4454](https://github.com/Flagsmith/flagsmith/issues/4454)) ([0e6deec](https://github.com/Flagsmith/flagsmith/commit/0e6deec6404c3e78edf5f36b36ea0f2dcef3dd06))
+* protect get environment document endpoint ([#4459](https://github.com/Flagsmith/flagsmith/issues/4459)) ([bee01c7](https://github.com/Flagsmith/flagsmith/commit/bee01c7f21cae19e7665ede3284f96989d33940f))
+* Set grace period to a singular event ([#4455](https://github.com/Flagsmith/flagsmith/issues/4455)) ([3225c47](https://github.com/Flagsmith/flagsmith/commit/3225c47043f9647a7426b7f05890bde29b681acc))
+
## [2.134.0](https://github.com/Flagsmith/flagsmith/compare/v2.133.1...v2.134.0) (2024-08-02)
diff --git a/api/app/settings/common.py b/api/app/settings/common.py
index b5705b60770a..c3fdc55384c7 100644
--- a/api/app/settings/common.py
+++ b/api/app/settings/common.py
@@ -330,6 +330,11 @@
USE_POSTGRES_FOR_ANALYTICS = env.bool("USE_POSTGRES_FOR_ANALYTICS", default=False)
USE_CACHE_FOR_USAGE_DATA = env.bool("USE_CACHE_FOR_USAGE_DATA", default=False)
+PG_API_USAGE_CACHE_SECONDS = env.int("PG_API_USAGE_CACHE_SECONDS", default=60)
+
+FEATURE_EVALUATION_CACHE_SECONDS = env.int(
+ "FEATURE_EVALUATION_CACHE_SECONDS", default=60
+)
ENABLE_API_USAGE_TRACKING = env.bool("ENABLE_API_USAGE_TRACKING", default=True)
@@ -512,6 +517,10 @@
LOGOUT_URL = "/admin/logout/"
# Enable E2E tests
+E2E_TEST_AUTH_TOKEN = env.str("E2E_TEST_AUTH_TOKEN", default=None)
+if E2E_TEST_AUTH_TOKEN is not None:
+ MIDDLEWARE.append("e2etests.middleware.E2ETestMiddleware")
+
ENABLE_FE_E2E = env.bool("ENABLE_FE_E2E", default=False)
# Email associated with user that is used by front end for end to end testing purposes
E2E_TEST_EMAIL_DOMAIN = "flagsmithe2etestdomain.io"
@@ -521,6 +530,18 @@
E2E_CHANGE_EMAIL_USER = f"e2e_change_email@{E2E_TEST_EMAIL_DOMAIN}"
# User email address used for the rest of the E2E tests
E2E_USER = f"e2e_user@{E2E_TEST_EMAIL_DOMAIN}"
+E2E_NON_ADMIN_USER_WITH_ORG_PERMISSIONS = (
+ f"e2e_non_admin_user_with_org_permissions@{E2E_TEST_EMAIL_DOMAIN}"
+)
+E2E_NON_ADMIN_USER_WITH_PROJECT_PERMISSIONS = (
+ f"e2e_non_admin_user_with_project_permissions@{E2E_TEST_EMAIL_DOMAIN}"
+)
+E2E_NON_ADMIN_USER_WITH_ENV_PERMISSIONS = (
+ f"e2e_non_admin_user_with_env_permissions@{E2E_TEST_EMAIL_DOMAIN}"
+)
+E2E_NON_ADMIN_USER_WITH_A_ROLE = (
+ f"e2e_non_admin_user_with_a_role@{E2E_TEST_EMAIL_DOMAIN}"
+)
# Identity for E2E segment tests
E2E_IDENTITY = "test-identity"
diff --git a/api/app/urls.py b/api/app/urls.py
index a110c7d111c0..65caa58df8ba 100644
--- a/api/app/urls.py
+++ b/api/app/urls.py
@@ -15,6 +15,9 @@
re_path(r"^api/v2/", include("api.urls.v2", namespace="api-v2")),
re_path(r"^admin/", admin.site.urls),
re_path(r"^health", include("health_check.urls", namespace="health")),
+ # Aptible health checks must be on /healthcheck and cannot redirect
+ # see https://www.aptible.com/docs/core-concepts/apps/connecting-to-apps/app-endpoints/https-endpoints/health-checks
+ path("healthcheck", include("health_check.urls", namespace="aptible")),
re_path(r"^version", views.version_info, name="version-info"),
re_path(
r"^sales-dashboard/",
diff --git a/api/app_analytics/analytics_db_service.py b/api/app_analytics/analytics_db_service.py
index 5ccd612247db..59933d808966 100644
--- a/api/app_analytics/analytics_db_service.py
+++ b/api/app_analytics/analytics_db_service.py
@@ -35,8 +35,7 @@ def get_usage_data(
) -> list[UsageData]:
now = timezone.now()
- date_stop = date_start = None
- period_starts_at = period_ends_at = None
+ date_start = date_stop = None
match period:
case constants.CURRENT_BILLING_PERIOD:
@@ -47,10 +46,8 @@ def get_usage_data(
days=30
)
month_delta = relativedelta(now, starts_at).months
- period_starts_at = relativedelta(months=month_delta) + starts_at
- period_ends_at = now
- date_start = f"-{(now - period_starts_at).days}d"
- date_stop = "now()"
+ date_start = relativedelta(months=month_delta) + starts_at
+ date_stop = now
case constants.PREVIOUS_BILLING_PERIOD:
if not getattr(organisation, "subscription_information_cache", None):
@@ -61,16 +58,12 @@ def get_usage_data(
)
month_delta = relativedelta(now, starts_at).months - 1
month_delta += relativedelta(now, starts_at).years * 12
- period_starts_at = relativedelta(months=month_delta) + starts_at
- period_ends_at = relativedelta(months=month_delta + 1) + starts_at
- date_start = f"-{(now - period_starts_at).days}d"
- date_stop = f"-{(now - period_ends_at).days}d"
+ date_start = relativedelta(months=month_delta) + starts_at
+ date_stop = relativedelta(months=month_delta + 1) + starts_at
case constants.NINETY_DAY_PERIOD:
- period_starts_at = now - relativedelta(days=90)
- period_ends_at = now
- date_start = "-90d"
- date_stop = "now()"
+ date_start = now - relativedelta(days=90)
+ date_stop = now
if settings.USE_POSTGRES_FOR_ANALYTICS:
kwargs = {
@@ -79,10 +72,10 @@ def get_usage_data(
"project_id": project_id,
}
- if period_starts_at:
- assert period_ends_at
- kwargs["date_start"] = period_starts_at
- kwargs["date_stop"] = period_ends_at
+ if date_start:
+ assert date_stop
+ kwargs["date_start"] = date_start
+ kwargs["date_stop"] = date_stop
return get_usage_data_from_local_db(**kwargs)
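Reviewer note: the branches above now compute timezone-aware datetime bounds instead of Influx-style duration strings. A minimal sketch of the month-delta arithmetic, using hypothetical subscription dates (not taken from this diff):

```python
from datetime import datetime, timezone
from dateutil.relativedelta import relativedelta

# Hypothetical dates: a billing term that started on 15 Jan, queried on 2 Aug.
starts_at = datetime(2024, 1, 15, tzinfo=timezone.utc)
now = datetime(2024, 8, 2, tzinfo=timezone.utc)

# Whole months elapsed since the term started (the PREVIOUS_BILLING_PERIOD
# branch also folds in the years component).
delta = relativedelta(now, starts_at)
month_delta = delta.months + delta.years * 12

date_start = starts_at + relativedelta(months=month_delta)
date_stop = now
print(date_start.date(), date_stop.date())  # 2024-07-15 2024-08-02
```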
diff --git a/api/app_analytics/cache.py b/api/app_analytics/cache.py
index aea7c84f7184..f3f2a74416d7 100644
--- a/api/app_analytics/cache.py
+++ b/api/app_analytics/cache.py
@@ -1,7 +1,9 @@
-from app_analytics.tasks import track_request
-from django.utils import timezone
+from collections import defaultdict
-CACHE_FLUSH_INTERVAL = 60 # seconds
+from app_analytics.tasks import track_feature_evaluation, track_request
+from app_analytics.track import track_feature_evaluation_influxdb
+from django.conf import settings
+from django.utils import timezone
class APIUsageCache:
@@ -29,5 +31,52 @@ def track_request(self, resource: int, host: str, environment_key: str):
self._cache[key] = 1
else:
self._cache[key] += 1
- if (timezone.now() - self._last_flushed_at).seconds > CACHE_FLUSH_INTERVAL:
+ if (
+ timezone.now() - self._last_flushed_at
+ ).seconds > settings.PG_API_USAGE_CACHE_SECONDS:
+ self._flush()
+
+
+class FeatureEvaluationCache:
+ def __init__(self):
+ self._cache = {}
+ self._last_flushed_at = timezone.now()
+
+ def _flush(self):
+ evaluation_data = defaultdict(dict)
+ for (environment_id, feature_name), eval_count in self._cache.items():
+ evaluation_data[environment_id][feature_name] = eval_count
+
+ for environment_id, feature_evaluations in evaluation_data.items():
+ if settings.USE_POSTGRES_FOR_ANALYTICS:
+ track_feature_evaluation.delay(
+ kwargs={
+ "environment_id": environment_id,
+ "feature_evaluations": feature_evaluations,
+ }
+ )
+
+ elif settings.INFLUXDB_TOKEN:
+ track_feature_evaluation_influxdb.delay(
+ kwargs={
+ "environment_id": environment_id,
+ "feature_evaluations": feature_evaluations,
+ }
+ )
+
+ self._cache = {}
+ self._last_flushed_at = timezone.now()
+
+ def track_feature_evaluation(
+ self, environment_id: int, feature_name: str, evaluation_count: int
+ ):
+ key = (environment_id, feature_name)
+ if key not in self._cache:
+ self._cache[key] = evaluation_count
+ else:
+ self._cache[key] += evaluation_count
+
+ if (
+ timezone.now() - self._last_flushed_at
+ ).seconds > settings.FEATURE_EVALUATION_CACHE_SECONDS:
self._flush()
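Reviewer note: a short sketch of how the new cache is driven (the wiring mirrors `api/app_analytics/views.py` later in this diff; the environment id and counts are illustrative):

```python
from app_analytics.cache import FeatureEvaluationCache

# Module-level instance shared across requests, as in views.py below.
feature_evaluation_cache = FeatureEvaluationCache()

# One call per (feature, count) pair reported by an SDK client.
for feature_name, eval_count in {"my_feature": 13, "other_feature": 2}.items():
    feature_evaluation_cache.track_feature_evaluation(
        environment_id=1,
        feature_name=feature_name,
        evaluation_count=eval_count,
    )
# Counts accumulate in memory and are handed off to the Postgres or InfluxDB
# task once FEATURE_EVALUATION_CACHE_SECONDS has elapsed.
```

Note the flush happens lazily on the next tracking call after the interval elapses, not on a timer.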
diff --git a/api/app_analytics/influxdb_wrapper.py b/api/app_analytics/influxdb_wrapper.py
index 4a88c596338b..4a61ece6ee17 100644
--- a/api/app_analytics/influxdb_wrapper.py
+++ b/api/app_analytics/influxdb_wrapper.py
@@ -1,8 +1,10 @@
import logging
import typing
from collections import defaultdict
+from datetime import datetime, timedelta
from django.conf import settings
+from django.utils import timezone
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.exceptions import InfluxDBError
from influxdb_client.client.write_api import SYNCHRONOUS
@@ -20,11 +22,6 @@
influx_org = settings.INFLUXDB_ORG
read_bucket = settings.INFLUXDB_BUCKET + "_downsampled_15m"
-range_bucket_mappings = {
- "-24h": settings.INFLUXDB_BUCKET + "_downsampled_15m",
- "-7d": settings.INFLUXDB_BUCKET + "_downsampled_15m",
- "-30d": settings.INFLUXDB_BUCKET + "_downsampled_1h",
-}
retries = Retry(connect=3, read=3, redirect=3)
# Set a timeout to prevent threads being potentially stuck open due to network weirdness
influxdb_client = InfluxDBClient(
@@ -43,6 +40,13 @@
)
+def get_range_bucket_mappings(date_start: datetime) -> str:
+ now = timezone.now()
+ if (now - date_start).days > 10:
+ return settings.INFLUXDB_BUCKET + "_downsampled_1h"
+ return settings.INFLUXDB_BUCKET + "_downsampled_15m"
+
+
class InfluxDBWrapper:
def __init__(self, name):
self.name = name
@@ -76,15 +80,22 @@ def write(self):
@staticmethod
def influx_query_manager(
- date_start: str = "-30d",
- date_stop: str = "now()",
+ date_start: datetime | None = None,
+ date_stop: datetime | None = None,
drop_columns: typing.Tuple[str, ...] = DEFAULT_DROP_COLUMNS,
filters: str = "|> filter(fn:(r) => r._measurement == 'api_call')",
extra: str = "",
bucket: str = read_bucket,
):
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
+
+ if date_stop is None:
+ date_stop = now
+
        # Influx throws an error for an empty range, so just return an empty list.
- if date_start == "-0d" and date_stop == "now()":
+ if date_start == date_stop:
return []
query_api = influxdb_client.query_api()
@@ -92,7 +103,7 @@ def influx_query_manager(
query = (
f'from(bucket:"{bucket}")'
- f" |> range(start: {date_start}, stop: {date_stop})"
+ f" |> range(start: {date_start.isoformat()}, stop: {date_stop.isoformat()})"
f" {filters}"
f" |> drop(columns: {drop_columns_input})"
f"{extra}"
@@ -108,7 +119,9 @@ def influx_query_manager(
def get_events_for_organisation(
- organisation_id: id, date_start: str = "-30d", date_stop: str = "now()"
+    organisation_id: int,
+ date_start: datetime | None = None,
+ date_stop: datetime | None = None,
) -> int:
"""
Query influx db for usage for given organisation id
@@ -116,6 +129,13 @@ def get_events_for_organisation(
:param organisation_id: an id of the organisation to get usage for
:return: a number of request counts for organisation
"""
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
+
+ if date_stop is None:
+ date_stop = now
+
result = InfluxDBWrapper.influx_query_manager(
filters=build_filter_string(
[
@@ -145,7 +165,9 @@ def get_events_for_organisation(
def get_event_list_for_organisation(
- organisation_id: int, date_start: str = "-30d", date_stop: str = "now()"
+ organisation_id: int,
+ date_start: datetime | None = None,
+ date_stop: datetime | None = None,
) -> tuple[dict[str, list[int]], list[str]]:
"""
Query influx db for usage for given organisation id
@@ -154,6 +176,13 @@ def get_event_list_for_organisation(
:return: a number of request counts for organisation in chart.js scheme
"""
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
+
+ if date_stop is None:
+ date_stop = now
+
results = InfluxDBWrapper.influx_query_manager(
filters=f'|> filter(fn:(r) => r._measurement == "api_call") \
|> filter(fn: (r) => r["organisation_id"] == "{organisation_id}")',
@@ -163,15 +192,12 @@ def get_event_list_for_organisation(
)
dataset = defaultdict(list)
labels = []
+
+ date_difference = date_stop - date_start
+ required_records = date_difference.days + 1
for result in results:
for record in result.records:
dataset[record["resource"]].append(record["_value"])
- if date_stop == "now()":
- required_records = abs(int(date_start[:-1])) + 1
- else:
- required_records = (
- abs(int(date_start[:-1])) - abs(int(date_stop[:-1])) + 1
- )
if len(labels) != required_records:
labels.append(record.values["_time"].strftime("%Y-%m-%d"))
return dataset, labels
@@ -181,8 +207,8 @@ def get_multiple_event_list_for_organisation(
organisation_id: int,
project_id: int = None,
environment_id: int = None,
- date_start: str = "-30d",
- date_stop: str = "now()",
+ date_start: datetime | None = None,
+ date_stop: datetime | None = None,
) -> list[UsageData]:
"""
Query influx db for usage for given organisation id
@@ -193,6 +219,13 @@ def get_multiple_event_list_for_organisation(
:return: a number of requests for flags, traits, identities, environment-document
"""
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
+
+ if date_stop is None:
+ date_stop = now
+
filters = [
'r._measurement == "api_call"',
f'r["organisation_id"] == "{organisation_id}"',
@@ -227,9 +260,16 @@ def get_usage_data(
organisation_id: int,
project_id: int | None = None,
environment_id: int | None = None,
- date_start: str = "-30d",
- date_stop: str = "now()",
+ date_start: datetime | None = None,
+ date_stop: datetime | None = None,
) -> list[UsageData]:
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
+
+ if date_stop is None:
+ date_stop = now
+
events_list = get_multiple_event_list_for_organisation(
organisation_id=organisation_id,
project_id=project_id,
@@ -243,7 +283,7 @@ def get_usage_data(
def get_multiple_event_list_for_feature(
environment_id: int,
feature_name: str,
- date_start: str = "-30d",
+ date_start: datetime | None = None,
aggregate_every: str = "24h",
) -> list[dict]:
"""
@@ -264,11 +304,14 @@ def get_multiple_event_list_for_feature(
:param environment_id: an id of the environment to get usage for
:param feature_name: the name of the feature to get usage for
- :param date_start: the influx time period to filter on, e.g. -30d, -7d, etc.
+    :param date_start: the datetime to start filtering from
:param aggregate_every: the influx time period to aggregate the data by, e.g. 24h
:return: a list of dicts with feature and request count in a specific environment
"""
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
results = InfluxDBWrapper.influx_query_manager(
date_start=date_start,
@@ -297,15 +340,20 @@ def get_multiple_event_list_for_feature(
def get_feature_evaluation_data(
feature_name: str, environment_id: int, period: str = "30d"
) -> typing.List[FeatureEvaluationData]:
+ assert period.endswith("d")
+ days = int(period[:-1])
+ date_start = timezone.now() - timedelta(days=days)
data = get_multiple_event_list_for_feature(
feature_name=feature_name,
environment_id=environment_id,
- date_start=f"-{period}",
+ date_start=date_start,
)
return FeatureEvaluationDataSchema(many=True).load(data)
-def get_top_organisations(date_start: str, limit: str = ""):
+def get_top_organisations(
+ date_start: datetime | None = None, limit: str = ""
+) -> dict[int, int]:
"""
Query influx db top used organisations
@@ -315,10 +363,14 @@ def get_top_organisations(date_start: str, limit: str = ""):
:return: top organisations in descending order based on api calls.
"""
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
+
if limit:
limit = f"|> limit(n:{limit})"
- bucket = range_bucket_mappings[date_start]
+ bucket = get_range_bucket_mappings(date_start)
results = InfluxDBWrapper.influx_query_manager(
date_start=date_start,
bucket=bucket,
@@ -333,6 +385,7 @@ def get_top_organisations(date_start: str, limit: str = ""):
)
dataset = {}
+
for result in results:
for record in result.records:
try:
@@ -347,7 +400,9 @@ def get_top_organisations(date_start: str, limit: str = ""):
return dataset
-def get_current_api_usage(organisation_id: int, date_start: str) -> int:
+def get_current_api_usage(
+ organisation_id: int, date_start: datetime | None = None
+) -> int:
"""
Query influx db for api usage
@@ -356,6 +411,9 @@ def get_current_api_usage(organisation_id: int, date_start: str) -> int:
:return: number of current api calls
"""
+ now = timezone.now()
+ if date_start is None:
+ date_start = now - timedelta(days=30)
bucket = read_bucket
results = InfluxDBWrapper.influx_query_manager(
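Reviewer note: since `influx_query_manager` now receives datetimes, the rendered Flux query uses absolute RFC 3339 timestamps rather than relative durations. A sketch of the resulting range clause, with an illustrative bucket name and dates:

```python
from datetime import datetime, timedelta, timezone

date_stop = datetime(2024, 8, 13, tzinfo=timezone.utc)
date_start = date_stop - timedelta(days=30)

query = (
    'from(bucket:"app_analytics_downsampled_15m")'
    f" |> range(start: {date_start.isoformat()}, stop: {date_stop.isoformat()})"
)
print(query)
# from(bucket:"app_analytics_downsampled_15m")
#   |> range(start: 2024-07-14T00:00:00+00:00, stop: 2024-08-13T00:00:00+00:00)
```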
diff --git a/api/app_analytics/management/commands/sendapiusagetoinflux.py b/api/app_analytics/management/commands/sendapiusagetoinflux.py
index 92ecafbe184a..a7e00979cda8 100644
--- a/api/app_analytics/management/commands/sendapiusagetoinflux.py
+++ b/api/app_analytics/management/commands/sendapiusagetoinflux.py
@@ -101,3 +101,5 @@ def handle(
write_api = influxdb_client.write_api(write_options=SYNCHRONOUS)
write_api.write(bucket=bucket_name, record=record)
+
+ self.stdout.write(self.style.SUCCESS("Successfully sent data to InfluxDB"))
diff --git a/api/app_analytics/views.py b/api/app_analytics/views.py
index d89b76e3fb97..1f71efc33a2a 100644
--- a/api/app_analytics/views.py
+++ b/api/app_analytics/views.py
@@ -4,14 +4,9 @@
get_total_events_count,
get_usage_data,
)
-from app_analytics.tasks import (
- track_feature_evaluation,
- track_feature_evaluation_v2,
-)
-from app_analytics.track import (
- track_feature_evaluation_influxdb,
- track_feature_evaluation_influxdb_v2,
-)
+from app_analytics.cache import FeatureEvaluationCache
+from app_analytics.tasks import track_feature_evaluation_v2
+from app_analytics.track import track_feature_evaluation_influxdb_v2
from django.conf import settings
from drf_yasg.utils import swagger_auto_schema
from rest_framework import status
@@ -38,6 +33,7 @@
)
logger = logging.getLogger(__name__)
+feature_evaluation_cache = FeatureEvaluationCache()
class SDKAnalyticsFlagsV2(CreateAPIView):
@@ -141,26 +137,10 @@ def post(self, request, *args, **kwargs):
content_type="application/json",
status=status.HTTP_200_OK,
)
-
- if settings.USE_POSTGRES_FOR_ANALYTICS:
- track_feature_evaluation.delay(
- args=(
- request.environment.id,
- request.data,
- )
+        for feature_name, eval_count in request.data.items():
+ feature_evaluation_cache.track_feature_evaluation(
+ request.environment.id, feature_name, eval_count
)
- elif settings.INFLUXDB_TOKEN:
- # Due to load issues on the task processor, we
- # explicitly run this task in a separate thread.
- # TODO: batch influx data to prevent large amounts
- # of tasks.
- track_feature_evaluation_influxdb.run_in_thread(
- args=(
- request.environment.id,
- request.data,
- )
- )
-
return Response(status=status.HTTP_200_OK)
def _is_data_valid(self) -> bool:
diff --git a/api/conftest.py b/api/conftest.py
index ab0aa198e9e3..ff6732ee44d3 100644
--- a/api/conftest.py
+++ b/api/conftest.py
@@ -1018,14 +1018,20 @@ def flagsmith_environments_v2_table(dynamodb: DynamoDBServiceResource) -> Table:
@pytest.fixture()
-def feature_external_resource(
- feature: Feature, post_request_mock: MagicMock, mocker: MockerFixture
-) -> FeatureExternalResource:
- mocker.patch(
+def mock_github_client_generate_token(mocker: MockerFixture) -> MagicMock:
+ return mocker.patch(
"integrations.github.client.generate_token",
return_value="mocked_token",
)
+
+@pytest.fixture()
+def feature_external_resource(
+ feature: Feature,
+ post_request_mock: MagicMock,
+ mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
+) -> FeatureExternalResource:
return FeatureExternalResource.objects.create(
url="https://github.com/repositoryownertest/repositorynametest/issues/11",
type="GITHUB_ISSUE",
@@ -1035,16 +1041,26 @@ def feature_external_resource(
@pytest.fixture()
-def feature_with_value_external_resource(
- feature_with_value: Feature,
+def feature_external_resource_gh_pr(
+ feature: Feature,
post_request_mock: MagicMock,
mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> FeatureExternalResource:
- mocker.patch(
- "integrations.github.client.generate_token",
- return_value="mocked_token",
+ return FeatureExternalResource.objects.create(
+ url="https://github.com/repositoryownertest/repositorynametest/pull/1",
+ type="GITHUB_PR",
+ feature=feature,
+ metadata='{"status": "open"}',
)
+
+@pytest.fixture()
+def feature_with_value_external_resource(
+ feature_with_value: Feature,
+ post_request_mock: MagicMock,
+ mock_github_client_generate_token: MagicMock,
+) -> FeatureExternalResource:
return FeatureExternalResource.objects.create(
url="https://github.com/repositoryownertest/repositorynametest/issues/11",
type="GITHUB_ISSUE",
@@ -1069,6 +1085,7 @@ def github_repository(
repository_owner="repositoryownertest",
repository_name="repositorynametest",
project=project,
+ tagging_enabled=True,
)
@@ -1120,3 +1137,10 @@ def handle(self, record: logging.LogRecord) -> None:
self.messages.append(self.format(record))
return InspectingHandler()
+
+
+@pytest.fixture
+def set_github_webhook_secret() -> None:
+ from django.conf import settings
+
+ settings.GITHUB_WEBHOOK_SECRET = "secret-key"
diff --git a/api/core/management/commands/waitfordb.py b/api/core/management/commands/waitfordb.py
index 91c37468bc26..811af752994e 100644
--- a/api/core/management/commands/waitfordb.py
+++ b/api/core/management/commands/waitfordb.py
@@ -45,6 +45,7 @@ def handle(
database: str,
**options: Any,
) -> None:
+
start = time.monotonic()
wait_between_checks = 0.25
diff --git a/api/custom_auth/oauth/serializers.py b/api/custom_auth/oauth/serializers.py
index 6a1e80ab90af..cf16c008191b 100644
--- a/api/custom_auth/oauth/serializers.py
+++ b/api/custom_auth/oauth/serializers.py
@@ -6,13 +6,11 @@
from django.db.models import F
from rest_framework import serializers
from rest_framework.authtoken.models import Token
-from rest_framework.exceptions import PermissionDenied
-from organisations.invites.models import Invite
from users.auth_type import AuthType
from users.models import SignUpType
-from ..constants import USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE
+from ..serializers import InviteLinkValidationMixin
from .github import GithubUser
from .google import get_user_info
@@ -20,7 +18,7 @@
UserModel = get_user_model()
-class OAuthLoginSerializer(serializers.Serializer):
+class OAuthLoginSerializer(InviteLinkValidationMixin, serializers.Serializer):
access_token = serializers.CharField(
required=True,
help_text="Code or access token returned from the FE interaction with the third party login provider.",
@@ -85,12 +83,9 @@ def _get_user(self, user_data: dict):
if not existing_user:
sign_up_type = self.validated_data.get("sign_up_type")
- if not (
- settings.ALLOW_REGISTRATION_WITHOUT_INVITE
- or sign_up_type == SignUpType.INVITE_LINK.value
- or Invite.objects.filter(email=email).exists()
- ):
- raise PermissionDenied(USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE)
+ self._validate_registration_invite(
+ email=email, sign_up_type=self.validated_data.get("sign_up_type")
+ )
return UserModel.objects.create(
**user_data, email=email.lower(), sign_up_type=sign_up_type
diff --git a/api/custom_auth/serializers.py b/api/custom_auth/serializers.py
index 55bb43e595ae..11bd80828a6b 100644
--- a/api/custom_auth/serializers.py
+++ b/api/custom_auth/serializers.py
@@ -5,7 +5,7 @@
from rest_framework.exceptions import PermissionDenied
from rest_framework.validators import UniqueValidator
-from organisations.invites.models import Invite
+from organisations.invites.models import Invite, InviteLink
from users.auth_type import AuthType
from users.constants import DEFAULT_DELETE_ORPHAN_ORGANISATIONS_VALUE
from users.models import FFAdminUser, SignUpType
@@ -23,7 +23,28 @@ class Meta:
fields = ("key",)
-class CustomUserCreateSerializer(UserCreateSerializer):
+class InviteLinkValidationMixin:
+ invite_hash = serializers.CharField(required=False, write_only=True)
+
+ def _validate_registration_invite(self, email: str, sign_up_type: str) -> None:
+ if settings.ALLOW_REGISTRATION_WITHOUT_INVITE:
+ return
+
+ valid = False
+
+ match sign_up_type:
+ case SignUpType.INVITE_LINK.value:
+ valid = InviteLink.objects.filter(
+ hash=self.initial_data.get("invite_hash")
+ ).exists()
+ case SignUpType.INVITE_EMAIL.value:
+ valid = Invite.objects.filter(email__iexact=email.lower()).exists()
+
+ if not valid:
+ raise PermissionDenied(USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE)
+
+
+class CustomUserCreateSerializer(UserCreateSerializer, InviteLinkValidationMixin):
key = serializers.SerializerMethodField()
class Meta(UserCreateSerializer.Meta):
@@ -58,6 +79,10 @@ def validate(self, attrs):
self.context.get("request"), email=email, raise_exception=True
)
+ self._validate_registration_invite(
+ email=email, sign_up_type=attrs.get("sign_up_type")
+ )
+
attrs["email"] = email.lower()
return attrs
@@ -66,16 +91,6 @@ def get_key(instance):
token, _ = Token.objects.get_or_create(user=instance)
return token.key
- def save(self, **kwargs):
- if not (
- settings.ALLOW_REGISTRATION_WITHOUT_INVITE
- or self.validated_data.get("sign_up_type") == SignUpType.INVITE_LINK.value
- or Invite.objects.filter(email=self.validated_data.get("email"))
- ):
- raise PermissionDenied(USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE)
-
- return super(CustomUserCreateSerializer, self).save(**kwargs)
-
class CustomUserDelete(serializers.Serializer):
current_password = serializers.CharField(
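Reviewer note: a sketch of the shared validation path, using a hypothetical serializer built on the mixin and assuming `ALLOW_REGISTRATION_WITHOUT_INVITE` is disabled:

```python
from rest_framework import serializers

from custom_auth.serializers import InviteLinkValidationMixin
from users.models import SignUpType


class SignUpCheckSerializer(InviteLinkValidationMixin, serializers.Serializer):
    # Hypothetical minimal serializer for illustration.
    email = serializers.EmailField()


serializer = SignUpCheckSerializer(
    data={"email": "new.user@example.com", "invite_hash": "abc123"}
)
serializer.is_valid(raise_exception=True)

# Raises PermissionDenied unless an InviteLink with hash "abc123" exists;
# for SignUpType.INVITE_EMAIL it checks Invite rows by email instead.
serializer._validate_registration_invite(
    email="new.user@example.com",
    sign_up_type=SignUpType.INVITE_LINK.value,
)
```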
diff --git a/api/e2etests/e2e_seed_data.py b/api/e2etests/e2e_seed_data.py
index d21d0054031b..eba97bd65cb8 100644
--- a/api/e2etests/e2e_seed_data.py
+++ b/api/e2etests/e2e_seed_data.py
@@ -4,13 +4,34 @@
from edge_api.identities.models import EdgeIdentity
from environments.identities.models import Identity
from environments.models import Environment
+from environments.permissions.constants import (
+ UPDATE_FEATURE_STATE,
+ VIEW_ENVIRONMENT,
+ VIEW_IDENTITIES,
+)
+from environments.permissions.models import UserEnvironmentPermission
from organisations.models import Organisation, OrganisationRole, Subscription
-from projects.models import Project
-from users.models import FFAdminUser
+from organisations.permissions.models import UserOrganisationPermission
+from organisations.permissions.permissions import (
+ CREATE_PROJECT,
+ MANAGE_USER_GROUPS,
+)
+from organisations.subscriptions.constants import SCALE_UP
+from projects.models import Project, UserProjectPermission
+from projects.permissions import (
+ CREATE_ENVIRONMENT,
+ CREATE_FEATURE,
+ VIEW_AUDIT_LOG,
+ VIEW_PROJECT,
+)
+from users.models import FFAdminUser, UserPermissionGroup
# Password used by all the test users
PASSWORD = "Str0ngp4ssw0rd!"
+PROJECT_PERMISSION_PROJECT = "My Test Project 5 Project Permission"
+ENV_PERMISSION_PROJECT = "My Test Project 6 Env Permission"
+
def delete_user_and_its_organisations(user_email: str) -> None:
user: FFAdminUser | None = FFAdminUser.objects.filter(email=user_email).first()
@@ -25,6 +46,18 @@ def teardown() -> None:
delete_user_and_its_organisations(user_email=settings.E2E_SIGNUP_USER)
delete_user_and_its_organisations(user_email=settings.E2E_USER)
delete_user_and_its_organisations(user_email=settings.E2E_CHANGE_EMAIL_USER)
+ delete_user_and_its_organisations(
+ user_email=settings.E2E_NON_ADMIN_USER_WITH_ORG_PERMISSIONS
+ )
+ delete_user_and_its_organisations(
+ user_email=settings.E2E_NON_ADMIN_USER_WITH_PROJECT_PERMISSIONS
+ )
+ delete_user_and_its_organisations(
+ user_email=settings.E2E_NON_ADMIN_USER_WITH_ENV_PERMISSIONS
+ )
+ delete_user_and_its_organisations(
+ user_email=settings.E2E_NON_ADMIN_USER_WITH_A_ROLE
+ )
def seed_data() -> None:
@@ -36,6 +69,44 @@ def seed_data() -> None:
username=settings.E2E_USER,
)
org_admin.add_organisation(organisation, OrganisationRole.ADMIN)
+ non_admin_user_with_org_permissions: FFAdminUser = FFAdminUser.objects.create_user(
+ email=settings.E2E_NON_ADMIN_USER_WITH_ORG_PERMISSIONS,
+ password=PASSWORD,
+ )
+ non_admin_user_with_project_permissions: FFAdminUser = (
+ FFAdminUser.objects.create_user(
+ email=settings.E2E_NON_ADMIN_USER_WITH_PROJECT_PERMISSIONS,
+ password=PASSWORD,
+ )
+ )
+ non_admin_user_with_env_permissions: FFAdminUser = FFAdminUser.objects.create_user(
+ email=settings.E2E_NON_ADMIN_USER_WITH_ENV_PERMISSIONS,
+ password=PASSWORD,
+ )
+ non_admin_user_with_a_role: FFAdminUser = FFAdminUser.objects.create_user(
+ email=settings.E2E_NON_ADMIN_USER_WITH_A_ROLE,
+ password=PASSWORD,
+ )
+ non_admin_user_with_org_permissions.add_organisation(
+ organisation,
+ )
+ non_admin_user_with_project_permissions.add_organisation(
+ organisation,
+ )
+ non_admin_user_with_env_permissions.add_organisation(
+ organisation,
+ )
+ non_admin_user_with_a_role.add_organisation(
+ organisation,
+ )
+
+ # Add permissions to the non-admin user with org permissions
+ user_org_permission = UserOrganisationPermission.objects.create(
+ user=non_admin_user_with_org_permissions, organisation=organisation
+ )
+ user_org_permission.add_permission(CREATE_PROJECT)
+ user_org_permission.add_permission(MANAGE_USER_GROUPS)
+ UserPermissionGroup.objects.create(name="TestGroup", organisation=organisation)
# We add different projects and environments to give each e2e test its own isolated context.
project_test_data = [
@@ -49,7 +120,17 @@ def seed_data() -> None:
{"name": "My Test Project 2", "environments": ["Development"]},
{"name": "My Test Project 3", "environments": ["Development"]},
{"name": "My Test Project 4", "environments": ["Development"]},
+ {
+ "name": PROJECT_PERMISSION_PROJECT,
+ "environments": ["Development"],
+ },
+ {"name": ENV_PERMISSION_PROJECT, "environments": ["Development"]},
+ {"name": "My Test Project 7 Role", "environments": ["Development"]},
]
+ # Upgrade organisation seats
+ Subscription.objects.filter(organisation__in=org_admin.organisations.all()).update(
+ max_seats=8, plan=SCALE_UP, subscription_id="test_subscription_id"
+ )
# Create projects and environments
projects = []
@@ -58,19 +139,59 @@ def seed_data() -> None:
project = Project.objects.create(
name=project_info["name"], organisation=organisation
)
+ if project_info["name"] == PROJECT_PERMISSION_PROJECT:
+ # Add permissions to the non-admin user with project permissions
+ user_proj_permission: UserProjectPermission = (
+ UserProjectPermission.objects.create(
+ user=non_admin_user_with_project_permissions, project=project
+ )
+ )
+                for permission_key in [
+                    VIEW_PROJECT,
+                    CREATE_ENVIRONMENT,
+                    CREATE_FEATURE,
+                    VIEW_AUDIT_LOG,
+                ]:
+                    user_proj_permission.add_permission(permission_key)
projects.append(project)
for env_name in project_info["environments"]:
environment = Environment.objects.create(name=env_name, project=project)
+
+ if project_info["name"] == ENV_PERMISSION_PROJECT:
+ # Add permissions to the non-admin user with env permissions
+ user_env_permission = UserEnvironmentPermission.objects.create(
+ user=non_admin_user_with_env_permissions, environment=environment
+ )
+ user_env_proj_permission: UserProjectPermission = (
+ UserProjectPermission.objects.create(
+ user=non_admin_user_with_env_permissions, project=project
+ )
+ )
+ user_env_proj_permission.add_permission(VIEW_PROJECT)
+ user_env_proj_permission.add_permission(CREATE_FEATURE)
+                for permission_key in [
+                    VIEW_ENVIRONMENT,
+                    UPDATE_FEATURE_STATE,
+                    VIEW_IDENTITIES,
+                ]:
+                    user_env_permission.add_permission(permission_key)
environments.append(environment)
- # We're only creating identities for 3 of the 5 environments because
+    # We're only creating identities for 6 of the 8 environments because
     # they are necessary for the environments created above and to keep
     # the e2e tests isolated.
identities_test_data = [
{"identifier": settings.E2E_IDENTITY, "environment": environments[2]},
{"identifier": settings.E2E_IDENTITY, "environment": environments[3]},
{"identifier": settings.E2E_IDENTITY, "environment": environments[4]},
+ {"identifier": settings.E2E_IDENTITY, "environment": environments[5]},
+ {"identifier": settings.E2E_IDENTITY, "environment": environments[6]},
+ {"identifier": settings.E2E_IDENTITY, "environment": environments[7]},
]
for identity_info in identities_test_data:
@@ -82,8 +203,3 @@ def seed_data() -> None:
EdgeIdentity(engine_identity).save()
else:
Identity.objects.create(**identity_info)
-
- # Upgrade organisation seats
- Subscription.objects.filter(organisation__in=org_admin.organisations.all()).update(
- max_seats=2
- )
diff --git a/api/e2etests/middleware.py b/api/e2etests/middleware.py
new file mode 100644
index 000000000000..c57c3aeef4ec
--- /dev/null
+++ b/api/e2etests/middleware.py
@@ -0,0 +1,16 @@
+from django.conf import settings
+
+
+class E2ETestMiddleware:
+ def __init__(self, get_response):
+ self.get_response = get_response
+
+ def __call__(self, request):
+ request.is_e2e = False
+ if (
+ request.META.get("HTTP_X_E2E_TEST_AUTH_TOKEN")
+ == settings.E2E_TEST_AUTH_TOKEN
+ ):
+ request.is_e2e = True
+
+ return self.get_response(request)
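Reviewer note: a sketch exercising the new middleware directly with Django's `RequestFactory`, assuming `E2E_TEST_AUTH_TOKEN` is set to `"token"` in settings:

```python
from django.test import RequestFactory

from e2etests.middleware import E2ETestMiddleware

# get_response just echoes the request so we can inspect the flag.
middleware = E2ETestMiddleware(get_response=lambda request: request)

request = RequestFactory().get(
    "/api/v1/e2etests/teardown/",  # illustrative path
    HTTP_X_E2E_TEST_AUTH_TOKEN="token",
)
processed = middleware(request)
assert processed.is_e2e  # True only when the header matches the setting
```

The `X-E2E-Test-Auth-Token` header surfaces in `request.META` as `HTTP_X_E2E_TEST_AUTH_TOKEN`, which is what the rewritten `E2ETestPermission` in the next file relies on via `request.is_e2e`.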
diff --git a/api/e2etests/permissions.py b/api/e2etests/permissions.py
index 43a2b5f1a46f..c7d977261190 100644
--- a/api/e2etests/permissions.py
+++ b/api/e2etests/permissions.py
@@ -1,13 +1,8 @@
-import os
-
+from django.views import View
from rest_framework.permissions import BasePermission
+from rest_framework.request import Request
class E2ETestPermission(BasePermission):
- def has_permission(self, request, view):
- if "E2E_TEST_AUTH_TOKEN" not in os.environ:
- return False
- return (
- request.META.get("HTTP_X_E2E_TEST_AUTH_TOKEN")
- == os.environ["E2E_TEST_AUTH_TOKEN"]
- )
+ def has_permission(self, request: Request, view: View) -> bool:
+ return getattr(request, "is_e2e", False) is True
diff --git a/api/edge_api/identities/models.py b/api/edge_api/identities/models.py
index d67bcb5831c8..1e062ecd005e 100644
--- a/api/edge_api/identities/models.py
+++ b/api/edge_api/identities/models.py
@@ -167,15 +167,52 @@ def remove_feature_override(self, feature_state: FeatureStateModel) -> None:
def save(self, user: FFAdminUser = None, master_api_key: MasterAPIKey = None):
self.dynamo_wrapper.put_item(self.to_document())
- changes = self._get_changes()
- if changes["feature_overrides"]:
+ changeset = self._get_changes()
+ self._update_feature_overrides(
+ changeset=changeset,
+ user=user,
+ master_api_key=master_api_key,
+ )
+ self._reset_initial_state()
+
+ def delete(
+ self, user: FFAdminUser = None, master_api_key: MasterAPIKey = None
+ ) -> None:
+ self.dynamo_wrapper.delete_item(self._engine_identity_model.composite_key)
+ self._engine_identity_model.identity_features.clear()
+ changeset = self._get_changes()
+ self._update_feature_overrides(
+ changeset=changeset,
+ user=user,
+ master_api_key=master_api_key,
+ )
+ self._reset_initial_state()
+
+ def synchronise_features(self, valid_feature_names: typing.Collection[str]) -> None:
+ identity_feature_names = {
+ fs.feature.name for fs in self._engine_identity_model.identity_features
+ }
+ if not identity_feature_names.issubset(valid_feature_names):
+ self._engine_identity_model.prune_features(list(valid_feature_names))
+ sync_identity_document_features.delay(args=(str(self.identity_uuid),))
+
+ def to_document(self) -> dict:
+ return map_engine_identity_to_identity_document(self._engine_identity_model)
+
+ def _update_feature_overrides(
+ self,
+ changeset: IdentityChangeset,
+ user: FFAdminUser = None,
+ master_api_key: MasterAPIKey = None,
+ ) -> None:
+ if changeset["feature_overrides"]:
# TODO: would this be simpler if we put a wrapper around FeatureStateModel instead?
generate_audit_log_records.delay(
kwargs={
"environment_api_key": self.environment_api_key,
"identifier": self.identifier,
"user_id": getattr(user, "id", None),
- "changes": changes,
+ "changes": changeset,
"identity_uuid": str(self.identity_uuid),
"master_api_key_id": getattr(master_api_key, "id", None),
}
@@ -183,23 +220,11 @@ def save(self, user: FFAdminUser = None, master_api_key: MasterAPIKey = None):
update_flagsmith_environments_v2_identity_overrides.delay(
kwargs={
"environment_api_key": self.environment_api_key,
- "changes": changes,
+ "changes": changeset,
"identity_uuid": str(self.identity_uuid),
"identifier": self.identifier,
}
)
- self._reset_initial_state()
-
- def synchronise_features(self, valid_feature_names: typing.Collection[str]) -> None:
- identity_feature_names = {
- fs.feature.name for fs in self._engine_identity_model.identity_features
- }
- if not identity_feature_names.issubset(valid_feature_names):
- self._engine_identity_model.prune_features(list(valid_feature_names))
- sync_identity_document_features.delay(args=(str(self.identity_uuid),))
-
- def to_document(self) -> dict:
- return map_engine_identity_to_identity_document(self._engine_identity_model)
def _get_changes(self) -> IdentityChangeset:
previous_instance = self._initial_state
diff --git a/api/edge_api/identities/views.py b/api/edge_api/identities/views.py
index c8ac3c0e3d50..9065324457c5 100644
--- a/api/edge_api/identities/views.py
+++ b/api/edge_api/identities/views.py
@@ -160,7 +160,11 @@ def get_environment_from_request(self):
)
def perform_destroy(self, instance):
- EdgeIdentity.dynamo_wrapper.delete_item(instance["composite_key"])
+ edge_identity = EdgeIdentity.from_identity_document(instance)
+ edge_identity.delete(
+ user=self.request.user,
+ master_api_key=getattr(self.request, "master_api_key", None),
+ )
@swagger_auto_schema(
responses={200: EdgeIdentityTraitsSerializer(many=True)},
@@ -281,7 +285,7 @@ def list(self, request, *args, **kwargs):
def perform_destroy(self, instance):
self.identity.remove_feature_override(instance)
self.identity.save(
- user=self.request.user.id,
+ user=self.request.user,
master_api_key=getattr(self.request, "master_api_key", None),
)
diff --git a/api/environments/identities/models.py b/api/environments/identities/models.py
index 9fe99df13df9..60f36f8a4464 100644
--- a/api/environments/identities/models.py
+++ b/api/environments/identities/models.py
@@ -4,12 +4,12 @@
from django.db import models
from django.db.models import Prefetch, Q
from django.utils import timezone
-from flag_engine.identities.traits.types import TraitValue
from flag_engine.segments.evaluator import evaluate_identity_in_segment
from environments.identities.managers import IdentityManager
from environments.identities.traits.models import Trait
from environments.models import Environment
+from environments.sdk.types import SDKTraitData
from features.models import FeatureState
from features.multivariate.models import MultivariateFeatureStateValue
from segments.models import Segment
@@ -196,7 +196,11 @@ def get_all_user_traits(self):
def __str__(self):
return "Account %s" % self.identifier
- def generate_traits(self, trait_data_items, persist=False):
+ def generate_traits(
+ self,
+ trait_data_items: list[SDKTraitData],
+ persist: bool = False,
+ ) -> list[Trait]:
"""
Given a list of trait data items, validated by TraitSerializerFull, generate
a list of TraitModel objects for the given identity.
@@ -232,7 +236,7 @@ def generate_traits(self, trait_data_items, persist=False):
def update_traits(
self,
- trait_data_items: list[dict[str, TraitValue]],
+ trait_data_items: list[SDKTraitData],
) -> list[Trait]:
"""
Given a list of traits, update any that already exist and create any new ones.
diff --git a/api/environments/identities/serializers.py b/api/environments/identities/serializers.py
index bd4b1ffaed1f..6c0d9df9229e 100644
--- a/api/environments/identities/serializers.py
+++ b/api/environments/identities/serializers.py
@@ -59,6 +59,7 @@ class _TraitSerializer(serializers.Serializer):
help_text="Can be of type string, boolean, float or integer."
)
+ identifier = serializers.CharField()
flags = serializers.ListField(child=SDKFeatureStateSerializer())
traits = serializers.ListSerializer(child=_TraitSerializer())
diff --git a/api/environments/identities/traits/fields.py b/api/environments/identities/traits/fields.py
index fd5d3d335f9c..ceadc03b69c2 100644
--- a/api/environments/identities/traits/fields.py
+++ b/api/environments/identities/traits/fields.py
@@ -1,4 +1,5 @@
import logging
+from typing import Any
from rest_framework import serializers
@@ -6,6 +7,7 @@
ACCEPTED_TRAIT_VALUE_TYPES,
TRAIT_STRING_VALUE_MAX_LENGTH,
)
+from environments.sdk.types import SDKTraitValueData
from features.value_types import STRING
logger = logging.getLogger(__name__)
@@ -16,7 +18,7 @@ class TraitValueField(serializers.Field):
Custom field to extract the type of the field on deserialization.
"""
- def to_internal_value(self, data):
+ def to_internal_value(self, data: Any) -> SDKTraitValueData:
data_type = type(data).__name__
if data_type not in ACCEPTED_TRAIT_VALUE_TYPES:
@@ -28,7 +30,7 @@ def to_internal_value(self, data):
)
return {"type": data_type, "value": data}
- def to_representation(self, value):
+ def to_representation(self, value: Any) -> Any:
return_value = value.get("value") if isinstance(value, dict) else value
if return_value is None:
diff --git a/api/environments/identities/views.py b/api/environments/identities/views.py
index f403b8001dc1..bbf9e2ce566d 100644
--- a/api/environments/identities/views.py
+++ b/api/environments/identities/views.py
@@ -309,6 +309,7 @@ def _get_all_feature_states_for_user_response(
serializer = serializer_class(
{
"flags": all_feature_states,
+ "identifier": identity.identifier,
"traits": identity.identity_traits.all(),
},
context=self.get_serializer_context(),
diff --git a/api/environments/permissions/permissions.py b/api/environments/permissions/permissions.py
index dc033d7249f1..020b7a065249 100644
--- a/api/environments/permissions/permissions.py
+++ b/api/environments/permissions/permissions.py
@@ -45,6 +45,8 @@ def has_permission(self, request, view):
def has_object_permission(self, request, view, obj):
if view.action == "clone":
return request.user.has_project_permission(CREATE_ENVIRONMENT, obj.project)
+ elif view.action == "get_document":
+ return request.user.has_environment_permission(VIEW_ENVIRONMENT, obj)
return request.user.is_environment_admin(obj) or view.action in [
"user_permissions"
diff --git a/api/environments/sdk/serializers.py b/api/environments/sdk/serializers.py
index 8ee5d254b94b..7029daa143b9 100644
--- a/api/environments/sdk/serializers.py
+++ b/api/environments/sdk/serializers.py
@@ -2,7 +2,6 @@
from collections import defaultdict
from core.constants import BOOLEAN, FLOAT, INTEGER, STRING
-from django.utils import timezone
from rest_framework import serializers
from environments.identities.models import Identity
@@ -12,6 +11,12 @@
from environments.identities.traits.fields import TraitValueField
from environments.identities.traits.models import Trait
from environments.identities.traits.serializers import TraitSerializerBasic
+from environments.sdk.services import (
+ get_identified_transient_identity_and_traits,
+ get_persisted_identity_and_traits,
+ get_transient_identity_and_traits,
+)
+from environments.sdk.types import SDKTraitData
from features.serializers import (
FeatureStateSerializerFull,
SDKFeatureStateSerializer,
@@ -125,7 +130,11 @@ def create(self, validated_data):
class IdentifyWithTraitsSerializer(
HideSensitiveFieldsSerializerMixin, serializers.Serializer
):
- identifier = serializers.CharField(write_only=True, required=True)
+ identifier = serializers.CharField(
+ required=False,
+ allow_blank=True,
+ allow_null=True,
+ )
transient = serializers.BooleanField(write_only=True, default=False)
traits = TraitSerializerBasic(required=False, many=True)
flags = SDKFeatureStateSerializer(read_only=True, many=True)
@@ -137,44 +146,46 @@ def save(self, **kwargs):
Create the identity with the associated traits
(optionally store traits if flag set on org)
"""
+ identifier = self.validated_data.get("identifier")
environment = self.context["environment"]
-
transient = self.validated_data["transient"]
- trait_data_items = self.validated_data.get("traits", [])
+ sdk_trait_data: list[SDKTraitData] = self.validated_data.get("traits", [])
- if transient:
- identity = Identity(
- created_date=timezone.now(),
- identifier=self.validated_data["identifier"],
+ if not identifier:
+ # We have a fully transient identity that should never be persisted.
+ identity, traits = get_transient_identity_and_traits(
environment=environment,
+ sdk_trait_data=sdk_trait_data,
)
- trait_models = identity.generate_traits(trait_data_items, persist=False)
- else:
- identity, created = Identity.objects.get_or_create(
- identifier=self.validated_data["identifier"], environment=environment
+ elif transient:
+ # Don't persist incoming data but load presently stored
+ # overrides and traits, if any.
+ identity, traits = get_identified_transient_identity_and_traits(
+ environment=environment,
+ identifier=identifier,
+ sdk_trait_data=sdk_trait_data,
)
- if not created and environment.project.organisation.persist_trait_data:
- # if this is an update and we're persisting traits, then we need to
- # partially update any traits and return the full list
- trait_models = identity.update_traits(trait_data_items)
- else:
- # generate traits for the identity and store them if configured to do so
- trait_models = identity.generate_traits(
- trait_data_items,
- persist=environment.project.organisation.persist_trait_data,
- )
+ else:
+ # Persist the identity in accordance with individual trait transiency
+ # and persistence settings outside of request context.
+ identity, traits = get_persisted_identity_and_traits(
+ environment=environment,
+ identifier=identifier,
+ sdk_trait_data=sdk_trait_data,
+ )
all_feature_states = identity.get_all_feature_states(
- traits=trait_models,
+ traits=traits,
additional_filters=self.context.get("feature_states_additional_filters"),
)
- identify_integrations(identity, all_feature_states, trait_models)
+ identify_integrations(identity, all_feature_states, traits)
return {
"identity": identity,
- "traits": trait_models,
+ "identifier": identity.identifier,
+ "traits": traits,
"flags": all_feature_states,
}
diff --git a/api/environments/sdk/services.py b/api/environments/sdk/services.py
new file mode 100644
index 000000000000..500f35ac8b92
--- /dev/null
+++ b/api/environments/sdk/services.py
@@ -0,0 +1,120 @@
+import hashlib
+import uuid
+from itertools import chain
+from operator import itemgetter
+from typing import TypeAlias
+
+from django.utils import timezone
+
+from environments.identities.models import Identity
+from environments.identities.traits.models import Trait
+from environments.models import Environment
+from environments.sdk.types import SDKTraitData
+
+IdentityAndTraits: TypeAlias = tuple[Identity, list[Trait]]
+
+
+def get_transient_identity_and_traits(
+ environment: Environment,
+ sdk_trait_data: list[SDKTraitData],
+) -> IdentityAndTraits:
+ """
+ Get a transient `Identity` instance with a randomly generated identifier.
+ All traits are marked as transient.
+ """
+ return (
+ (
+ identity := _get_transient_identity(
+ environment=environment,
+ identifier=get_transient_identifier(sdk_trait_data),
+ )
+ ),
+ identity.generate_traits(_ensure_transient(sdk_trait_data), persist=False),
+ )
+
+
+def get_identified_transient_identity_and_traits(
+ environment: Environment,
+ identifier: str,
+ sdk_trait_data: list[SDKTraitData],
+) -> IdentityAndTraits:
+ """
+ Get a transient `Identity` instance.
+ If present in storage, it's a previously persisted identity with its traits,
+ combined with incoming traits provided to `sdk_trait_data` argument.
+ All traits constructed from `sdk_trait_data` are marked as transient.
+ """
+ sdk_trait_data = _ensure_transient(sdk_trait_data)
+ if identity := Identity.objects.filter(
+ environment=environment,
+ identifier=identifier,
+ ).first():
+ return identity, identity.update_traits(sdk_trait_data)
+ return (
+ identity := _get_transient_identity(
+ environment=environment,
+ identifier=identifier,
+ )
+ ), identity.generate_traits(sdk_trait_data, persist=False)
+
+
+def get_persisted_identity_and_traits(
+ environment: Environment,
+ identifier: str,
+ sdk_trait_data: list[SDKTraitData],
+) -> IdentityAndTraits:
+ """
+ Retrieve a previously persisted `Identity` instance or persist a new one.
+ Traits are persisted based on the organisation-level setting or a
+ `"transient"` attribute provided with each individual trait.
+ """
+ identity, created = Identity.objects.get_or_create(
+ environment=environment,
+ identifier=identifier,
+ )
+ persist_trait_data = environment.project.organisation.persist_trait_data
+ if created:
+ return identity, identity.generate_traits(
+ sdk_trait_data,
+ persist=persist_trait_data,
+ )
+ if persist_trait_data:
+ return identity, identity.update_traits(sdk_trait_data)
+ return identity, list(
+ {
+ trait.trait_key: trait
+ for trait in chain(
+ identity.identity_traits.all(),
+ identity.generate_traits(sdk_trait_data, persist=False),
+ )
+ }.values()
+ )
+
+
+def get_transient_identifier(sdk_trait_data: list[SDKTraitData]) -> str:
+ if sdk_trait_data:
+ return hashlib.sha256(
+ "".join(
+ f'{trait["trait_key"]}{trait["trait_value"]["value"]}'
+ for trait in sorted(sdk_trait_data, key=itemgetter("trait_key"))
+ ).encode(),
+ usedforsecurity=False,
+ ).hexdigest()
+ return uuid.uuid4().hex
+
+
+def _get_transient_identity(
+ environment: Environment,
+ identifier: str,
+) -> Identity:
+ return Identity(
+ created_date=timezone.now(),
+ environment=environment,
+ identifier=identifier,
+ )
+
+
+def _ensure_transient(sdk_trait_data: list[SDKTraitData]) -> list[SDKTraitData]:
+ for sdk_trait_data_item in sdk_trait_data:
+ sdk_trait_data_item["transient"] = True
+ return sdk_trait_data
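Reviewer note: the transient identifier is deterministic for a given trait set, so repeated fully-transient requests with the same traits bucket consistently for percentage-split flags. A standalone sketch of `get_transient_identifier` with illustrative traits:

```python
import hashlib

traits = [
    {"trait_key": "plan", "trait_value": {"type": "str", "value": "scale-up"}},
    {"trait_key": "beta", "trait_value": {"type": "bool", "value": True}},
]

# Sort by trait key so ordering in the request payload doesn't matter.
digest = hashlib.sha256(
    "".join(
        f'{t["trait_key"]}{t["trait_value"]["value"]}'
        for t in sorted(traits, key=lambda t: t["trait_key"])
    ).encode(),
    usedforsecurity=False,  # requires Python 3.9+
).hexdigest()
print(digest)  # identical for every request carrying these traits
```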
diff --git a/api/environments/sdk/types.py b/api/environments/sdk/types.py
new file mode 100644
index 000000000000..cf369f5a1907
--- /dev/null
+++ b/api/environments/sdk/types.py
@@ -0,0 +1,14 @@
+import typing
+
+from typing_extensions import NotRequired
+
+
+class SDKTraitValueData(typing.TypedDict):
+ type: str
+ value: str
+
+
+class SDKTraitData(typing.TypedDict):
+ trait_key: str
+ trait_value: SDKTraitValueData
+ transient: NotRequired[bool]
diff --git a/api/environments/views.py b/api/environments/views.py
index 4afc0ab52567..085fa6853c10 100644
--- a/api/environments/views.py
+++ b/api/environments/views.py
@@ -225,7 +225,10 @@ def user_permissions(self, request, *args, **kwargs):
@swagger_auto_schema(responses={200: SDKEnvironmentDocumentModel})
@action(detail=True, methods=["GET"], url_path="document")
def get_document(self, request, api_key: str):
- return Response(Environment.get_environment_document(api_key))
+ environment = (
+ self.get_object()
+ ) # use get_object to ensure permissions check is performed
+ return Response(Environment.get_environment_document(environment.api_key))
@swagger_auto_schema(request_body=no_body, responses={202: ""})
@action(detail=True, methods=["POST"], url_path="enable-v2-versioning")
diff --git a/api/features/feature_external_resources/models.py b/api/features/feature_external_resources/models.py
index 6dcfc4d99a7f..8df2428e3158 100644
--- a/api/features/feature_external_resources/models.py
+++ b/api/features/feature_external_resources/models.py
@@ -1,3 +1,4 @@
+import json
import logging
from django.db import models
@@ -11,9 +12,11 @@
from environments.models import Environment
from features.models import Feature, FeatureState
+from integrations.github.constants import GitHubEventType, GitHubTag
from integrations.github.github import call_github_task
+from integrations.github.models import GithubRepository
from organisations.models import Organisation
-from webhooks.webhooks import WebhookEventType
+from projects.tags.models import Tag, TagType
logger = logging.getLogger(__name__)
@@ -24,6 +27,20 @@ class ResourceType(models.TextChoices):
GITHUB_PR = "GITHUB_PR", "GitHub PR"
+tag_by_type_and_state = {
+ ResourceType.GITHUB_ISSUE.value: {
+ "open": GitHubTag.ISSUE_OPEN.value,
+ "closed": GitHubTag.ISSUE_CLOSED.value,
+ },
+ ResourceType.GITHUB_PR.value: {
+ "open": GitHubTag.PR_OPEN.value,
+ "closed": GitHubTag.PR_CLOSED.value,
+ "merged": GitHubTag.PR_MERGED.value,
+ "draft": GitHubTag.PR_DRAFT.value,
+ },
+}
+
+
class FeatureExternalResource(LifecycleModelMixin, models.Model):
url = models.URLField()
type = models.CharField(max_length=20, choices=ResourceType.choices)
@@ -49,12 +66,33 @@ class Meta:
@hook(AFTER_SAVE)
def execute_after_save_actions(self):
+ # Tag the feature with the external resource type
+ metadata = json.loads(self.metadata) if self.metadata else {}
+ state = metadata.get("state", "open")
+
# Add a comment to GitHub Issue/PR when feature is linked to the GH external resource
+ # and tag the feature with the corresponding tag if tagging is enabled
if (
- Organisation.objects.prefetch_related("github_config")
+ github_configuration := Organisation.objects.prefetch_related(
+ "github_config"
+ )
.get(id=self.feature.project.organisation_id)
.github_config.first()
):
+ github_repo = GithubRepository.objects.get(
+ github_configuration=github_configuration.id,
+ project=self.feature.project,
+ )
+ if github_repo.tagging_enabled:
+ github_tag = Tag.objects.get(
+ label=tag_by_type_and_state[self.type][state],
+ project=self.feature.project,
+ is_system_tag=True,
+ type=TagType.GITHUB.value,
+ )
+ self.feature.tags.add(github_tag)
+ self.feature.save()
+
feature_states: list[FeatureState] = []
environments = Environment.objects.filter(
@@ -74,7 +112,7 @@ def execute_after_save_actions(self):
call_github_task(
organisation_id=self.feature.project.organisation_id,
- type=WebhookEventType.FEATURE_EXTERNAL_RESOURCE_ADDED.value,
+ type=GitHubEventType.FEATURE_EXTERNAL_RESOURCE_ADDED.value,
feature=self.feature,
segment_name=None,
url=None,
@@ -92,7 +130,7 @@ def execute_before_save_actions(self) -> None:
call_github_task(
organisation_id=self.feature.project.organisation_id,
- type=WebhookEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value,
+ type=GitHubEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value,
feature=self.feature,
segment_name=None,
url=self.url,
diff --git a/api/features/feature_external_resources/views.py b/api/features/feature_external_resources/views.py
index c9636bba1132..002a8e5da89a 100644
--- a/api/features/feature_external_resources/views.py
+++ b/api/features/feature_external_resources/views.py
@@ -1,10 +1,16 @@
+import re
+
from django.shortcuts import get_object_or_404
from rest_framework import status, viewsets
from rest_framework.response import Response
from features.models import Feature
from features.permissions import FeatureExternalResourcePermissions
-from integrations.github.client import get_github_issue_pr_title_and_state
+from integrations.github.client import (
+ get_github_issue_pr_title_and_state,
+ label_github_issue_pr,
+)
+from integrations.github.models import GithubRepository
from organisations.models import Organisation
from .models import FeatureExternalResource
@@ -48,14 +54,13 @@ def create(self, request, *args, **kwargs):
),
)
- if not (
- (
- Organisation.objects.prefetch_related("github_config")
- .get(id=feature.project.organisation_id)
- .github_config.first()
- )
- or not hasattr(feature.project, "github_project")
- ):
+ github_configuration = (
+ Organisation.objects.prefetch_related("github_config")
+ .get(id=feature.project.organisation_id)
+ .github_config.first()
+ )
+
+ if not github_configuration or not hasattr(feature.project, "github_project"):
return Response(
data={
"detail": "This Project doesn't have a valid GitHub integration configuration"
@@ -63,7 +68,42 @@ def create(self, request, *args, **kwargs):
content_type="application/json",
status=status.HTTP_400_BAD_REQUEST,
)
- return super().create(request, *args, **kwargs)
+
+ # Get repository owner and name, and issue/PR number from the external resource URL
+ url = request.data.get("url")
+ if request.data.get("type") == "GITHUB_PR":
+ pattern = r"github.com/([^/]+)/([^/]+)/pull/(\d+)$"
+ elif request.data.get("type") == "GITHUB_ISSUE":
+ pattern = r"github.com/([^/]+)/([^/]+)/issues/(\d+)$"
+ else:
+ return Response(
+ data={"detail": "Incorrect GitHub type"},
+ content_type="application/json",
+ status=status.HTTP_400_BAD_REQUEST,
+ )
+
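+        # Illustrative match: "https://github.com/acme/widgets/pull/42" yields
+        # groups ("acme", "widgets", "42") as (owner, repo, issue).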
+ url_match = re.search(pattern, url)
+ if url_match:
+ owner, repo, issue = url_match.groups()
+ if GithubRepository.objects.get(
+ github_configuration=github_configuration,
+ repository_owner=owner,
+ repository_name=repo,
+ ).tagging_enabled:
+ label_github_issue_pr(
+ installation_id=github_configuration.installation_id,
+ owner=owner,
+ repo=repo,
+ issue=issue,
+ )
+ response = super().create(request, *args, **kwargs)
+ return response
+ else:
+ return Response(
+ data={"detail": "Invalid GitHub Issue/PR URL"},
+ content_type="application/json",
+ status=status.HTTP_400_BAD_REQUEST,
+ )
def perform_update(self, serializer):
external_resource_id = int(self.kwargs["pk"])
diff --git a/api/features/models.py b/api/features/models.py
index 6a80c5c90251..1493a54c4849 100644
--- a/api/features/models.py
+++ b/api/features/models.py
@@ -74,6 +74,7 @@
STRING,
)
from features.versioning.models import EnvironmentFeatureVersion
+from integrations.github.constants import GitHubEventType
from metadata.models import Metadata
from projects.models import Project
from projects.tags.models import Tag
@@ -139,7 +140,6 @@ class Meta:
@hook(AFTER_SAVE)
def create_github_comment(self) -> None:
from integrations.github.github import call_github_task
- from webhooks.webhooks import WebhookEventType
if (
self.external_resources.exists()
@@ -150,7 +150,7 @@ def create_github_comment(self) -> None:
call_github_task(
organisation_id=self.project.organisation_id,
- type=WebhookEventType.FLAG_DELETED.value,
+ type=GitHubEventType.FLAG_DELETED.value,
feature=self,
segment_name=None,
url=None,
@@ -406,7 +406,6 @@ def _get_environment(self) -> "Environment":
@hook(AFTER_DELETE)
def create_github_comment(self) -> None:
from integrations.github.github import call_github_task
- from webhooks.webhooks import WebhookEventType
if (
self.feature.external_resources.exists()
@@ -416,7 +415,7 @@ def create_github_comment(self) -> None:
call_github_task(
self.feature.project.organisation_id,
- WebhookEventType.SEGMENT_OVERRIDE_DELETED.value,
+ GitHubEventType.SEGMENT_OVERRIDE_DELETED.value,
self.feature,
self.segment.name,
None,
diff --git a/api/features/serializers.py b/api/features/serializers.py
index a2a297238637..5d4de0005517 100644
--- a/api/features/serializers.py
+++ b/api/features/serializers.py
@@ -19,6 +19,7 @@
from environments.sdk.serializers_mixins import (
HideSensitiveFieldsSerializerMixin,
)
+from integrations.github.constants import GitHubEventType
from integrations.github.github import call_github_task
from metadata.serializers import MetadataSerializer, SerializerWithMetadata
from projects.models import Project
@@ -30,7 +31,6 @@
from util.drf_writable_nested.serializers import (
DeleteBeforeUpdateWritableNestedModelSerializer,
)
-from webhooks.webhooks import WebhookEventType
from .constants import INTERSECTION, UNION
from .feature_segments.serializers import (
@@ -478,7 +478,7 @@ def save(self, **kwargs):
call_github_task(
organisation_id=feature_state.feature.project.organisation_id,
- type=WebhookEventType.FLAG_UPDATED.value,
+ type=GitHubEventType.FLAG_UPDATED.value,
feature=feature_state.feature,
segment_name=None,
url=None,
diff --git a/api/features/versioning/serializers.py b/api/features/versioning/serializers.py
index ec21cdfae5ae..90295f222b8a 100644
--- a/api/features/versioning/serializers.py
+++ b/api/features/versioning/serializers.py
@@ -8,10 +8,10 @@
CustomCreateSegmentOverrideFeatureStateSerializer,
)
from features.versioning.models import EnvironmentFeatureVersion
+from integrations.github.constants import GitHubEventType
from integrations.github.github import call_github_task
from segments.models import Segment
from users.models import FFAdminUser
-from webhooks.webhooks import WebhookEventType
class CustomEnvironmentFeatureVersionFeatureStateSerializer(
@@ -36,7 +36,7 @@ def save(self, **kwargs):
call_github_task(
organisation_id=feature_state.environment.project.organisation_id,
- type=WebhookEventType.FLAG_UPDATED.value,
+ type=GitHubEventType.FLAG_UPDATED.value,
feature=feature_state.feature,
segment_name=None,
url=None,
diff --git a/api/features/views.py b/api/features/views.py
index ec0aad480f73..b276bd8f4b98 100644
--- a/api/features/views.py
+++ b/api/features/views.py
@@ -1,5 +1,6 @@
import logging
import typing
+from datetime import timedelta
from functools import reduce
from app_analytics.analytics_db_service import get_feature_evaluation_data
@@ -9,6 +10,7 @@
from django.conf import settings
from django.core.cache import caches
from django.db.models import Max, Q, QuerySet
+from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_page
from drf_yasg import openapi
@@ -352,8 +354,15 @@ def get_influx_data(self, request, pk, project_pk):
query_serializer = GetInfluxDataQuerySerializer(data=request.query_params)
query_serializer.is_valid(raise_exception=True)
+ period = query_serializer.data["period"]
+ now = timezone.now()
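+        # Convert the relative period string into an absolute start datetime,
+        # e.g. "24h" -> now - 24 hours and "30d" -> now - 30 days.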
+ if period.endswith("h"):
+ date_start = now - timedelta(hours=int(period[:-1]))
+ elif period.endswith("d"):
+ date_start = now - timedelta(days=int(period[:-1]))
+ else:
+ raise ValidationError("Malformed period supplied")
- date_start = f"-{query_serializer.data['period']}"
events_list = get_multiple_event_list_for_feature(
feature_name=feature.name,
date_start=date_start,
diff --git a/api/integrations/github/client.py b/api/integrations/github/client.py
index b4307fd736f9..4d4293b30a48 100644
--- a/api/integrations/github/client.py
+++ b/api/integrations/github/client.py
@@ -1,3 +1,4 @@
+import json
import logging
from enum import Enum
from typing import Any
@@ -5,11 +6,15 @@
import requests
from django.conf import settings
from github import Auth, Github
+from requests.exceptions import HTTPError
from integrations.github.constants import (
GITHUB_API_CALLS_TIMEOUT,
GITHUB_API_URL,
GITHUB_API_VERSION,
+ GITHUB_FLAGSMITH_LABEL,
+ GITHUB_FLAGSMITH_LABEL_COLOR,
+ GITHUB_FLAGSMITH_LABEL_DESCRIPTION,
)
from integrations.github.dataclasses import (
IssueQueryParams,
@@ -159,6 +164,9 @@ def fetch_search_github_resource(
"id": i["id"],
"title": i["title"],
"number": i["number"],
+ "state": i["state"],
+ "merged": i.get("merged", False),
+ "draft": i.get("draft", False),
}
for i in json_response["items"]
]
@@ -244,3 +252,45 @@ def fetch_github_repo_contributors(
]
return build_paginated_response(results, response)
+
+
+def create_flagsmith_flag_label(
+ installation_id: str, owner: str, repo: str
+) -> dict[str, Any]:
+ # Create "Flagsmith Flag" label in linked repo
+ url = f"{GITHUB_API_URL}repos/{owner}/{repo}/labels"
+ headers = build_request_headers(installation_id)
+ payload = {
+ "name": GITHUB_FLAGSMITH_LABEL,
+ "color": GITHUB_FLAGSMITH_LABEL_COLOR,
+ "description": GITHUB_FLAGSMITH_LABEL_DESCRIPTION,
+ }
+ try:
+ response = requests.post(
+ url, json=payload, headers=headers, timeout=GITHUB_API_CALLS_TIMEOUT
+ )
+ response.raise_for_status()
+ return response.json()
+
+    except HTTPError:
+        response_content = response.content.decode("utf-8")
+        error_data = json.loads(response_content)
+        if any(
+            error["code"] == "already_exists" for error in error_data.get("errors", [])
+        ):
+            logger.warning("Label already exists")
+            # Return a plain dict to match the declared dict[str, Any] return type
+            return {"message": "Label already exists"}
+        # Re-raise unexpected GitHub API errors rather than silently returning None
+        raise
+
+
+def label_github_issue_pr(
+ installation_id: str, owner: str, repo: str, issue: str
+) -> dict[str, Any]:
+ # Label linked GitHub Issue or PR with the "Flagsmith Flag" label
+ url = f"{GITHUB_API_URL}repos/{owner}/{repo}/issues/{issue}/labels"
+ headers = build_request_headers(installation_id)
+ payload = [GITHUB_FLAGSMITH_LABEL]
+ response = requests.post(
+ url, json=payload, headers=headers, timeout=GITHUB_API_CALLS_TIMEOUT
+ )
+ response.raise_for_status()
+ return response.json()
diff --git a/api/integrations/github/constants.py b/api/integrations/github/constants.py
index 929ea690b87e..897b3a2c6725 100644
--- a/api/integrations/github/constants.py
+++ b/api/integrations/github/constants.py
@@ -1,3 +1,5 @@
+from enum import Enum
+
GITHUB_API_URL = "https://api.github.com/"
GITHUB_API_VERSION = "2022-11-28"
@@ -7,11 +9,49 @@
| :--- | :----- | :------ | :------ |\n"""
FEATURE_TABLE_ROW = "| [%s](%s) | %s | %s | %s |\n"
LINK_SEGMENT_TITLE = "Segment `%s` values:\n"
-UNLINKED_FEATURE_TEXT = "### The feature flag `%s` was unlinked from the issue/PR"
-UPDATED_FEATURE_TEXT = "Flagsmith Feature `%s` has been updated:\n"
-DELETED_FEATURE_TEXT = "### The Feature Flag `%s` was deleted"
+UNLINKED_FEATURE_TEXT = "**The feature flag `%s` was unlinked from the issue/PR**"
+UPDATED_FEATURE_TEXT = "**Flagsmith Feature `%s` has been updated:**\n"
+FEATURE_UPDATED_FROM_GHA_TEXT = (
+ "**Flagsmith Feature `%s` has been updated from GHA:**\n"
+)
+DELETED_FEATURE_TEXT = "**The Feature Flag `%s` was deleted**"
DELETED_SEGMENT_OVERRIDE_TEXT = (
- "### The Segment Override `%s` for Feature Flag `%s` was deleted"
+ "**The Segment Override `%s` for Feature Flag `%s` was deleted**"
)
FEATURE_ENVIRONMENT_URL = "%s/project/%s/environment/%s/features?feature=%s&tab=%s"
GITHUB_API_CALLS_TIMEOUT = 10
+
+GITHUB_TAG_COLOR = "#838992"
+GITHUB_FLAGSMITH_LABEL = "Flagsmith Flag"
+GITHUB_FLAGSMITH_LABEL_DESCRIPTION = (
+ "This GitHub Issue/PR is linked to a Flagsmith Feature Flag"
+)
+GITHUB_FLAGSMITH_LABEL_COLOR = "6633FF"
+
+
+class GitHubEventType(Enum):
+ FLAG_UPDATED = "FLAG_UPDATED"
+ FLAG_DELETED = "FLAG_DELETED"
+ FLAG_UPDATED_FROM_GHA = "FLAG_UPDATED_FROM_GHA"
+ FEATURE_EXTERNAL_RESOURCE_ADDED = "FEATURE_EXTERNAL_RESOURCE_ADDED"
+ FEATURE_EXTERNAL_RESOURCE_REMOVED = "FEATURE_EXTERNAL_RESOURCE_REMOVED"
+ SEGMENT_OVERRIDE_DELETED = "SEGMENT_OVERRIDE_DELETED"
+
+
+class GitHubTag(Enum):
+ PR_OPEN = "PR Open"
+ PR_MERGED = "PR Merged"
+ PR_CLOSED = "PR Closed"
+ PR_DRAFT = "PR Draft"
+ ISSUE_OPEN = "Issue Open"
+ ISSUE_CLOSED = "Issue Closed"
+
+
+github_tag_description = {
+ GitHubTag.PR_OPEN.value: "This feature has a linked PR open",
+ GitHubTag.PR_MERGED.value: "This feature has a linked PR merged",
+ GitHubTag.PR_CLOSED.value: "This feature has a linked PR closed",
+ GitHubTag.PR_DRAFT.value: "This feature has a linked PR draft",
+ GitHubTag.ISSUE_OPEN.value: "This feature has a linked issue open",
+ GitHubTag.ISSUE_CLOSED.value: "This feature has a linked issue closed",
+}
diff --git a/api/integrations/github/github.py b/api/integrations/github/github.py
index ddbbdcc04698..6b3573903dde 100644
--- a/api/integrations/github/github.py
+++ b/api/integrations/github/github.py
@@ -4,6 +4,7 @@
from typing import Any
from core.helpers import get_current_site_url
+from django.db.models import Q
from django.utils.formats import get_format
from features.models import Feature, FeatureState, FeatureStateValue
@@ -17,14 +18,69 @@
LINK_SEGMENT_TITLE,
UNLINKED_FEATURE_TEXT,
UPDATED_FEATURE_TEXT,
+ GitHubEventType,
+ GitHubTag,
)
from integrations.github.dataclasses import GithubData
from integrations.github.models import GithubConfiguration
from integrations.github.tasks import call_github_app_webhook_for_feature_state
-from webhooks.webhooks import WebhookEventType
+from projects.tags.models import Tag, TagType
logger = logging.getLogger(__name__)
+tag_by_event_type = {
+ "pull_request": {
+ "closed": GitHubTag.PR_CLOSED.value,
+ "converted_to_draft": GitHubTag.PR_DRAFT.value,
+ "opened": GitHubTag.PR_OPEN.value,
+ "reopened": GitHubTag.PR_OPEN.value,
+ "ready_for_review": GitHubTag.PR_OPEN.value,
+ "merged": GitHubTag.PR_MERGED.value,
+ },
+ "issues": {
+ "closed": GitHubTag.ISSUE_CLOSED.value,
+ "opened": GitHubTag.ISSUE_OPEN.value,
+ "reopened": GitHubTag.ISSUE_OPEN.value,
+ },
+}
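+# e.g. a "pull_request" webhook with action "ready_for_review" maps to the
+# "PR Open" tag, while an "issues" webhook with action "closed" maps to "Issue Closed".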
+
+
+def tag_feature_per_github_event(
+ event_type: str, action: str, metadata: dict[str, Any]
+) -> None:
+    # Get Feature with external resource of type GITHUB and URL matching the resource URL
+ feature = Feature.objects.filter(
+ Q(external_resources__type="GITHUB_PR")
+ | Q(external_resources__type="GITHUB_ISSUE"),
+ external_resources__url=metadata.get("html_url"),
+ ).first()
+
+ if feature:
+        if (
+            event_type == "pull_request"
+            and action == "closed"
+            and metadata.get("merged")
+        ):
+            action = "merged"
+        if action not in tag_by_event_type[event_type]:
+            # Ignore webhook actions that have no tag mapping (e.g. "labeled")
+            return
+ # Get corresponding project Tag to tag the feature
+ github_tag = Tag.objects.get(
+ label=tag_by_event_type[event_type][action],
+ project=feature.project_id,
+ is_system_tag=True,
+ type=TagType.GITHUB.value,
+ )
+ tag_label_pattern = "Issue" if event_type == "issues" else "PR"
+        # Remove all GITHUB tags from the feature whose label starts with "Issue" or "PR", depending on event_type
+ feature.tags.remove(
+ *feature.tags.filter(
+ Q(type=TagType.GITHUB.value) & Q(label__startswith=tag_label_pattern)
+ )
+ )
+
+ feature.tags.add(github_tag)
+ feature.save()
+
def handle_installation_deleted(payload: dict[str, Any]) -> None:
installation_id = payload.get("installation", {}).get("id")
@@ -42,6 +98,10 @@ def handle_installation_deleted(payload: dict[str, Any]) -> None:
def handle_github_webhook_event(event_type: str, payload: dict[str, Any]) -> None:
if event_type == "installation" and payload.get("action") == "deleted":
handle_installation_deleted(payload)
+ elif event_type in tag_by_event_type:
+ action = str(payload.get("action"))
+ metadata = payload.get("issue", {}) or payload.get("pull_request", {})
+ tag_feature_per_github_event(event_type, action, metadata)
def generate_body_comment(
@@ -53,13 +113,12 @@ def generate_body_comment(
segment_name: str | None = None,
) -> str:
- is_update = event_type == WebhookEventType.FLAG_UPDATED.value
- is_removed = event_type == WebhookEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value
+ is_removed = event_type == GitHubEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value
is_segment_override_deleted = (
- event_type == WebhookEventType.SEGMENT_OVERRIDE_DELETED.value
+ event_type == GitHubEventType.SEGMENT_OVERRIDE_DELETED.value
)
- if event_type == WebhookEventType.FLAG_DELETED.value:
+ if event_type == GitHubEventType.FLAG_DELETED.value:
return DELETED_FEATURE_TEXT % (name)
if is_removed:
@@ -68,7 +127,12 @@ def generate_body_comment(
if is_segment_override_deleted and segment_name is not None:
return DELETED_SEGMENT_OVERRIDE_TEXT % (segment_name, name)
- result = UPDATED_FEATURE_TEXT % (name) if is_update else LINK_FEATURE_TITLE % (name)
+ result = ""
+ if event_type == GitHubEventType.FLAG_UPDATED.value:
+ result = UPDATED_FEATURE_TEXT % (name)
+ else:
+ result = LINK_FEATURE_TITLE % (name)
+
last_segment_name = ""
if len(feature_states) > 0 and not feature_states[0].get("segment_name"):
result += FEATURE_TABLE_HEADER
@@ -125,7 +189,7 @@ def generate_data(
if check_not_none(feature_state_value):
feature_env_data["feature_state_value"] = feature_state_value
- if type is not WebhookEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value:
+ if type is not GitHubEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value:
feature_env_data["environment_name"] = feature_state.environment.name
feature_env_data["enabled"] = feature_state.enabled
feature_env_data["last_updated"] = feature_state.updated_at.strftime(
@@ -150,7 +214,7 @@ def generate_data(
type=type,
url=(
url
- if type == WebhookEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value
+ if type == GitHubEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value
else None
),
feature_states=feature_states_list if feature_states else None,
diff --git a/api/integrations/github/migrations/0004_githubrepository_tagging_enabled.py b/api/integrations/github/migrations/0004_githubrepository_tagging_enabled.py
new file mode 100644
index 000000000000..a3ded07330d2
--- /dev/null
+++ b/api/integrations/github/migrations/0004_githubrepository_tagging_enabled.py
@@ -0,0 +1,18 @@
+# Generated by Django 3.2.25 on 2024-08-07 17:47
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('github', '0003_auto_20240528_0640'),
+ ]
+
+ operations = [
+ migrations.AddField(
+ model_name='githubrepository',
+ name='tagging_enabled',
+ field=models.BooleanField(default=False),
+ ),
+ ]
diff --git a/api/integrations/github/models.py b/api/integrations/github/models.py
index 532e0760b9ff..546b009d601e 100644
--- a/api/integrations/github/models.py
+++ b/api/integrations/github/models.py
@@ -3,8 +3,14 @@
from core.models import SoftDeleteExportableModel
from django.db import models
-from django_lifecycle import BEFORE_DELETE, LifecycleModelMixin, hook
+from django_lifecycle import (
+ AFTER_CREATE,
+ BEFORE_DELETE,
+ LifecycleModelMixin,
+ hook,
+)
+from integrations.github.constants import GITHUB_TAG_COLOR
from organisations.models import Organisation
logger: logging.Logger = logging.getLogger(name=__name__)
@@ -48,6 +54,7 @@ class GithubRepository(LifecycleModelMixin, SoftDeleteExportableModel):
null=False,
on_delete=models.CASCADE,
)
+ tagging_enabled = models.BooleanField(default=False)
class Meta:
constraints = [
@@ -84,3 +91,23 @@ def delete_feature_external_resources(
# Filter by url containing the repository owner and name
url__regex=pattern,
).delete()
+
+ @hook(AFTER_CREATE)
+ def create_github_tags(
+ self,
+ ) -> None:
+ from integrations.github.constants import (
+ GitHubTag,
+ github_tag_description,
+ )
+ from projects.tags.models import Tag, TagType
+
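+        # Create the six GitHub system tags ("PR Open" ... "Issue Closed") for the
+        # project; get_or_create keeps this idempotent if the hook runs again.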
+ for tag_label in GitHubTag:
+            Tag.objects.get_or_create(
+ color=GITHUB_TAG_COLOR,
+ description=github_tag_description[tag_label.value],
+ label=tag_label.value,
+ project=self.project,
+ is_system_tag=True,
+ type=TagType.GITHUB.value,
+ )
diff --git a/api/integrations/github/serializers.py b/api/integrations/github/serializers.py
index b3dc96c5263e..9d1cf3a81635 100644
--- a/api/integrations/github/serializers.py
+++ b/api/integrations/github/serializers.py
@@ -27,6 +27,7 @@ class Meta:
"project",
"repository_owner",
"repository_name",
+ "tagging_enabled",
)
read_only_fields = (
"id",
diff --git a/api/integrations/github/tasks.py b/api/integrations/github/tasks.py
index 8e94c9007b5d..970067daf0c4 100644
--- a/api/integrations/github/tasks.py
+++ b/api/integrations/github/tasks.py
@@ -6,8 +6,8 @@
from features.models import Feature
from integrations.github.client import post_comment_to_github
+from integrations.github.constants import GitHubEventType
from integrations.github.dataclasses import CallGithubData
-from webhooks.webhooks import WebhookEventType
logger = logging.getLogger(__name__)
@@ -29,8 +29,9 @@ def send_post_request(data: CallGithubData) -> None:
)
if (
- event_type == WebhookEventType.FLAG_UPDATED.value
- or event_type == WebhookEventType.FLAG_DELETED.value
+ event_type == GitHubEventType.FLAG_UPDATED.value
+ or event_type == GitHubEventType.FLAG_DELETED.value
+ or event_type == GitHubEventType.FLAG_UPDATED_FROM_GHA.value
):
for resource in data.feature_external_resources:
url = resource.get("url")
@@ -40,7 +41,7 @@ def send_post_request(data: CallGithubData) -> None:
installation_id, split_url[1], split_url[2], split_url[4], body
)
- elif event_type == WebhookEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value:
+ elif event_type == GitHubEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value:
url = data.github_data.url
pathname = urlparse(url).path
split_url = pathname.split("/")
@@ -80,8 +81,8 @@ def generate_feature_external_resources(
]
if (
- github_event_data.type == WebhookEventType.FLAG_DELETED.value
- or github_event_data.type == WebhookEventType.SEGMENT_OVERRIDE_DELETED.value
+ github_event_data.type == GitHubEventType.FLAG_DELETED.value
+ or github_event_data.type == GitHubEventType.SEGMENT_OVERRIDE_DELETED.value
):
feature_external_resources = generate_feature_external_resources(
list(
@@ -100,7 +101,7 @@ def generate_feature_external_resources(
if (
github_event_data.type
- == WebhookEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value
+ == GitHubEventType.FEATURE_EXTERNAL_RESOURCE_REMOVED.value
):
data = CallGithubData(
event_type=github_event_data.type,
diff --git a/api/integrations/github/views.py b/api/integrations/github/views.py
index 4cc2f3efea57..8d48a17dfa84 100644
--- a/api/integrations/github/views.py
+++ b/api/integrations/github/views.py
@@ -15,13 +15,17 @@
from integrations.github.client import (
ResourceType,
+ create_flagsmith_flag_label,
delete_github_installation,
fetch_github_repo_contributors,
fetch_github_repositories,
fetch_search_github_resource,
)
from integrations.github.exceptions import DuplicateGitHubIntegration
-from integrations.github.github import handle_github_webhook_event
+from integrations.github.github import (
+ handle_github_webhook_event,
+ tag_by_event_type,
+)
from integrations.github.helpers import github_webhook_payload_is_valid
from integrations.github.models import GithubConfiguration, GithubRepository
from integrations.github.permissions import HasPermissionToGithubConfiguration
@@ -136,10 +140,20 @@ def get_queryset(self):
except ValueError:
raise ValidationError({"github_pk": ["Must be an integer"]})
- def create(self, request, *args, **kwargs):
+ def create(self, request, *args, **kwargs) -> Response | None:
try:
- return super().create(request, *args, **kwargs)
+ response: Response = super().create(request, *args, **kwargs)
+ github_configuration: GithubConfiguration = GithubConfiguration.objects.get(
+ id=self.kwargs["github_pk"]
+ )
+ if request.data.get("tagging_enabled", False):
+ create_flagsmith_flag_label(
+ installation_id=github_configuration.installation_id,
+ owner=request.data.get("repository_owner"),
+ repo=request.data.get("repository_name"),
+ )
+ return response
except IntegrityError as e:
if re.search(
@@ -150,6 +164,19 @@ def create(self, request, *args, **kwargs):
detail="Duplication error. The GitHub repository already linked"
)
+ def update(self, request, *args, **kwargs) -> Response | None:
+ response: Response = super().update(request, *args, **kwargs)
+ github_configuration: GithubConfiguration = GithubConfiguration.objects.get(
+ id=self.kwargs["github_pk"]
+ )
+ if request.data.get("tagging_enabled", False):
+ create_flagsmith_flag_label(
+ installation_id=github_configuration.installation_id,
+ owner=request.data.get("repository_owner"),
+ repo=request.data.get("repository_name"),
+ )
+ return response
+
@api_view(["GET"])
@permission_classes([IsAuthenticated, HasPermissionToGithubConfiguration])
@@ -250,7 +277,7 @@ def github_webhook(request) -> Response:
payload_body=payload, secret_token=secret, signature_header=signature
):
data = json.loads(payload.decode("utf-8"))
- if github_event == "installation":
+ if github_event == "installation" or github_event in tag_by_event_type:
handle_github_webhook_event(event_type=github_event, payload=data)
return Response({"detail": "Event processed"}, status=200)
else:
diff --git a/api/organisations/subscription_info_cache.py b/api/organisations/subscription_info_cache.py
index 9fc0819b1d84..a5ec5a017e94 100644
--- a/api/organisations/subscription_info_cache.py
+++ b/api/organisations/subscription_info_cache.py
@@ -1,7 +1,9 @@
import typing
+from datetime import timedelta
from app_analytics.influxdb_wrapper import get_top_organisations
from django.conf import settings
+from django.utils import timezone
from .chargebee import get_subscription_metadata_from_id
from .models import Organisation, OrganisationSubscriptionInformationCache
@@ -70,9 +72,19 @@ def _update_caches_with_influx_data(
if not settings.INFLUXDB_TOKEN:
return
- for date_start, limit in (("-30d", ""), ("-7d", ""), ("-24h", "100")):
- key = f"api_calls_{date_start[1:]}"
+ for _date_start, limit in (("-30d", ""), ("-7d", ""), ("-24h", "100")):
+ key = f"api_calls_{_date_start[1:]}"
+
+ now = timezone.now()
+ if _date_start.endswith("d"):
+ date_start = now - timedelta(days=int(_date_start[1:-1]))
+ elif _date_start.endswith("h"):
+ date_start = now - timedelta(hours=int(_date_start[1:-1]))
+ else:
+ assert False, "Expecting either days (d) or hours (h)" # pragma: no cover
+
org_calls = get_top_organisations(date_start, limit)
+
for org_id, calls in org_calls.items():
subscription_info_cache = organisation_info_cache_dict.get(org_id)
if not subscription_info_cache:
diff --git a/api/organisations/task_helpers.py b/api/organisations/task_helpers.py
index 509811cbf622..65d382358615 100644
--- a/api/organisations/task_helpers.py
+++ b/api/organisations/task_helpers.py
@@ -69,6 +69,7 @@ def _send_api_usage_notification(
context = {
"organisation": organisation,
"matched_threshold": matched_threshold,
+ "grace_period": not hasattr(organisation, "breached_grace_period"),
}
send_mail(
@@ -118,10 +119,9 @@ def handle_api_usage_notification_for_organisation(organisation: Organisation) -
month_delta = relativedelta(now, billing_starts_at).months
period_starts_at = relativedelta(months=month_delta) + billing_starts_at
- days = relativedelta(now, period_starts_at).days
allowed_api_calls = subscription_cache.allowed_30d_api_calls
- api_usage = get_current_api_usage(organisation.id, f"-{days}d")
+ api_usage = get_current_api_usage(organisation.id, period_starts_at)
# For some reason the allowed API calls is set to 0 so default to the max free plan.
allowed_api_calls = allowed_api_calls or MAX_API_CALLS_IN_FREE_PLAN
@@ -140,6 +140,7 @@ def handle_api_usage_notification_for_organisation(organisation: Organisation) -
return
if OrganisationAPIUsageNotification.objects.filter(
+ organisation_id=organisation.id,
notified_at__gt=period_starts_at,
percent_usage__gte=matched_threshold,
).exists():
diff --git a/api/organisations/tasks.py b/api/organisations/tasks.py
index 0ebdc1f5d6ca..f03a9fba3e81 100644
--- a/api/organisations/tasks.py
+++ b/api/organisations/tasks.py
@@ -192,7 +192,7 @@ def charge_for_api_call_count_overages():
continue
subscription_cache = organisation.subscription_information_cache
- api_usage = get_current_api_usage(organisation.id, "30d")
+ api_usage = get_current_api_usage(organisation.id)
# Grace period for organisations < 200% of usage.
if api_usage / subscription_cache.allowed_30d_api_calls < 2.0:
@@ -306,7 +306,7 @@ def restrict_use_due_to_api_limit_grace_period_over() -> None:
OrganisationBreachedGracePeriod.objects.get_or_create(organisation=organisation)
subscription_cache = organisation.subscription_information_cache
- api_usage = get_current_api_usage(organisation.id, "30d")
+ api_usage = get_current_api_usage(organisation.id)
if api_usage / subscription_cache.allowed_30d_api_calls < 1.0:
logger.info(
f"API use for organisation {organisation.id} has fallen to below limit, so not restricting use."
diff --git a/api/organisations/templates/organisations/api_usage_notification.html b/api/organisations/templates/organisations/api_usage_notification.html
index 19b14d13b1b1..74deaac7fc7d 100644
--- a/api/organisations/templates/organisations/api_usage_notification.html
+++ b/api/organisations/templates/organisations/api_usage_notification.html
@@ -19,8 +19,8 @@
for overages after our first grace period of 30 days.
{% else %}
Please note that once 100% use has been breached, the serving of feature flags and admin access may
- be disabled after a 7-day grace period. Please reach out to support@flagsmith.com in order to upgrade
- your account.
+ be disabled{% if grace_period %} after a 7-day grace period{% endif %}. Please reach out to
+ support@flagsmith.com in order to upgrade your account.
{% endif %}
diff --git a/api/organisations/templates/organisations/api_usage_notification.txt b/api/organisations/templates/organisations/api_usage_notification.txt
index f5646e87abcd..e02fe3f967c5 100644
--- a/api/organisations/templates/organisations/api_usage_notification.txt
+++ b/api/organisations/templates/organisations/api_usage_notification.txt
@@ -8,8 +8,8 @@ If this is expected, no action is required. If you are expecting to go over, you
limits by reaching out to support@flagsmith.com. We will automatically charge for overages after our first grace period
of 30 days.
{% else %}
-Please note that once 100% use has been breached, the serving of feature flags and admin access may be disabled after a
-7-day grace period. Please reach out to support@flagsmith.com in order to upgrade your account.
+Please note that once 100% use has been breached, the serving of feature flags and admin access may be disabled{% if grace_period %}
+after a 7-day grace period{% endif %}. Please reach out to support@flagsmith.com in order to upgrade your account.
{% endif %}
Thank you!
diff --git a/api/organisations/templates/organisations/api_usage_notification_limit.html b/api/organisations/templates/organisations/api_usage_notification_limit.html
index 15f4bf9ea3e5..fa79e1196f96 100644
--- a/api/organisations/templates/organisations/api_usage_notification_limit.html
+++ b/api/organisations/templates/organisations/api_usage_notification_limit.html
@@ -18,8 +18,8 @@
more information. You can reach out to support@flagsmith.com if you’d like to take advantage of better
contracted rates.
{% else %}
- Please note that the serving of feature flags and admin access will be disabled after a 7 day grace
- period until the next subscription period. If you’d like to continue service you can upgrade your
+ Please note that the serving of feature flags and admin access will be disabled{% if grace_period %} after a 7 day grace
+ period{% endif %} until the next subscription period. If you’d like to continue service you can upgrade your
organisation’s account (see pricing page).
{% endif %}
diff --git a/api/organisations/templates/organisations/api_usage_notification_limit.txt b/api/organisations/templates/organisations/api_usage_notification_limit.txt
index faf301b74c22..65c8d1eaa48c 100644
--- a/api/organisations/templates/organisations/api_usage_notification_limit.txt
+++ b/api/organisations/templates/organisations/api_usage_notification_limit.txt
@@ -7,9 +7,9 @@ has reached {{ matched_threshold }}% of your API usage within the current subscr
We will charge for overages after our first grace period of 30 days. Please see the pricing page for more information.
You can reach out to support@flagsmith.com if you’d like to take advantage of better contracted rates.
{% else %}
-Please note that the serving of feature flags and admin access will be disabled after a 7 day grace period until the
-next subscription period. If you’d like to continue service you can upgrade your organisation’s account (see pricing
-page).
+Please note that the serving of feature flags and admin access will be disabled{% if grace_period %} after a 7 day
+grace period{% endif %} until the next subscription period. If you’d like to continue service you can upgrade your
+organisation’s account (see pricing page).
{% endif %}
Thank you!
diff --git a/api/permissions/permission_service.py b/api/permissions/permission_service.py
index aa938401f228..9cff1f13e486 100644
--- a/api/permissions/permission_service.py
+++ b/api/permissions/permission_service.py
@@ -134,12 +134,13 @@ def get_permitted_environments_for_user(
return queryset.prefetch_related("metadata")
return queryset
- base_filter = get_base_permission_filter(
+ environment_ids_from_base_filter = get_object_id_from_base_permission_filter(
user, Environment, permission_key, tag_ids=tag_ids
)
- filter_ = base_filter & Q(project=project)
+ queryset = Environment.objects.filter(
+ id__in=environment_ids_from_base_filter, project=project
+ )
- queryset = Environment.objects.filter(filter_)
if prefetch_metadata:
queryset = queryset.prefetch_related("metadata")
@@ -148,7 +149,7 @@ def get_permitted_environments_for_user(
# the select parameters. This leads to an N+1 query for
# lists of environments when description is included, as
# each environment object re-queries the DB separately.
- return queryset.distinct().defer("description")
+ return queryset.defer("description")
def get_permitted_environments_for_master_api_key(
diff --git a/api/projects/permissions.py b/api/projects/permissions.py
index 1b4bddc24a45..2df8503cc1d1 100644
--- a/api/projects/permissions.py
+++ b/api/projects/permissions.py
@@ -48,12 +48,14 @@ def has_permission(self, request, view):
subscription_metadata = (
organisation.subscription.get_subscription_metadata()
)
+
total_projects_created = Project.objects.filter(
organisation=organisation
).count()
if (
subscription_metadata.projects
and total_projects_created >= subscription_metadata.projects
+                and not getattr(request, "is_e2e", False)
):
return False
if organisation.restrict_project_create_to_admin:
diff --git a/api/projects/tags/migrations/0006_alter_tag_type.py b/api/projects/tags/migrations/0006_alter_tag_type.py
new file mode 100644
index 000000000000..84736c31bd3f
--- /dev/null
+++ b/api/projects/tags/migrations/0006_alter_tag_type.py
@@ -0,0 +1,18 @@
+# Generated by Django 3.2.25 on 2024-05-27 15:03
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('tags', '0005_add_tag_fields_for_stale_flags_logic'),
+ ]
+
+ operations = [
+ migrations.AlterField(
+ model_name='tag',
+ name='type',
+ field=models.CharField(choices=[('NONE', 'None'), ('STALE', 'Stale'), ('GITHUB', 'Github')], default='NONE', help_text='Field used to provide a consistent identifier for the FE and API to use for business logic.', max_length=100),
+ ),
+ ]
diff --git a/api/projects/tags/models.py b/api/projects/tags/models.py
index 597bebdd0d9a..35c75eecfb7e 100644
--- a/api/projects/tags/models.py
+++ b/api/projects/tags/models.py
@@ -7,6 +7,7 @@
class TagType(models.Choices):
NONE = "NONE"
STALE = "STALE"
+ GITHUB = "GITHUB"
class Tag(AbstractBaseExportableModel):
diff --git a/api/sales_dashboard/views.py b/api/sales_dashboard/views.py
index 27861b5b7022..53513ab87d8b 100644
--- a/api/sales_dashboard/views.py
+++ b/api/sales_dashboard/views.py
@@ -1,4 +1,5 @@
import json
+from datetime import timedelta
import re2 as re
from app_analytics.influxdb_wrapper import (
@@ -19,6 +20,7 @@
from django.shortcuts import get_object_or_404
from django.template import loader
from django.urls import reverse, reverse_lazy
+from django.utils import timezone
from django.utils.safestring import mark_safe
from django.views.generic import ListView
from django.views.generic.edit import FormView
@@ -163,10 +165,11 @@ def organisation_info(request: HttpRequest, organisation_id: int) -> HttpRespons
date_range = request.GET.get("date_range", "180d")
context["date_range"] = date_range
- date_start = f"-{date_range}"
- date_stop = "now()"
+ assert date_range.endswith("d")
+ now = timezone.now()
+ date_start = now - timedelta(days=int(date_range[:-1]))
event_list, labels = get_event_list_for_organisation(
- organisation_id, date_start, date_stop
+ organisation_id, date_start
)
context["event_list"] = event_list
context["traits"] = mark_safe(json.dumps(event_list["traits"]))
@@ -176,13 +179,16 @@ def organisation_info(request: HttpRequest, organisation_id: int) -> HttpRespons
json.dumps(event_list["environment-document"])
)
context["labels"] = mark_safe(json.dumps(labels))
+
+    date_starts = {
+        "24h": now - timedelta(days=1),
+        "7d": now - timedelta(days=7),
+        "30d": now - timedelta(days=30),
+    }
context["api_calls"] = {
# TODO: this could probably be reduced to a single influx request
# rather than 3
- range_: get_events_for_organisation(
- organisation_id, date_start=f"-{range_}"
- )
- for range_ in ("24h", "7d", "30d")
+ period: get_events_for_organisation(organisation_id, date_start=_date_start)
+ for period, _date_start in date_starts.items()
}
return HttpResponse(template.render(context, request))
diff --git a/api/scripts/run-docker.sh b/api/scripts/run-docker.sh
index 939a339b504b..88ae045c98f5 100755
--- a/api/scripts/run-docker.sh
+++ b/api/scripts/run-docker.sh
@@ -1,8 +1,14 @@
#!/bin/sh
set -e
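+# Wait for the database unless SKIP_WAIT_FOR_DB is set, e.g. for deployments
+# that run migrations in a separate release phase (SKIP_WAIT_FOR_DB=1).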
+function waitfordb() {
+ if [ -z "${SKIP_WAIT_FOR_DB}" ]; then
+ python manage.py waitfordb "$@"
+ fi
+}
+
function migrate () {
- python manage.py waitfordb && python manage.py migrate && python manage.py createcachetable
+ waitfordb && python manage.py migrate && python manage.py createcachetable
}
function serve() {
# configuration parameters for statsd. Docs can be found here:
@@ -10,7 +16,7 @@ function serve() {
export STATSD_PORT=${STATSD_PORT:-8125}
export STATSD_PREFIX=${STATSD_PREFIX:-flagsmith.api}
- python manage.py waitfordb
+ waitfordb
exec gunicorn --bind 0.0.0.0:8000 \
--worker-tmp-dir /dev/shm \
@@ -26,9 +32,9 @@ function serve() {
app.wsgi
}
function run_task_processor() {
- python manage.py waitfordb --waitfor 30 --migrations
+ waitfordb --waitfor 30 --migrations
if [[ -n "$ANALYTICS_DATABASE_URL" || -n "$DJANGO_DB_NAME_ANALYTICS" ]]; then
- python manage.py waitfordb --waitfor 30 --migrations --database analytics
+ waitfordb --waitfor 30 --migrations --database analytics
fi
RUN_BY_PROCESSOR=1 exec python manage.py runprocessor \
--sleepintervalms ${TASK_PROCESSOR_SLEEP_INTERVAL:-500} \
diff --git a/api/segments/migrations/0023_add_versioning_to_segments.py b/api/segments/migrations/0023_add_versioning_to_segments.py
index 0e7c8f0d5a03..16473f1cf587 100644
--- a/api/segments/migrations/0023_add_versioning_to_segments.py
+++ b/api/segments/migrations/0023_add_versioning_to_segments.py
@@ -1,9 +1,17 @@
# Generated by Django 3.2.25 on 2024-06-10 15:31
-import os
+from pathlib import Path
import django.db.models.deletion
from django.db import migrations, models
+parent_dir = Path(__file__).parent.resolve()
+
+with open(parent_dir / "sql/0023_add_versioning_to_segments.sql") as f:
+ segment_versioning_sql_forwards = f.read()
+
+with open(parent_dir / "sql/0023_add_versioning_to_segments_reverse.sql") as f:
+ segment_versioning_sql_reverse = f.read()
+
class Migration(migrations.Migration):
@@ -46,13 +54,7 @@ class Migration(migrations.Migration):
),
),
migrations.RunSQL(
- sql=open(
- os.path.join(
- os.path.dirname(__file__),
- "sql",
- "0023_add_versioning_to_segments.sql",
- )
- ).read(),
- reverse_sql=migrations.RunSQL.noop,
+ sql=segment_versioning_sql_forwards,
+ reverse_sql=segment_versioning_sql_reverse,
),
]
diff --git a/api/segments/migrations/sql/0023_add_versioning_to_segments_reverse.sql b/api/segments/migrations/sql/0023_add_versioning_to_segments_reverse.sql
new file mode 100644
index 000000000000..7d78f144d9ec
--- /dev/null
+++ b/api/segments/migrations/sql/0023_add_versioning_to_segments_reverse.sql
@@ -0,0 +1,3 @@
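+-- Reverse of 0023: soft-delete every segment row that is a version of another
+-- segment, i.e. anything whose version_of_id points at a different segment.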
+UPDATE segments_segment
+SET deleted_at = now()
+WHERE version_of_id <> id;
diff --git a/api/tests/integration/custom_auth/end_to_end/test_custom_auth_integration.py b/api/tests/integration/custom_auth/end_to_end/test_custom_auth_integration.py
index aec4b9b4327f..e274475119b5 100644
--- a/api/tests/integration/custom_auth/end_to_end/test_custom_auth_integration.py
+++ b/api/tests/integration/custom_auth/end_to_end/test_custom_auth_integration.py
@@ -12,7 +12,7 @@
from organisations.invites.models import Invite
from organisations.models import Organisation
-from users.models import FFAdminUser
+from users.models import FFAdminUser, SignUpType
def test_register_and_login_workflows(db: None, api_client: APIClient) -> None:
@@ -124,6 +124,7 @@ def test_can_register_with_invite_if_registration_disabled_without_invite(
"password": password,
"first_name": "test",
"last_name": "register",
+ "sign_up_type": SignUpType.INVITE_EMAIL.value,
}
Invite.objects.create(email=email, organisation=organisation)
diff --git a/api/tests/integration/e2etests/end_to_end/test_integration_e2e_tests.py b/api/tests/integration/e2etests/end_to_end/test_integration_e2e_tests.py
index c2d187007839..731ffe28d3a2 100644
--- a/api/tests/integration/e2etests/end_to_end/test_integration_e2e_tests.py
+++ b/api/tests/integration/e2etests/end_to_end/test_integration_e2e_tests.py
@@ -5,6 +5,7 @@
from rest_framework.test import APIClient
from organisations.models import Subscription
+from organisations.subscriptions.constants import SCALE_UP
from users.models import FFAdminUser
@@ -14,8 +15,8 @@ def test_e2e_teardown(settings, db) -> None:
token = "test-token"
register_url = "/api/v1/auth/users/"
settings.ENABLE_FE_E2E = True
-
- os.environ["E2E_TEST_AUTH_TOKEN"] = token
+ settings.E2E_TEST_AUTH_TOKEN = token
+ settings.MIDDLEWARE.append("e2etests.middleware.E2ETestMiddleware")
client = APIClient(HTTP_X_E2E_TEST_AUTH_TOKEN=token)
@@ -41,7 +42,9 @@ def test_e2e_teardown(settings, db) -> None:
for subscription in Subscription.objects.filter(
organisation__in=e2e_user.organisations.all()
):
- assert subscription.max_seats == 2
+ assert subscription.max_seats == 8
+ assert subscription.plan == SCALE_UP
+ assert subscription.subscription_id == "test_subscription_id"
def test_e2e_teardown_with_incorrect_token(settings, db):
diff --git a/api/tests/integration/edge_api/identities/conftest.py b/api/tests/integration/edge_api/identities/conftest.py
index f22e02366dda..298bd6165ee6 100644
--- a/api/tests/integration/edge_api/identities/conftest.py
+++ b/api/tests/integration/edge_api/identities/conftest.py
@@ -1,4 +1,13 @@
+from typing import Any
+
import pytest
+from boto3.dynamodb.conditions import Key
+from flag_engine.identities.models import IdentityModel
+
+from edge_api.identities.models import EdgeIdentity
+from environments.dynamodb.wrappers.environment_wrapper import (
+ DynamoEnvironmentV2Wrapper,
+)
@pytest.fixture()
@@ -6,3 +15,26 @@ def webhook_mock(mocker):
return mocker.patch(
"edge_api.identities.serializers.call_environment_webhook_for_feature_state_change"
)
+
+
+@pytest.fixture()
+def identity_overrides_v2(
+ dynamo_enabled_environment: int,
+ identity_document_without_fs: dict[str, Any],
+ identity_document: dict[str, Any],
+ dynamodb_wrapper_v2: DynamoEnvironmentV2Wrapper,
+) -> list[str]:
+ edge_identity = EdgeIdentity.from_identity_document(identity_document_without_fs)
+ for feature_override in IdentityModel.model_validate(
+ identity_document
+ ).identity_features:
+ edge_identity.add_feature_override(feature_override)
+ edge_identity.save()
+ return [
+ item["document_key"]
+ for item in dynamodb_wrapper_v2.query_get_all_items(
+ KeyConditionExpression=Key("environment_id").eq(
+ str(dynamo_enabled_environment)
+ ),
+ )
+ ]
diff --git a/api/tests/integration/edge_api/identities/test_edge_identity_viewset.py b/api/tests/integration/edge_api/identities/test_edge_identity_viewset.py
index ad9a95385a93..ffdadfe5d6a1 100644
--- a/api/tests/integration/edge_api/identities/test_edge_identity_viewset.py
+++ b/api/tests/integration/edge_api/identities/test_edge_identity_viewset.py
@@ -1,11 +1,20 @@
import json
import urllib
+from typing import Any
+from boto3.dynamodb.conditions import Key
from django.urls import reverse
from rest_framework import status
from rest_framework.exceptions import NotFound
+from rest_framework.test import APIClient
from edge_api.identities.views import EdgeIdentityViewSet
+from environments.dynamodb.wrappers.environment_wrapper import (
+ DynamoEnvironmentV2Wrapper,
+)
+from environments.dynamodb.wrappers.identity_wrapper import (
+ DynamoIdentityWrapper,
+)
def test_get_identities_returns_bad_request_if_dynamo_is_not_enabled(
@@ -125,12 +134,15 @@ def test_create_identity_returns_400_if_identity_already_exists(
def test_delete_identity(
- admin_client,
- dynamo_enabled_environment,
- environment_api_key,
- identity_document,
- edge_identity_dynamo_wrapper_mock,
-):
+ admin_client: APIClient,
+ dynamo_enabled_environment: int,
+ environment_api_key: str,
+ identity_document_without_fs: dict[str, Any],
+ identity_document: dict[str, Any],
+ dynamodb_identity_wrapper: DynamoIdentityWrapper,
+ dynamodb_wrapper_v2: DynamoEnvironmentV2Wrapper,
+ identity_overrides_v2: list[str],
+) -> None:
# Given
identity_uuid = identity_document["identity_uuid"]
url = reverse(
@@ -138,20 +150,22 @@ def test_delete_identity(
args=[environment_api_key, identity_uuid],
)
- edge_identity_dynamo_wrapper_mock.get_item_from_uuid_or_404.return_value = (
- identity_document
- )
# When
response = admin_client.delete(url)
# Then
assert response.status_code == status.HTTP_204_NO_CONTENT
- edge_identity_dynamo_wrapper_mock.get_item_from_uuid_or_404.assert_called_with(
- identity_uuid
- )
- edge_identity_dynamo_wrapper_mock.delete_item.assert_called_with(
- identity_document["composite_key"]
+ assert not dynamodb_identity_wrapper.query_items(
+ IndexName="identity_uuid-index",
+ KeyConditionExpression=Key("identity_uuid").eq(identity_uuid),
+ )["Count"]
+ assert not list(
+ dynamodb_wrapper_v2.query_get_all_items(
+ KeyConditionExpression=Key("environment_id").eq(
+ str(dynamo_enabled_environment)
+ )
+ )
)
diff --git a/api/tests/integration/environments/identities/test_integration_identities.py b/api/tests/integration/environments/identities/test_integration_identities.py
index 445eedacfbb4..0537fb66f6ef 100644
--- a/api/tests/integration/environments/identities/test_integration_identities.py
+++ b/api/tests/integration/environments/identities/test_integration_identities.py
@@ -1,8 +1,11 @@
+import hashlib
import json
+from typing import Any, Generator
from unittest import mock
import pytest
from django.urls import reverse
+from pytest_lazyfixture import lazy_fixture
from rest_framework import status
from rest_framework.test import APIClient
@@ -224,13 +227,64 @@ def test_get_feature_states_for_identity_only_makes_one_query_to_get_mv_feature_
assert len(second_identity_response_json["flags"]) == 3
-def test_get_feature_states_for_identity__transient_identity__segment_match_expected(
+@pytest.fixture
+def existing_identity_identifier_data(
+ identity_identifier: str,
+ identity: int,
+) -> dict[str, Any]:
+ return {"identifier": identity_identifier}
+
+
+@pytest.fixture
+def transient_identifier(
+ segment_condition_property: str,
+ segment_condition_value: str,
+) -> Generator[str, None, None]:
+ return hashlib.sha256(
+ f"avalue_a{segment_condition_property}{segment_condition_value}".encode()
+ ).hexdigest()
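+# Mirrors the expected server-side derivation of a deterministic identifier for
+# transient identities: a sha256 hex digest over the concatenated trait keys and
+# values posted in the test below (assumed stable across requests).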
+
+
+@pytest.mark.parametrize(
+ "transient_data",
+ [
+ pytest.param({"transient": True}, id="with-transient-true"),
+ pytest.param({"transient": False}, id="with-transient-false"),
+ pytest.param({}, id="missing-transient"),
+ ],
+)
+@pytest.mark.parametrize(
+ "identifier_data,expected_identifier",
+ [
+ pytest.param(
+ lazy_fixture("existing_identity_identifier_data"),
+ lazy_fixture("identity_identifier"),
+ id="existing-identifier",
+ ),
+ pytest.param({"identifier": "unseen"}, "unseen", id="new-identifier"),
+ pytest.param(
+ {"identifier": ""},
+ lazy_fixture("transient_identifier"),
+ id="blank-identifier",
+ ),
+ pytest.param(
+ {"identifier": None},
+ lazy_fixture("transient_identifier"),
+ id="null-identifier",
+ ),
+ pytest.param({}, lazy_fixture("transient_identifier"), id="missing-identifier"),
+ ],
+)
+def test_get_feature_states_for_identity__segment_match_expected(
sdk_client: APIClient,
feature: int,
segment: int,
segment_condition_property: str,
segment_condition_value: str,
segment_featurestate: int,
+ identifier_data: dict[str, Any],
+ transient_data: dict[str, Any],
+ expected_identifier: str,
) -> None:
# Given
url = reverse("api-v1:sdk-identities")
@@ -242,14 +296,15 @@ def test_get_feature_states_for_identity__transient_identity__segment_match_expe
url,
data=json.dumps(
{
- "identifier": "unseen",
+ **identifier_data,
+ **transient_data,
"traits": [
{
"trait_key": segment_condition_property,
"trait_value": segment_condition_value,
- }
+ },
+ {"trait_key": "a", "trait_value": "value_a"},
],
- "transient": True,
}
),
content_type="application/json",
@@ -258,6 +313,7 @@ def test_get_feature_states_for_identity__transient_identity__segment_match_expe
# Then
assert response.status_code == status.HTTP_200_OK
response_json = response.json()
+ assert response_json["identifier"] == expected_identifier
assert (
flag_data := next(
(
@@ -272,6 +328,29 @@ def test_get_feature_states_for_identity__transient_identity__segment_match_expe
assert flag_data["feature_state_value"] == "segment override"
+def test_get_feature_states_for_identity__empty_traits__random_identifier_expected(
+ sdk_client: APIClient,
+ environment: int,
+) -> None:
+ # Given
+ url = reverse("api-v1:sdk-identities")
+
+ # When
+ response_1 = sdk_client.post(
+ url,
+ data=json.dumps({"traits": []}),
+ content_type="application/json",
+ )
+ response_2 = sdk_client.post(
+ url,
+ data=json.dumps({"traits": []}),
+ content_type="application/json",
+ )
+
+ # Then
+ assert response_1.json()["identifier"] != response_2.json()["identifier"]
+
+
def test_get_feature_states_for_identity__transient_trait__segment_match_expected(
sdk_client: APIClient,
feature: int,
diff --git a/api/tests/unit/app_analytics/test_unit_app_analytics_cache.py b/api/tests/unit/app_analytics/test_unit_app_analytics_cache.py
index de5f9114d589..88ccc0cbe852 100644
--- a/api/tests/unit/app_analytics/test_unit_app_analytics_cache.py
+++ b/api/tests/unit/app_analytics/test_unit_app_analytics_cache.py
@@ -1,12 +1,18 @@
-from app_analytics.cache import CACHE_FLUSH_INTERVAL, APIUsageCache
+from app_analytics.cache import APIUsageCache, FeatureEvaluationCache
from app_analytics.models import Resource
from django.utils import timezone
from freezegun import freeze_time
+from pytest_django.fixtures import SettingsWrapper
from pytest_mock import MockerFixture
-def test_api_usage_cache(mocker: MockerFixture) -> None:
+def test_api_usage_cache(
+ mocker: MockerFixture,
+ settings: SettingsWrapper,
+) -> None:
# Given
+ settings.PG_API_USAGE_CACHE_SECONDS = 60
+
cache = APIUsageCache()
now = timezone.now()
mocked_track_request_task = mocker.patch("app_analytics.cache.track_request")
@@ -25,7 +31,7 @@ def test_api_usage_cache(mocker: MockerFixture) -> None:
assert not mocked_track_request_task.called
# Now, let's move the time forward
- frozen_time.tick(CACHE_FLUSH_INTERVAL + 1)
+ frozen_time.tick(settings.PG_API_USAGE_CACHE_SECONDS + 1)
# let's track another request (to trigger flush)
cache.track_request(
@@ -71,3 +77,92 @@ def test_api_usage_cache(mocker: MockerFixture) -> None:
# finally, make sure track_request task was not called
assert not mocked_track_request_task.called
+
+
+def test_feature_evaluation_cache(
+ mocker: MockerFixture,
+ settings: SettingsWrapper,
+):
+ # Given
+ settings.FEATURE_EVALUATION_CACHE_SECONDS = 60
+ settings.USE_POSTGRES_FOR_ANALYTICS = False
+ settings.INFLUXDB_TOKEN = "token"
+
+ mocked_track_evaluation_task = mocker.patch(
+ "app_analytics.cache.track_feature_evaluation"
+ )
+ mocked_track_feature_evaluation_influxdb_task = mocker.patch(
+ "app_analytics.cache.track_feature_evaluation_influxdb"
+ )
+ environment_1_id = 1
+ environment_2_id = 2
+ feature_1_name = "feature_1_name"
+ feature_2_name = "feature_2_name"
+
+ cache = FeatureEvaluationCache()
+ now = timezone.now()
+
+ with freeze_time(now) as frozen_time:
+ # Track some feature evaluations
+ for _ in range(10):
+ cache.track_feature_evaluation(environment_1_id, feature_1_name, 1)
+ cache.track_feature_evaluation(environment_1_id, feature_2_name, 1)
+ cache.track_feature_evaluation(environment_2_id, feature_2_name, 1)
+
+ # Make sure the internal tasks were not called
+ assert not mocked_track_evaluation_task.delay.called
+ assert not mocked_track_feature_evaluation_influxdb_task.delay.called
+
+ # Now, let's move the time forward
+ frozen_time.tick(settings.FEATURE_EVALUATION_CACHE_SECONDS + 1)
+
+        # track another evaluation (to trigger cache flush)
+ cache.track_feature_evaluation(environment_1_id, feature_1_name, 1)
+
+ # Then
+ mocked_track_feature_evaluation_influxdb_task.delay.assert_has_calls(
+ [
+ mocker.call(
+ kwargs={
+ "environment_id": environment_1_id,
+ "feature_evaluations": {
+ feature_1_name: 11,
+ feature_2_name: 10,
+ },
+ },
+ ),
+ mocker.call(
+ kwargs={
+ "environment_id": environment_2_id,
+ "feature_evaluations": {feature_2_name: 10},
+ },
+ ),
+ ]
+ )
+ # task responsible for tracking evaluation using postgres was not called
+ assert not mocked_track_evaluation_task.delay.called
+
+ # Next, let's enable postgres tracking
+ settings.USE_POSTGRES_FOR_ANALYTICS = True
+
+    # reset the mock
+ mocked_track_feature_evaluation_influxdb_task.reset_mock()
+
+ # Track another evaluation
+ cache.track_feature_evaluation(environment_1_id, feature_1_name, 1)
+
+ # move time forward again
+ frozen_time.tick(settings.FEATURE_EVALUATION_CACHE_SECONDS + 1)
+
+    # track another one (to trigger cache flush)
+ cache.track_feature_evaluation(environment_1_id, feature_1_name, 1)
+
+ # Assert that the call was made with only the data tracked after the flush interval.
+ mocked_track_evaluation_task.delay.assert_called_once_with(
+ kwargs={
+ "environment_id": environment_1_id,
+ "feature_evaluations": {feature_1_name: 2},
+ }
+ )
+ # and the task for influx was not called
+ assert not mocked_track_feature_evaluation_influxdb_task.delay.called
diff --git a/api/tests/unit/app_analytics/test_unit_app_analytics_influxdb_wrapper.py b/api/tests/unit/app_analytics/test_unit_app_analytics_influxdb_wrapper.py
index 85f8607d7947..d82b6f162978 100644
--- a/api/tests/unit/app_analytics/test_unit_app_analytics_influxdb_wrapper.py
+++ b/api/tests/unit/app_analytics/test_unit_app_analytics_influxdb_wrapper.py
@@ -1,4 +1,4 @@
-from datetime import date, timedelta
+from datetime import date, datetime, timedelta
from typing import Generator, Type
from unittest import mock
from unittest.mock import MagicMock
@@ -9,11 +9,13 @@
from app_analytics.influxdb_wrapper import (
InfluxDBWrapper,
build_filter_string,
+ get_current_api_usage,
get_event_list_for_organisation,
get_events_for_organisation,
get_feature_evaluation_data,
get_multiple_event_list_for_feature,
get_multiple_event_list_for_organisation,
+ get_range_bucket_mappings,
get_top_organisations,
get_usage_data,
)
@@ -21,6 +23,7 @@
from django.utils import timezone
from influxdb_client.client.exceptions import InfluxDBError
from influxdb_client.rest import ApiException
+from pytest_django.fixtures import SettingsWrapper
from pytest_mock import MockerFixture
from urllib3.exceptions import HTTPError
@@ -84,10 +87,12 @@ def test_write_handles_errors(
# but the exception was not raised
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_influx_db_query_when_get_events_then_query_api_called(monkeypatch):
expected_query = (
(
- f'from(bucket:"{read_bucket}") |> range(start: -30d, stop: now()) '
+ f'from(bucket:"{read_bucket}") |> range(start: 2022-12-20T09:09:47.325132+00:00, '
+ "stop: 2023-01-19T09:09:47.325132+00:00) "
f'|> filter(fn:(r) => r._measurement == "api_call") '
f'|> filter(fn: (r) => r["_field"] == "request_count") '
f'|> filter(fn: (r) => r["organisation_id"] == "{org_id}") '
@@ -117,10 +122,11 @@ def test_influx_db_query_when_get_events_then_query_api_called(monkeypatch):
assert call[2]["query"].replace(" ", "").replace("\n", "") == expected_query
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_influx_db_query_when_get_events_list_then_query_api_called(monkeypatch):
query = (
f'from(bucket:"{read_bucket}") '
- f"|> range(start: -30d, stop: now()) "
+ f"|> range(start: 2022-12-20T09:09:47.325132+00:00, stop: 2023-01-19T09:09:47.325132+00:00) "
f'|> filter(fn:(r) => r._measurement == "api_call") '
f'|> filter(fn: (r) => r["organisation_id"] == "{org_id}") '
f'|> drop(columns: ["organisation", "organisation_id", "type", "project", '
@@ -180,6 +186,7 @@ def test_influx_db_query_when_get_events_list_then_query_api_called(monkeypatch)
),
),
)
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_influx_db_query_when_get_multiple_events_for_organisation_then_query_api_called(
monkeypatch, project_id, environment_id, expected_filters
):
@@ -187,7 +194,7 @@ def test_influx_db_query_when_get_multiple_events_for_organisation_then_query_ap
expected_query = (
(
f'from(bucket:"{read_bucket}") '
- "|> range(start: -30d, stop: now()) "
+ "|> range(start: 2022-12-20T09:09:47.325132+00:00, stop: 2023-01-19T09:09:47.325132+00:00) "
f"{build_filter_string(expected_filters)}"
'|> drop(columns: ["organisation", "organisation_id", "type", "project", '
'"project_id", "environment", "environment_id", "host"]) '
@@ -217,12 +224,13 @@ def test_influx_db_query_when_get_multiple_events_for_organisation_then_query_ap
assert call[2]["query"].replace(" ", "").replace("\n", "") == expected_query
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_influx_db_query_when_get_multiple_events_for_feature_then_query_api_called(
monkeypatch,
):
query = (
f'from(bucket:"{read_bucket}") '
- "|> range(start: -30d, stop: now()) "
+ "|> range(start: 2022-12-20T09:09:47.325132+00:00, stop: 2023-01-19T09:09:47.325132+00:00) "
'|> filter(fn:(r) => r._measurement == "feature_evaluation") '
'|> filter(fn: (r) => r["_field"] == "request_count") '
f'|> filter(fn: (r) => r["environment_id"] == "{env_id}") '
@@ -248,6 +256,7 @@ def test_influx_db_query_when_get_multiple_events_for_feature_then_query_api_cal
mock_query_api.query.assert_called_once_with(org=influx_org, query=query)
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_get_usage_data(mocker):
# Given
influx_data = [
@@ -276,12 +285,14 @@ def test_get_usage_data(mocker):
usage_data = get_usage_data(org_id)
# Then
+ date_start = datetime.fromisoformat("2022-12-20T09:09:47.325132+00:00")
+ date_stop = datetime.fromisoformat("2023-01-19T09:09:47.325132+00:00")
mocked_get_multiple_event_list_for_organisation.assert_called_once_with(
organisation_id=org_id,
environment_id=None,
project_id=None,
- date_start="-30d",
- date_stop="now()",
+ date_start=date_start,
+ date_stop=date_stop,
)
assert len(usage_data) == 2
@@ -299,6 +310,7 @@ def test_get_usage_data(mocker):
assert usage_data[1].environment_document == 10
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_get_feature_evaluation_data(mocker):
# Given
influx_data = [
@@ -318,8 +330,9 @@ def test_get_feature_evaluation_data(mocker):
)
# Then
+ date_start = datetime.fromisoformat("2022-12-20T09:09:47.325132+00:00")
mocked_get_multiple_event_list_for_feature.assert_called_once_with(
- feature_name=feature_name, environment_id=env_id, date_start="-30d"
+ feature_name=feature_name, environment_id=env_id, date_start=date_start
)
assert len(feature_evaluation_data) == 2
@@ -331,17 +344,17 @@ def test_get_feature_evaluation_data(mocker):
assert feature_evaluation_data[1].count == 200
-@pytest.mark.parametrize("date_stop", ["now()", "-5d"])
@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_get_event_list_for_organisation_with_date_stop_set_to_now_and_previously(
- date_stop: str,
mocker: MockerFixture,
organisation: Organisation,
) -> None:
# Given
+
now = timezone.now()
one_day_ago = now - timedelta(days=1)
two_days_ago = now - timedelta(days=2)
+ date_stop = now
record_mock1 = mock.MagicMock()
record_mock1.__getitem__.side_effect = lambda key: {
@@ -377,6 +390,7 @@ def test_get_event_list_for_organisation_with_date_stop_set_to_now_and_previousl
assert labels == ["2023-01-18", "2023-01-17"]
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
@pytest.mark.parametrize("limit", ["10", ""])
def test_get_top_organisations(
limit: str,
@@ -399,9 +413,11 @@ def test_get_top_organisations(
)
influx_mock.return_value = [result]
+ now = timezone.now()
+ date_start = now - timedelta(days=30)
# When
- dataset = get_top_organisations(date_start="-30d", limit=limit)
+ dataset = get_top_organisations(date_start=date_start, limit=limit)
# Then
assert dataset == {123: 23, 456: 43}
@@ -409,9 +425,10 @@ def test_get_top_organisations(
influx_mock.assert_called_once()
influx_query_call = influx_mock.call_args
assert influx_query_call.kwargs["bucket"] == "test_bucket_downsampled_1h"
- assert influx_query_call.kwargs["date_start"] == "-30d"
+ assert influx_query_call.kwargs["date_start"] == date_start
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_get_top_organisations_value_error(
mocker: MockerFixture,
) -> None:
@@ -432,9 +449,11 @@ def test_get_top_organisations_value_error(
)
influx_mock.return_value = [result]
+ now = timezone.now()
+ date_start = now - timedelta(days=30)
# When
- dataset = get_top_organisations(date_start="-30d")
+ dataset = get_top_organisations(date_start=date_start)
# Then
# The wrongly typed data does not stop the remaining data
@@ -444,10 +463,90 @@ def test_get_top_organisations_value_error(
def test_early_return_for_empty_range_for_influx_query_manager() -> None:
# When
+ now = timezone.now()
results = InfluxDBWrapper.influx_query_manager(
- date_start="-0d",
- date_stop="now()",
+ date_start=now,
+ date_stop=now,
)
# Then
assert results == []
+
+
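+# get_range_bucket_mappings picks a downsampled bucket for the query range:
+# ranges within the last 10 days read from the 15-minute bucket, older ones
+# from the 1-hour bucket.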
+def test_get_range_bucket_mappings_when_less_than_10_days(
+ settings: SettingsWrapper,
+) -> None:
+ # Given
+ two_days = timezone.now() - timedelta(days=2)
+
+ # When
+ result = get_range_bucket_mappings(two_days)
+
+ # Then
+ assert result == settings.INFLUXDB_BUCKET + "_downsampled_15m"
+
+
+def test_get_range_bucket_mappings_when_more_than_10_days(
+ settings: SettingsWrapper,
+) -> None:
+ # Given
+ twelve_days = timezone.now() - timedelta(days=12)
+
+ # When
+ result = get_range_bucket_mappings(twelve_days)
+
+ # Then
+ assert result == settings.INFLUXDB_BUCKET + "_downsampled_1h"
+
+
+def test_influx_query_manager_when_date_start_is_set_to_none(
+ mocker: MockerFixture,
+) -> None:
+ # Given
+ mock_client = mocker.patch("app_analytics.influxdb_wrapper.influxdb_client")
+
+ # When
+ InfluxDBWrapper.influx_query_manager()
+
+ # Then
+ mock_client.query_api.assert_called_once()
+
+
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
+def test_get_top_organisations_when_date_start_is_set_to_none(
+ mocker: MockerFixture,
+) -> None:
+ # Given
+ influx_mock = mocker.patch(
+ "app_analytics.influxdb_wrapper.InfluxDBWrapper.influx_query_manager"
+ )
+ now = timezone.now()
+ date_start = now - timedelta(days=30)
+
+ # When
+ get_top_organisations()
+
+ # Then
+ influx_query_call = influx_mock.call_args
+ assert influx_query_call.kwargs["bucket"] == "test_bucket_downsampled_1h"
+ assert influx_query_call.kwargs["date_start"] == date_start
+
+
+def test_get_current_api_usage(mocker: MockerFixture) -> None:
+ # Given
+ influx_mock = mocker.patch(
+ "app_analytics.influxdb_wrapper.InfluxDBWrapper.influx_query_manager"
+ )
+ record_mock = mock.MagicMock()
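+ # Mimic an Influx record; organisation values follow the "<id>-<name>"
+ # convention used elsewhere in these tests.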
+ record_mock.values = {"organisation": "1-TestCorp"}
+ record_mock.get_value.return_value = 43
+
+ query_result = mock.MagicMock()
+ query_result.records = [record_mock]
+ influx_mock.return_value = [query_result]
+
+ # When
+ result = get_current_api_usage(organisation_id=1)
+
+ # Then
+ assert result == 43
diff --git a/api/tests/unit/app_analytics/test_unit_app_analytics_views.py b/api/tests/unit/app_analytics/test_unit_app_analytics_views.py
index a9276fa868ea..371f54bf329a 100644
--- a/api/tests/unit/app_analytics/test_unit_app_analytics_views.py
+++ b/api/tests/unit/app_analytics/test_unit_app_analytics_views.py
@@ -36,8 +36,8 @@ def test_sdk_analytics_does_not_allow_bad_data(mocker, settings, environment):
view = SDKAnalyticsFlags(request=request)
- mocked_track_feature_eval = mocker.patch(
- "app_analytics.views.track_feature_evaluation_influxdb"
+ mocked_feature_eval_cache = mocker.patch(
+ "app_analytics.views.feature_evaluation_cache"
)
# When
@@ -45,34 +45,7 @@ def test_sdk_analytics_does_not_allow_bad_data(mocker, settings, environment):
# Then
assert response.status_code == status.HTTP_200_OK
- mocked_track_feature_eval.assert_not_called()
-
-
-def test_sdk_analytics_allows_valid_data(mocker, settings, environment, feature):
- # Given
- settings.INFLUXDB_TOKEN = "some-token"
-
- data = {feature.name: 12}
- request = mocker.MagicMock(
- data=data,
- environment=environment,
- query_params={},
- )
-
- view = SDKAnalyticsFlags(request=request)
-
- mocked_track_feature_eval = mocker.patch(
- "app_analytics.views.track_feature_evaluation_influxdb"
- )
-
- # When
- response = view.post(request)
-
- # Then
- assert response.status_code == status.HTTP_200_OK
- mocked_track_feature_eval.run_in_thread.assert_called_once_with(
- args=(environment.id, data)
- )
+ mocked_feature_eval_cache.track_feature_evaluation.assert_not_called()
def test_get_usage_data(mocker, admin_client, organisation):
@@ -168,8 +141,8 @@ def test_get_usage_data__current_billing_period(
organisation_id=organisation.id,
environment_id=None,
project_id=None,
- date_start="-28d",
- date_stop="now()",
+ date_start=four_weeks_ago,
+ date_stop=now,
)
@@ -195,6 +168,7 @@ def test_get_usage_data__previous_billing_period(
now = timezone.now()
week_from_now = now + timedelta(days=7)
four_weeks_ago = now - timedelta(days=28)
+ target_start_at = now - timedelta(days=59)
OrganisationSubscriptionInformationCache.objects.create(
organisation=organisation,
@@ -229,8 +203,8 @@ def test_get_usage_data__previous_billing_period(
organisation_id=organisation.id,
environment_id=None,
project_id=None,
- date_start="-59d",
- date_stop="-28d",
+ date_start=target_start_at,
+ date_stop=four_weeks_ago,
)
@@ -256,7 +230,7 @@ def test_get_usage_data__ninety_day_period(
now = timezone.now()
week_from_now = now + timedelta(days=7)
four_weeks_ago = now - timedelta(days=28)
-
+ ninety_days_ago = now - timedelta(days=90)
OrganisationSubscriptionInformationCache.objects.create(
organisation=organisation,
current_billing_term_starts_at=four_weeks_ago,
@@ -290,8 +264,8 @@ def test_get_usage_data__ninety_day_period(
organisation_id=organisation.id,
environment_id=None,
project_id=None,
- date_start="-90d",
- date_stop="now()",
+ date_start=ninety_days_ago,
+ date_stop=now,
)
@@ -432,24 +406,20 @@ def test_set_sdk_analytics_flags_without_identifier(
assert feature_evaluation_raw.evaluation_count is feature_request_count
-def test_set_sdk_analytics_flags_v1_to_influxdb(
+def test_sdk_analytics_flags_v1(
api_client: APIClient,
environment: Environment,
feature: Feature,
- identity: Identity,
- settings: SettingsWrapper,
mocker: MockerFixture,
) -> None:
# Given
- settings.INFLUXDB_TOKEN = "some-token"
-
url = reverse("api-v1:analytics-flags")
api_client.credentials(HTTP_X_ENVIRONMENT_KEY=environment.api_key)
feature_request_count = 2
data = {feature.name: feature_request_count}
- mocked_track_feature_eval = mocker.patch(
- "app_analytics.views.track_feature_evaluation_influxdb"
+ mocked_feature_evaluation_cache = mocker.patch(
+ "app_analytics.views.feature_evaluation_cache"
)
# When
@@ -459,9 +429,6 @@ def test_set_sdk_analytics_flags_v1_to_influxdb(
# Then
assert response.status_code == status.HTTP_200_OK
- mocked_track_feature_eval.run_in_thread.assert_called_once_with(
- args=(
- environment.id,
- data,
- )
+ mocked_feature_evaluation_cache.track_feature_evaluation.assert_called_once_with(
+ environment.id, feature.name, feature_request_count
)
diff --git a/api/tests/unit/custom_auth/conftest.py b/api/tests/unit/custom_auth/conftest.py
new file mode 100644
index 000000000000..17d5f760c4c1
--- /dev/null
+++ b/api/tests/unit/custom_auth/conftest.py
@@ -0,0 +1,9 @@
+import pytest
+
+from organisations.invites.models import InviteLink
+from organisations.models import Organisation
+
+
+@pytest.fixture()
+def invite_link(organisation: Organisation) -> InviteLink:
+ return InviteLink.objects.create(organisation=organisation)
diff --git a/api/tests/unit/custom_auth/oauth/test_unit_oauth_serializers.py b/api/tests/unit/custom_auth/oauth/test_unit_oauth_serializers.py
index bd21e9fc5d08..11a0519e0b6f 100644
--- a/api/tests/unit/custom_auth/oauth/test_unit_oauth_serializers.py
+++ b/api/tests/unit/custom_auth/oauth/test_unit_oauth_serializers.py
@@ -1,5 +1,7 @@
+from typing import Type
from unittest import mock
+import pytest
from django.test import RequestFactory
from django.utils import timezone
from pytest_django.fixtures import SettingsWrapper
@@ -11,6 +13,7 @@
GoogleLoginSerializer,
OAuthLoginSerializer,
)
+from organisations.invites.models import InviteLink
from users.models import FFAdminUser, SignUpType
@@ -128,7 +131,11 @@ def test_OAuthLoginSerializer_calls_is_authentication_method_valid_correctly_if_
def test_OAuthLoginSerializer_allows_registration_if_sign_up_type_is_invite_link(
- settings: SettingsWrapper, rf: RequestFactory, mocker: MockerFixture, db: None
+ settings: SettingsWrapper,
+ rf: RequestFactory,
+ mocker: MockerFixture,
+ db: None,
+ invite_link: InviteLink,
):
# Given
settings.ALLOW_REGISTRATION_WITHOUT_INVITE = False
@@ -140,6 +147,7 @@ def test_OAuthLoginSerializer_allows_registration_if_sign_up_type_is_invite_link
data={
"access_token": "some_token",
"sign_up_type": SignUpType.INVITE_LINK.value,
+ "invite_hash": invite_link.hash,
},
context={"request": request},
)
@@ -153,3 +161,38 @@ def test_OAuthLoginSerializer_allows_registration_if_sign_up_type_is_invite_link
# Then
assert user
+
+
+@pytest.mark.parametrize(
+ "serializer_class", (GithubLoginSerializer, GithubLoginSerializer)
+)
+def test_OAuthLoginSerializer_allows_login_if_allow_registration_without_invite_is_false(
+ settings: SettingsWrapper,
+ rf: RequestFactory,
+ mocker: MockerFixture,
+ admin_user: FFAdminUser,
+ serializer_class: Type[OAuthLoginSerializer],
+):
+ # Given
+ settings.ALLOW_REGISTRATION_WITHOUT_INVITE = False
+
+ request = rf.post("/api/v1/auth/users/")
+
+ serializer = serializer_class(
+ data={"access_token": "some_token"},
+ context={"request": request},
+ )
+ # Monkey-patch the get_user_info method to return the mocked user data
+ serializer.get_user_info = lambda: {
+ "email": admin_user.email,
+ "github_user_id": "abc123",
+ "google_user_id": "abc123",
+ }
+
+ serializer.is_valid(raise_exception=True)
+
+ # When
+ user = serializer.save()
+
+ # Then
+ assert user
diff --git a/api/tests/unit/custom_auth/oauth/test_unit_oauth_views.py b/api/tests/unit/custom_auth/oauth/test_unit_oauth_views.py
index 0f742267b71b..99a451bab4eb 100644
--- a/api/tests/unit/custom_auth/oauth/test_unit_oauth_views.py
+++ b/api/tests/unit/custom_auth/oauth/test_unit_oauth_views.py
@@ -9,6 +9,7 @@
from organisations.invites.models import Invite
from organisations.models import Organisation
+from users.models import SignUpType
@mock.patch("custom_auth.oauth.serializers.get_user_info")
@@ -66,7 +67,13 @@ def test_can_register_with_google_with_invite_if_registration_disabled(
Invite.objects.create(organisation=organisation, email=email)
# When
- response = client.post(url, data={"access_token": "some-token"})
+ response = client.post(
+ url,
+ data={
+ "access_token": "some-token",
+ "sign_up_type": SignUpType.INVITE_EMAIL.value,
+ },
+ )
# Then
assert response.status_code == status.HTTP_200_OK
@@ -89,7 +96,13 @@ def test_can_register_with_github_with_invite_if_registration_disabled(
Invite.objects.create(organisation=organisation, email=email)
# When
- response = client.post(url, data={"access_token": "some-token"})
+ response = client.post(
+ url,
+ data={
+ "access_token": "some-token",
+ "sign_up_type": SignUpType.INVITE_EMAIL.value,
+ },
+ )
# Then
assert response.status_code == status.HTTP_200_OK
diff --git a/api/tests/unit/custom_auth/test_unit_custom_auth_serializer.py b/api/tests/unit/custom_auth/test_unit_custom_auth_serializer.py
index 00f099e1ace6..010a861f30ab 100644
--- a/api/tests/unit/custom_auth/test_unit_custom_auth_serializer.py
+++ b/api/tests/unit/custom_auth/test_unit_custom_auth_serializer.py
@@ -1,7 +1,13 @@
+import pytest
from django.test import RequestFactory
from pytest_django.fixtures import SettingsWrapper
+from rest_framework.exceptions import PermissionDenied
+from custom_auth.constants import (
+ USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE,
+)
from custom_auth.serializers import CustomUserCreateSerializer
+from organisations.invites.models import InviteLink
from users.models import FFAdminUser, SignUpType
user_dict = {
@@ -70,6 +76,7 @@ def test_CustomUserCreateSerializer_calls_is_authentication_method_valid_correct
def test_CustomUserCreateSerializer_allows_registration_if_sign_up_type_is_invite_link(
+ invite_link: InviteLink,
db: None,
settings: SettingsWrapper,
rf: RequestFactory,
@@ -80,6 +87,7 @@ def test_CustomUserCreateSerializer_allows_registration_if_sign_up_type_is_invit
data = {
**user_dict,
"sign_up_type": SignUpType.INVITE_LINK.value,
+ "invite_hash": invite_link.hash,
}
serializer = CustomUserCreateSerializer(
@@ -92,3 +100,48 @@ def test_CustomUserCreateSerializer_allows_registration_if_sign_up_type_is_invit
# Then
assert user
+
+
+def test_invite_link_validation_mixin_validate_fails_if_invite_link_hash_not_provided(
+ settings: SettingsWrapper,
+ db: None,
+) -> None:
+ # Given
+ settings.ALLOW_REGISTRATION_WITHOUT_INVITE = False
+
+ serializer = CustomUserCreateSerializer(
+ data={
+ **user_dict,
+ "sign_up_type": SignUpType.INVITE_LINK.value,
+ }
+ )
+
+ # When
+ with pytest.raises(PermissionDenied) as exc_info:
+ serializer.is_valid(raise_exception=True)
+
+ # Then
+ assert exc_info.value.detail == USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE
+
+
+def test_invite_link_validation_mixin_validate_fails_if_invite_link_hash_not_valid(
+ invite_link: InviteLink,
+ settings: SettingsWrapper,
+) -> None:
+ # Given
+ settings.ALLOW_REGISTRATION_WITHOUT_INVITE = False
+
+ serializer = CustomUserCreateSerializer(
+ data={
+ **user_dict,
+ "sign_up_type": SignUpType.INVITE_LINK.value,
+ "invite_hash": "invalid-hash",
+ }
+ )
+
+ # When
+ with pytest.raises(PermissionDenied) as exc_info:
+ serializer.is_valid(raise_exception=True)
+
+ # Then
+ assert exc_info.value.detail == USER_REGISTRATION_WITHOUT_INVITE_ERROR_MESSAGE
diff --git a/api/tests/unit/environments/identities/test_unit_identities_views.py b/api/tests/unit/environments/identities/test_unit_identities_views.py
index bf9e030058d0..b1e31cdb50b2 100644
--- a/api/tests/unit/environments/identities/test_unit_identities_views.py
+++ b/api/tests/unit/environments/identities/test_unit_identities_views.py
@@ -1,7 +1,9 @@
import json
import urllib
+from typing import Any
from unittest import mock
+import pytest
from core.constants import FLAGSMITH_UPDATED_AT_HEADER, STRING
from django.test import override_settings
from django.urls import reverse
@@ -1143,20 +1145,31 @@ def test_post_identities__server_key_only_feature__server_key_auth__return_expec
assert response.json()["flags"]
+@pytest.mark.parametrize(
+ "identity_data",
+ [
+ pytest.param(
+ {"identifier": "transient", "transient": True},
+ id="new-identifier-transient-true",
+ ),
+ pytest.param({"identifier": ""}, id="blank-identifier"),
+ pytest.param({"identifier": None}, id="null-identifier"),
+ pytest.param({}, id="missing_identifier"),
+ ],
+)
def test_post_identities__transient__no_persistence(
environment: Environment,
api_client: APIClient,
+ identity_data: dict[str, Any],
) -> None:
# Given
- identifier = "transient"
trait_key = "trait_key"
api_client.credentials(HTTP_X_ENVIRONMENT_KEY=environment.api_key)
url = reverse("api-v1:sdk-identities")
data = {
- "identifier": identifier,
+ **identity_data,
"traits": [{"trait_key": trait_key, "trait_value": "bar"}],
- "transient": True,
}
# When
@@ -1166,10 +1179,74 @@ def test_post_identities__transient__no_persistence(
# Then
assert response.status_code == status.HTTP_200_OK
- assert not Identity.objects.filter(identifier=identifier).exists()
+ assert not Identity.objects.exists()
assert not Trait.objects.filter(trait_key=trait_key).exists()
+@pytest.mark.parametrize(
+ "trait_transiency_data",
+ [
+ pytest.param({"transient": True}, id="trait-transient-true"),
+ pytest.param({"transient": False}, id="trait-transient-false"),
+ pytest.param({}, id="trait-default"),
+ ],
+)
+def test_post_identities__existing__transient__no_persistence(
+ environment: Environment,
+ identity: Identity,
+ trait: Trait,
+ identity_featurestate: FeatureState,
+ api_client: APIClient,
+ trait_transiency_data: dict[str, Any],
+) -> None:
+ # Given
+ feature_state_value = "identity override"
+ identity_featurestate.feature_state_value.string_value = feature_state_value
+ identity_featurestate.feature_state_value.save()
+
+ trait_key = "trait_key"
+
+ api_client.credentials(HTTP_X_ENVIRONMENT_KEY=environment.api_key)
+ url = reverse("api-v1:sdk-identities")
+ data = {
+ "identifier": identity.identifier,
+ "transient": True,
+ "traits": [
+ {"trait_key": trait_key, "trait_value": "bar", **trait_transiency_data}
+ ],
+ }
+
+ # When
+ response = api_client.post(
+ url, data=json.dumps(data), content_type="application/json"
+ )
+
+ # Then
+ assert response.status_code == status.HTTP_200_OK
+ response_json = response.json()
+
+ # identity overrides are correctly loaded
+ assert response_json["flags"][0]["feature_state_value"] == feature_state_value
+
+ # previously persisted traits not provided in the request
+ # are not marked as transient in the response
+ assert response_json["traits"][0]["trait_key"] == trait.trait_key
+ assert not response_json["traits"][0].get("transient")
+
+ # every trait provided in the request for a transient identity
+ # is marked as transient
+ assert response_json["traits"][1]["trait_key"] == trait_key
+ assert response_json["traits"][1]["transient"]
+
+ assert (
+ persisted_trait := Trait.objects.filter(
+ identity=identity, trait_key=trait.trait_key
+ ).first()
+ )
+ assert persisted_trait.trait_value == trait.trait_value
+ assert not Trait.objects.filter(identity=identity, trait_key=trait_key).exists()
+
+
def test_post_identities__transient_traits__no_persistence(
environment: Environment,
api_client: APIClient,
diff --git a/api/tests/unit/environments/test_unit_environments_views.py b/api/tests/unit/environments/test_unit_environments_views.py
index 90760acec236..d5fec31fe8de 100644
--- a/api/tests/unit/environments/test_unit_environments_views.py
+++ b/api/tests/unit/environments/test_unit_environments_views.py
@@ -569,7 +569,7 @@ def test_view_environment_with_staff__query_count_is_expected(
url = reverse("api-v1:environments:environment-list")
data = {"project": project.id}
- expected_query_count = 7
+ expected_query_count = 9
# When
with django_assert_num_queries(expected_query_count):
response = staff_client.get(url, data=data, content_type="application/json")
@@ -766,11 +766,13 @@ def test_audit_log_entry_created_when_environment_updated(
def test_get_document(
environment: Environment,
project: Project,
- admin_client_new: APIClient,
+ staff_client: APIClient,
feature: Feature,
segment: Segment,
+ with_environment_permissions: WithEnvironmentPermissionsCallable,
) -> None:
# Given
+ with_environment_permissions([VIEW_ENVIRONMENT])
# and some sample data to make sure we're testing all of the document
segment_rule = SegmentRule.objects.create(
@@ -786,13 +788,28 @@ def test_get_document(
)
# When
- response = admin_client_new.get(url)
+ response = staff_client.get(url)
# Then
assert response.status_code == status.HTTP_200_OK
assert response.json()
+def test_cannot_get_environment_document_without_permission(
+ staff_client: APIClient, environment: Environment
+) -> None:
+ # Given
+ url = reverse(
+ "api-v1:environments:environment-get-document", args=[environment.api_key]
+ )
+
+ # When
+ response = staff_client.get(url)
+
+ # Then
+ assert response.status_code == status.HTTP_403_FORBIDDEN
+
+
def test_get_all_trait_keys_for_environment_only_returns_distinct_keys(
identity: Identity,
admin_client_new: APIClient,
diff --git a/api/tests/unit/features/test_unit_feature_external_resources_views.py b/api/tests/unit/features/test_unit_feature_external_resources_views.py
index 901454b6add6..d69b05766c0e 100644
--- a/api/tests/unit/features/test_unit_feature_external_resources_views.py
+++ b/api/tests/unit/features/test_unit_feature_external_resources_views.py
@@ -75,17 +75,15 @@ def test_create_feature_external_resource(
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
post_request_mock: MagicMock,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
- mocker.patch(
- "integrations.github.client.generate_token",
- return_value="mocked_token",
+ repository_owner_name = (
+ f"{github_repository.repository_owner}/{github_repository.repository_name}"
)
-
feature_external_resource_data = {
"type": "GITHUB_ISSUE",
- "url": "https://github.com/repoowner/repo-name/issues/35",
+ "url": f"https://github.com/{repository_owner_name}/issues/35",
"feature": feature_with_value.id,
"metadata": {"state": "open"},
}
@@ -130,7 +128,7 @@ def test_create_feature_external_resource(
)
)
post_request_mock.assert_called_with(
- "https://api.github.com/repos/repoowner/repo-name/issues/35/comments",
+ f"https://api.github.com/repos/{repository_owner_name}/issues/35/comments",
json={"body": f"{expected_comment_body}"},
headers={
"Accept": "application/vnd.github.v3+json",
@@ -157,7 +155,7 @@ def test_create_feature_external_resource(
# And When
responses.add(
method="GET",
- url=f"{GITHUB_API_URL}repos/repoowner/repo-name/issues/35",
+ url=f"{GITHUB_API_URL}repos/{repository_owner_name}/issues/35",
status=200,
json={"title": "resource name", "state": "open"},
)
@@ -183,6 +181,66 @@ def test_create_feature_external_resource(
)
+def test_cannot_create_feature_external_resource_with_an_invalid_gh_url(
+ admin_client_new: APIClient,
+ feature: Feature,
+ project: Project,
+ github_configuration: GithubConfiguration,
+ github_repository: GithubRepository,
+) -> None:
+ # Given
+ feature_external_resource_data = {
+ "type": "GITHUB_ISSUE",
+ "url": "https://github.com/repoowner/repo-name/pull/1",
+ "feature": feature.id,
+ "metadata": {"state": "open"},
+ }
+
+ url = reverse(
+ "api-v1:projects:feature-external-resources-list",
+ kwargs={"project_pk": project.id, "feature_pk": feature.id},
+ )
+
+ # When
+ response = admin_client_new.post(
+ url, data=feature_external_resource_data, format="json"
+ )
+
+ # Then
+ assert response.status_code == status.HTTP_400_BAD_REQUEST
+ assert response.json()["detail"] == "Invalid GitHub Issue/PR URL"
+
+
+def test_cannot_create_feature_external_resource_with_an_incorrect_gh_type(
+ admin_client_new: APIClient,
+ feature: Feature,
+ project: Project,
+ github_configuration: GithubConfiguration,
+ github_repository: GithubRepository,
+) -> None:
+ # Given
+ feature_external_resource_data = {
+ "type": "GITHUB_INCORRECT_TYPE",
+ "url": "https://github.com/repoowner/repo-name/pull/1",
+ "feature": feature.id,
+ "metadata": {"state": "open"},
+ }
+
+ url = reverse(
+ "api-v1:projects:feature-external-resources-list",
+ kwargs={"project_pk": project.id, "feature_pk": feature.id},
+ )
+
+ # When
+ response = admin_client_new.post(
+ url, data=feature_external_resource_data, format="json"
+ )
+
+ # Then
+ assert response.status_code == status.HTTP_400_BAD_REQUEST
+ assert response.json()["detail"] == "Incorrect GitHub type"
+
+
def test_cannot_create_feature_external_resource_when_doesnt_have_a_valid_github_integration(
admin_client_new: APIClient,
feature: Feature,
@@ -297,11 +355,11 @@ def test_update_feature_external_resource(
"integrations.github.client.generate_token",
)
mock_generate_token.return_value = "mocked_token"
- mock_generate_token.return_value = "mocked_token"
feature_external_resource_data = {
"type": "GITHUB_ISSUE",
"url": "https://github.com/userexample/example-project-repo/issues/12",
"feature": feature.id,
+ "metadata": '{"state": "open"}',
}
url = reverse(
"api-v1:projects:feature-external-resources-detail",
@@ -338,7 +396,7 @@ def test_delete_feature_external_resource(
post_request_mock.assert_called_with(
"https://api.github.com/repos/repositoryownertest/repositorynametest/issues/11/comments",
json={
- "body": "### The feature flag `Test Feature1` was unlinked from the issue/PR"
+ "body": "**The feature flag `Test Feature1` was unlinked from the issue/PR**"
},
headers={
"Accept": "application/vnd.github.v3+json",
@@ -361,12 +419,9 @@ def test_get_feature_external_resources(
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
feature_external_resource: FeatureExternalResource,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
- mocker.patch(
- "integrations.github.client.generate_token",
- )
url = reverse(
"api-v1:projects:feature-external-resources-list",
kwargs={"project_pk": project.id, "feature_pk": feature.id},
@@ -446,7 +501,7 @@ def test_create_github_comment_on_feature_state_updated(
).updated_at.strftime(get_format("DATETIME_INPUT_FORMATS")[0])
expected_body_comment = (
- "Flagsmith Feature `Test Feature1` has been updated:\n"
+ "**Flagsmith Feature `Test Feature1` has been updated:**\n"
+ expected_default_body(
project.id,
environment.api_key,
@@ -480,14 +535,9 @@ def test_create_github_comment_on_feature_was_deleted(
github_repository: GithubRepository,
feature_external_resource: FeatureExternalResource,
post_request_mock: MagicMock,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
- mocker.patch(
- "integrations.github.client.generate_token",
- return_value="mocked_token",
- )
-
url = reverse(
viewname="api-v1:projects:project-features-detail",
kwargs={"project_pk": project.id, "pk": feature.id},
@@ -501,7 +551,7 @@ def test_create_github_comment_on_feature_was_deleted(
post_request_mock.assert_called_with(
"https://api.github.com/repos/repositoryownertest/repositorynametest/issues/11/comments",
- json={"body": "### The Feature Flag `Test Feature1` was deleted"},
+ json={"body": "**The Feature Flag `Test Feature1` was deleted**"},
headers={
"Accept": "application/vnd.github.v3+json",
"X-GitHub-Api-Version": GITHUB_API_VERSION,
@@ -518,18 +568,13 @@ def test_create_github_comment_on_segment_override_updated(
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
post_request_mock: MagicMock,
- mocker: MockerFixture,
environment: Environment,
admin_client: APIClient,
feature_with_value_external_resource: FeatureExternalResource,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
feature_state = segment_override_for_feature_with_value
- mocker.patch(
- "integrations.github.client.generate_token",
- return_value="mocked_token",
- )
-
payload = dict(WritableNestedFeatureStateSerializer(instance=feature_state).data)
payload["enabled"] = not feature_state.enabled
@@ -549,7 +594,7 @@ def test_create_github_comment_on_segment_override_updated(
).updated_at.strftime(get_format("DATETIME_INPUT_FORMATS")[0])
expected_comment_body = (
- "Flagsmith Feature `feature_with_value` has been updated:\n"
+ "**Flagsmith Feature `feature_with_value` has been updated:**\n"
+ "\n"
+ expected_segment_comment_body(
project.id,
@@ -581,16 +626,11 @@ def test_create_github_comment_on_segment_override_deleted(
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
post_request_mock: MagicMock,
- mocker: MockerFixture,
admin_client_new: APIClient,
feature_with_value_external_resource: FeatureExternalResource,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
- mocker.patch(
- "integrations.github.client.generate_token",
- return_value="mocked_token",
- )
-
url = reverse(
viewname="api-v1:features:feature-segment-detail",
kwargs={"pk": feature_with_value_segment.id},
@@ -606,7 +646,7 @@ def test_create_github_comment_on_segment_override_deleted(
post_request_mock.assert_called_with(
"https://api.github.com/repos/repositoryownertest/repositorynametest/issues/11/comments",
json={
- "body": "### The Segment Override `segment` for Feature Flag `feature_with_value` was deleted"
+ "body": "**The Segment Override `segment` for Feature Flag `feature_with_value` was deleted**"
},
headers={
"Accept": "application/vnd.github.v3+json",
@@ -664,7 +704,7 @@ def test_create_github_comment_using_v2(
response_data["updated_at"], format
).strftime(get_format("DATETIME_INPUT_FORMATS")[0])
expected_comment_body = (
- "Flagsmith Feature `Test Feature1` has been updated:\n"
+ "**Flagsmith Feature `Test Feature1` has been updated:**\n"
+ "\n"
+ expected_segment_comment_body(
project.id,
@@ -745,19 +785,17 @@ def test_create_feature_external_resource_on_environment_with_v2(
segment_override_for_feature_with_value: FeatureState,
environment_v2_versioning: Environment,
post_request_mock: MagicMock,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
feature_id = segment_override_for_feature_with_value.feature_id
-
- mocker.patch(
- "integrations.github.client.generate_token",
- return_value="mocked_token",
+ repository_owner_name = (
+ f"{github_repository.repository_owner}/{github_repository.repository_name}"
)
feature_external_resource_data = {
"type": "GITHUB_ISSUE",
- "url": "https://github.com/repoowner/repo-name/issues/35",
+ "url": f"https://github.com/{repository_owner_name}/issues/35",
"feature": feature_id,
"metadata": {"state": "open"},
}
@@ -804,7 +842,7 @@ def test_create_feature_external_resource_on_environment_with_v2(
assert response.status_code == status.HTTP_201_CREATED
post_request_mock.assert_called_with(
- "https://api.github.com/repos/repoowner/repo-name/issues/35/comments",
+ f"https://api.github.com/repos/{repository_owner_name}/issues/35/comments",
json={"body": f"{expected_comment_body}"},
headers={
"Accept": "application/vnd.github.v3+json",
@@ -813,3 +851,37 @@ def test_create_feature_external_resource_on_environment_with_v2(
},
timeout=10,
)
+
+
+def test_cannot_create_feature_external_resource_for_the_same_feature_and_resource_uri(
+ admin_client_new: APIClient,
+ feature: Feature,
+ project: Project,
+ github_configuration: GithubConfiguration,
+ github_repository: GithubRepository,
+ feature_external_resource_gh_pr: FeatureExternalResource,
+) -> None:
+ # Given
+ feature_external_resource_data = {
+ "type": "GITHUB_PR",
+ "url": "https://github.com/repositoryownertest/repositorynametest/pull/1",
+ "feature": feature.id,
+ "metadata": {"state": "open"},
+ }
+
+ url = reverse(
+ "api-v1:projects:feature-external-resources-list",
+ kwargs={"project_pk": project.id, "feature_pk": feature.id},
+ )
+
+ # When
+ response = admin_client_new.post(
+ url, data=feature_external_resource_data, format="json"
+ )
+
+ # Then
+ assert response.status_code == status.HTTP_400_BAD_REQUEST
+ assert (
+ response.json()["non_field_errors"][0]
+ == "The fields feature, url must make a unique set."
+ )
diff --git a/api/tests/unit/features/test_unit_features_views.py b/api/tests/unit/features/test_unit_features_views.py
index c0496b26f497..0226d4b1a615 100644
--- a/api/tests/unit/features/test_unit_features_views.py
+++ b/api/tests/unit/features/test_unit_features_views.py
@@ -425,6 +425,7 @@ def test_put_feature_does_not_update_feature_states(
assert all(fs.enabled is False for fs in feature.feature_states.all())
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
@mock.patch("features.views.get_multiple_event_list_for_feature")
def test_get_project_features_influx_data(
mock_get_event_list: mock.MagicMock,
@@ -446,6 +447,7 @@ def test_get_project_features_influx_data(
"datetime": datetime(2021, 2, 26, 12, 0, 0, tzinfo=pytz.UTC),
}
]
+ one_day_ago = timezone.now() - timedelta(days=1)
# When
response = admin_client_new.get(url)
@@ -455,11 +457,70 @@ def test_get_project_features_influx_data(
mock_get_event_list.assert_called_once_with(
feature_name=feature.name,
environment_id=str(environment.id), # provided as a GET param
- date_start="-24h", # this is the default but can be provided as a GET param
+ date_start=one_day_ago, # this is the default but can be provided as a GET param
aggregate_every="24h", # this is the default but can be provided as a GET param
)
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
+@mock.patch("features.views.get_multiple_event_list_for_feature")
+def test_get_project_features_influx_data_with_two_weeks_period(
+ mock_get_event_list: mock.MagicMock,
+ feature: Feature,
+ project: Project,
+ environment: Environment,
+ admin_client_new: APIClient,
+) -> None:
+ # Given
+ base_url = reverse(
+ "api-v1:projects:project-features-get-influx-data",
+ args=[project.id, feature.id],
+ )
+ url = f"{base_url}?environment_id={environment.id}&period=14d"
+ date_start = timezone.now() - timedelta(days=14)
+
+ mock_get_event_list.return_value = [
+ {
+ feature.name: 1,
+ "datetime": datetime(2021, 2, 26, 12, 0, 0, tzinfo=pytz.UTC),
+ }
+ ]
+
+ # When
+ response = admin_client_new.get(url)
+
+ # Then
+ assert response.status_code == status.HTTP_200_OK
+ mock_get_event_list.assert_called_once_with(
+ feature_name=feature.name,
+ environment_id=str(environment.id),
+ date_start=date_start,
+ aggregate_every="24h",
+ )
+
+
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
+def test_get_project_features_influx_data_with_malformed_period(
+ feature: Feature,
+ project: Project,
+ environment: Environment,
+ admin_client_new: APIClient,
+) -> None:
+ # Given
+ base_url = reverse(
+ "api-v1:projects:project-features-get-influx-data",
+ args=[project.id, feature.id],
+ )
+ url = f"{base_url}?environment_id={environment.id}&period=baddata"
+
+ # When
+ response = admin_client_new.get(url)
+
+ # Then
+ assert response.status_code == status.HTTP_400_BAD_REQUEST
+ assert response.data[0] == "Malformed period supplied"
+
+
def test_regular_user_cannot_create_mv_options_when_creating_feature(
staff_client: APIClient,
with_project_permissions: WithProjectPermissionsCallable,
diff --git a/api/tests/unit/integrations/github/test_unit_github_views.py b/api/tests/unit/integrations/github/test_unit_github_views.py
index 07d3e9660f14..a1d9add23f36 100644
--- a/api/tests/unit/integrations/github/test_unit_github_views.py
+++ b/api/tests/unit/integrations/github/test_unit_github_views.py
@@ -1,10 +1,10 @@
import json
from typing import Any
+from unittest.mock import MagicMock
import pytest
import requests
import responses
-from django.conf import settings
from django.urls import reverse
from pytest_lazyfixture import lazy_fixture
from pytest_mock import MockerFixture
@@ -12,7 +12,9 @@
from rest_framework.response import Response
from rest_framework.test import APIClient
+from environments.models import Environment
from features.feature_external_resources.models import FeatureExternalResource
+from features.models import Feature
from integrations.github.constants import GITHUB_API_URL
from integrations.github.models import GithubConfiguration, GithubRepository
from integrations.github.views import (
@@ -29,6 +31,17 @@
WEBHOOK_PAYLOAD_WITHOUT_INSTALLATION_ID = json.dumps(
{"installation": {"test": 765432}, "action": "deleted"}
)
+WEBHOOK_PAYLOAD_MERGED = json.dumps(
+ {
+ "pull_request": {
+ "id": 1234567,
+ "html_url": "https://github.com/repositoryownertest/repositorynametest/issues/11",
+ "merged": True,
+ },
+ "action": "closed",
+ }
+)
+
WEBHOOK_SIGNATURE = "sha1=57a1426e19cdab55dd6d0c191743e2958e50ccaa"
WEBHOOK_SIGNATURE_WITH_AN_INVALID_INSTALLATION_ID = (
"sha1=081eef49d04df27552587d5df1c6b76e0fe20d21"
@@ -36,6 +49,7 @@
WEBHOOK_SIGNATURE_WITHOUT_INSTALLATION_ID = (
"sha1=f99796bd3cebb902864e87ed960c5cca8772ff67"
)
+WEBHOOK_MERGED_ACTION_SIGNATURE = "sha1=712ec7a5db14aad99d900da40738ebb9508ecad2"
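+# Each signature above is the HMAC-SHA1 hex digest of its payload, keyed with
+# WEBHOOK_SECRET (GitHub's X-Hub-Signature scheme).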
WEBHOOK_SECRET = "secret-key"
@@ -246,11 +260,14 @@ def test_cannot_get_github_repository_when_github_pk_in_not_a_number(
assert response.json() == {"github_pk": ["Must be an integer"]}
+@responses.activate
def test_create_github_repository(
admin_client_new: APIClient,
organisation: Organisation,
github_configuration: GithubConfiguration,
project: Project,
+ mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
data = {
@@ -258,8 +275,16 @@ def test_create_github_repository(
"repository_owner": "repositoryowner",
"repository_name": "repositoryname",
"project": project.id,
+ "tagging_enabled": True,
}
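+ # Creating a repository with tagging enabled provisions labels via the
+ # GitHub API, so the labels endpoint is stubbed here.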
+ responses.add(
+ method="POST",
+ url=f"{GITHUB_API_URL}repos/repositoryowner/repositoryname/labels",
+ status=status.HTTP_200_OK,
+ json={},
+ )
+
url = reverse(
"api-v1:organisations:repositories-list",
args=[organisation.id, github_configuration.id],
@@ -272,6 +297,53 @@ def test_create_github_repository(
assert GithubRepository.objects.filter(repository_owner="repositoryowner").exists()
+@responses.activate
+def test_create_github_repository_and_label_already_exists(
+ admin_client_new: APIClient,
+ organisation: Organisation,
+ github_configuration: GithubConfiguration,
+ project: Project,
+ mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
+) -> None:
+ # Given
+ mocker_logger = mocker.patch("integrations.github.client.logger")
+
+ data = {
+ "github_configuration": github_configuration.id,
+ "repository_owner": "repositoryowner",
+ "repository_name": "repositoryname",
+ "project": project.id,
+ "tagging_enabled": True,
+ }
+
+ mock_response = {
+ "message": "Validation Failed",
+ "errors": [{"resource": "Label", "code": "already_exists", "field": "name"}],
+ "documentation_url": "https://docs.github.com/rest/issues/labels#create-a-label",
+ "status": "422",
+ }
+
+ responses.add(
+ method="POST",
+ url=f"{GITHUB_API_URL}repos/repositoryowner/repositoryname/labels",
+ status=status.HTTP_422_UNPROCESSABLE_ENTITY,
+ json=mock_response,
+ )
+
+ url = reverse(
+ "api-v1:organisations:repositories-list",
+ args=[organisation.id, github_configuration.id],
+ )
+ # When
+ response = admin_client_new.post(url, data)
+
+ # Then
+ mocker_logger.warning.assert_called_once_with("Label already exists")
+ assert response.status_code == status.HTTP_201_CREATED
+ assert GithubRepository.objects.filter(repository_owner="repositoryowner").exists()
+
+
def test_cannot_create_github_repository_when_does_not_have_permissions(
test_user_client: APIClient,
organisation: Organisation,
@@ -334,13 +406,9 @@ def test_github_delete_repository(
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
feature_external_resource: FeatureExternalResource,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
url = reverse(
"api-v1:organisations:repositories-detail",
args=[organisation.id, github_configuration.id, github_repository.id],
@@ -409,14 +477,10 @@ def test_fetch_pull_requests(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
+ mock_github_client_generate_token: MagicMock,
mocker: MockerFixture,
) -> None:
-
# Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
github_request_mock = mocker.patch(
"requests.get", side_effect=mocked_requests_get_issues_and_pull_requests
)
@@ -448,13 +512,10 @@ def test_fetch_issues(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
+ mock_github_client_generate_token: MagicMock,
mocker: MockerFixture,
) -> None:
# Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
github_request_mock = mocker.patch(
"requests.get", side_effect=mocked_requests_get_issues_and_pull_requests
)
@@ -491,13 +552,10 @@ def test_fetch_issues_returns_error_on_bad_response_from_github(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
+ mock_github_client_generate_token: MagicMock,
mocker: MockerFixture,
) -> None:
# Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
mocker.patch("requests.get", side_effect=mocked_requests_get_error)
url = reverse("api-v1:organisations:get-github-issues", args=[organisation.id])
data = {"repo_owner": "owner", "repo_name": "repo"}
@@ -519,13 +577,9 @@ def test_fetch_repositories(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
responses.add(
method="GET",
url=f"{GITHUB_API_URL}installation/repositories",
@@ -573,13 +627,11 @@ def test_fetch_repositories(
],
)
def test_fetch_issues_and_pull_requests_fails_with_status_400_when_integration_not_configured(
- client: APIClient, organisation: Organisation, reverse_url: str, mocker
+ client: APIClient,
+ organisation: Organisation,
+ reverse_url: str,
+ mock_github_client_generate_token: MagicMock,
) -> None:
- # Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.generate_token.return_value = "mocked_token"
# When
url = reverse(reverse_url, args=[organisation.id])
response = client.get(url)
@@ -600,15 +652,9 @@ def test_cannot_fetch_issues_or_prs_when_does_not_have_permissions(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
- mocker,
+ mock_github_client_generate_token: MagicMock,
reverse_url: str,
) -> None:
- # Given
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.generate_token.return_value = "mocked_token"
-
# When
url = reverse(reverse_url, args=[organisation.id])
response = test_user_client.get(url)
@@ -656,9 +702,9 @@ def test_verify_github_webhook_payload_returns_false_on_no_signature_header() ->
def test_github_webhook_delete_installation(
api_client: APIClient,
github_configuration: GithubConfiguration,
+ set_github_webhook_secret,
) -> None:
# Given
- settings.GITHUB_WEBHOOK_SECRET = WEBHOOK_SECRET
url = reverse("api-v1:github-webhook")
# When
@@ -675,63 +721,88 @@ def test_github_webhook_delete_installation(
assert not GithubConfiguration.objects.filter(installation_id=1234567).exists()
-def test_github_webhook_with_non_existing_installation(
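+# A merged pull_request webhook should trigger the integration's automatic
+# tagging, labelling the linked feature with "PR Merged".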
+def test_github_webhook_merged_a_pull_request(
api_client: APIClient,
+ feature: Feature,
github_configuration: GithubConfiguration,
+ github_repository: GithubRepository,
+ feature_external_resource: FeatureExternalResource,
+ set_github_webhook_secret,
+) -> None:
+ # Given
+ url = reverse("api-v1:github-webhook")
+
+ # When
+ response = api_client.post(
+ path=url,
+ data=WEBHOOK_PAYLOAD_MERGED,
+ content_type="application/json",
+ HTTP_X_HUB_SIGNATURE=WEBHOOK_MERGED_ACTION_SIGNATURE,
+ HTTP_X_GITHUB_EVENT="pull_request",
+ )
+
+ # Then
+ feature.refresh_from_db()
+ assert response.status_code == status.HTTP_200_OK
+ assert feature.tags.first().label == "PR Merged"
+
+
+def test_github_webhook_without_installation_id(
+ api_client: APIClient,
mocker: MockerFixture,
+ set_github_webhook_secret,
) -> None:
# Given
- settings.GITHUB_WEBHOOK_SECRET = WEBHOOK_SECRET
url = reverse("api-v1:github-webhook")
mocker_logger = mocker.patch("integrations.github.github.logger")
# When
response = api_client.post(
path=url,
- data=WEBHOOK_PAYLOAD_WITH_AN_INVALID_INSTALLATION_ID,
+ data=WEBHOOK_PAYLOAD_WITHOUT_INSTALLATION_ID,
content_type="application/json",
- HTTP_X_HUB_SIGNATURE=WEBHOOK_SIGNATURE_WITH_AN_INVALID_INSTALLATION_ID,
+ HTTP_X_HUB_SIGNATURE=WEBHOOK_SIGNATURE_WITHOUT_INSTALLATION_ID,
HTTP_X_GITHUB_EVENT="installation",
)
# Then
mocker_logger.error.assert_called_once_with(
- "GitHub Configuration with installation_id 765432 does not exist"
+ "The installation_id is not present in the payload: {'installation': {'test': 765432}, 'action': 'deleted'}"
)
assert response.status_code == status.HTTP_200_OK
-def test_github_webhook_without_installation_id(
+def test_github_webhook_with_non_existing_installation(
api_client: APIClient,
github_configuration: GithubConfiguration,
mocker: MockerFixture,
+ set_github_webhook_secret,
) -> None:
# Given
- settings.GITHUB_WEBHOOK_SECRET = WEBHOOK_SECRET
url = reverse("api-v1:github-webhook")
mocker_logger = mocker.patch("integrations.github.github.logger")
# When
response = api_client.post(
path=url,
- data=WEBHOOK_PAYLOAD_WITHOUT_INSTALLATION_ID,
+ data=WEBHOOK_PAYLOAD_WITH_AN_INVALID_INSTALLATION_ID,
content_type="application/json",
- HTTP_X_HUB_SIGNATURE=WEBHOOK_SIGNATURE_WITHOUT_INSTALLATION_ID,
+ HTTP_X_HUB_SIGNATURE=WEBHOOK_SIGNATURE_WITH_AN_INVALID_INSTALLATION_ID,
HTTP_X_GITHUB_EVENT="installation",
)
# Then
mocker_logger.error.assert_called_once_with(
- "The installation_id is not present in the payload: {'installation': {'test': 765432}, 'action': 'deleted'}"
+ "GitHub Configuration with installation_id 765432 does not exist"
)
assert response.status_code == status.HTTP_200_OK
def test_github_webhook_fails_on_signature_header_missing(
github_configuration: GithubConfiguration,
+ set_github_webhook_secret,
) -> None:
# Given
- settings.GITHUB_WEBHOOK_SECRET = WEBHOOK_SECRET
url = reverse("api-v1:github-webhook")
# When
@@ -751,9 +822,9 @@ def test_github_webhook_fails_on_signature_header_missing(
def test_github_webhook_fails_on_bad_signature_header_missing(
github_configuration: GithubConfiguration,
+ set_github_webhook_secret,
) -> None:
# Given
- settings.GITHUB_WEBHOOK_SECRET = WEBHOOK_SECRET
url = reverse("api-v1:github-webhook")
# When
@@ -774,9 +845,9 @@ def test_github_webhook_fails_on_bad_signature_header_missing(
def test_github_webhook_bypass_event(
github_configuration: GithubConfiguration,
+ set_github_webhook_secret,
) -> None:
# Given
- settings.GITHUB_WEBHOOK_SECRET = WEBHOOK_SECRET
url = reverse("api-v1:github-webhook")
# When
@@ -800,15 +871,10 @@ def test_cannot_fetch_pull_requests_when_github_request_call_failed(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
- mocker,
+ mock_github_client_generate_token: MagicMock,
) -> None:
-
# Given
data = {"repo_owner": "owner", "repo_name": "repo"}
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
responses.add(
method="GET",
url=f"{GITHUB_API_URL}repos/{data['repo_owner']}/{data['repo_name']}/pulls",
@@ -833,14 +899,10 @@ def test_cannot_fetch_pulls_when_the_github_response_was_invalid(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
- mocker,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
data = {"repo_owner": "owner", "repo_name": "repo"}
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
responses.add(
method="GET",
url=f"{GITHUB_API_URL}repos/{data['repo_owner']}/{data['repo_name']}/pulls",
@@ -877,7 +939,7 @@ def test_fetch_github_repo_contributors(
organisation: Organisation,
github_configuration: GithubConfiguration,
github_repository: GithubRepository,
- mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
) -> None:
# Given
url = reverse(
@@ -905,11 +967,6 @@ def test_fetch_github_repo_contributors(
expected_response = {"results": mocked_github_response}
- mock_generate_token = mocker.patch(
- "integrations.github.client.generate_token",
- )
- mock_generate_token.return_value = "mocked_token"
-
# Add response for endpoint being tested
responses.add(
method=responses.GET,
@@ -1063,3 +1120,85 @@ def test_send_the_invalid_type_page_or_page_size_param_returns_400(
assert response.status_code == status.HTTP_400_BAD_REQUEST
response_json = response.json()
assert response_json == error_response
+
+
+@responses.activate
+def test_label_and_tags_not_added_when_tagging_is_disabled(
+ admin_client_new: APIClient,
+ project: Project,
+ environment: Environment,
+ github_repository: GithubRepository,
+ feature_with_value: Feature,
+ mock_github_client_generate_token: MagicMock,
+ post_request_mock: MagicMock,
+) -> None:
+ # Given
+ github_repository.tagging_enabled = False
+ github_repository.save()
+ repository_owner_name = (
+ f"{github_repository.repository_owner}/{github_repository.repository_name}"
+ )
+
+ feature_external_resource_data = {
+ "type": "GITHUB_ISSUE",
+ "url": f"https://github.com/{repository_owner_name}/issues/35",
+ "feature": feature_with_value.id,
+ "metadata": {"state": "open"},
+ }
+
+ url = reverse(
+ "api-v1:projects:feature-external-resources-list",
+ kwargs={"project_pk": project.id, "feature_pk": feature_with_value.id},
+ )
+
+ # When
+ response = admin_client_new.post(
+ url, data=feature_external_resource_data, format="json"
+ )
+
+ # Then
+ assert response.status_code == status.HTTP_201_CREATED
+ assert feature_with_value.tags.count() == 0
+
+
+@responses.activate
+def test_update_github_repository(
+ admin_client_new: APIClient,
+ organisation: Organisation,
+ github_configuration: GithubConfiguration,
+ github_repository: GithubRepository,
+ project: Project,
+ mocker: MockerFixture,
+ mock_github_client_generate_token: MagicMock,
+) -> None:
+ # Given
+ github_repository.tagging_enabled = False
+ github_repository.save()
+ data = {
+ "github_configuration": github_configuration.id,
+ "repository_owner": "repositoryowner",
+ "repository_name": "repositoryname",
+ "project": project.id,
+ "tagging_enabled": True,
+ }
+
+ responses.add(
+ method="POST",
+ url=f"{GITHUB_API_URL}repos/repositoryowner/repositoryname/labels",
+ status=status.HTTP_200_OK,
+ json={},
+ )
+
+ url = reverse(
+ "api-v1:organisations:repositories-detail",
+ args=[organisation.id, github_configuration.id, github_repository.id],
+ )
+ # When
+ response = admin_client_new.put(url, data)
+
+ # Then
+ assert response.status_code == status.HTTP_200_OK
+ assert GithubRepository.objects.filter(repository_owner="repositoryowner").exists()
+ assert GithubRepository.objects.get(
+ repository_owner="repositoryowner"
+ ).tagging_enabled
diff --git a/api/tests/unit/organisations/test_unit_organisations_subscription_info_cache.py b/api/tests/unit/organisations/test_unit_organisations_subscription_info_cache.py
index bb59c1176b12..12523040d262 100644
--- a/api/tests/unit/organisations/test_unit_organisations_subscription_info_cache.py
+++ b/api/tests/unit/organisations/test_unit_organisations_subscription_info_cache.py
@@ -1,3 +1,7 @@
+from datetime import timedelta
+
+import pytest
+from django.utils import timezone
from task_processor.task_run_method import TaskRunMethod
from organisations.chargebee.metadata import ChargebeeObjMetadata
@@ -5,18 +9,27 @@
from organisations.subscriptions.constants import SubscriptionCacheEntity
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_update_caches(mocker, organisation, chargebee_subscription, settings):
# Given
settings.CHARGEBEE_API_KEY = "api-key"
settings.INFLUXDB_TOKEN = "token"
settings.TASK_RUN_METHOD = TaskRunMethod.SYNCHRONOUSLY
- organisation_usage = {"24h": 25123, "7d": 182957, "30d": 804564}
+ now = timezone.now()
+ day_1 = now - timedelta(days=1)
+ day_7 = now - timedelta(days=7)
+ day_30 = now - timedelta(days=30)
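+ # Usage is now keyed by range-start datetimes rather than Influx duration strings.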
+ organisation_usage = {
+ day_1: 25123,
+ day_7: 182957,
+ day_30: 804564,
+ }
mocked_get_top_organisations = mocker.patch(
"organisations.subscription_info_cache.get_top_organisations"
)
mocked_get_top_organisations.side_effect = lambda t, _: {
- organisation.id: organisation_usage.get(f"{t[1:]}")
+ organisation.id: organisation_usage[t]
}
chargebee_metadata = ChargebeeObjMetadata(seats=15, api_calls=1000000)
@@ -35,15 +48,15 @@ def test_update_caches(mocker, organisation, chargebee_subscription, settings):
# Then
assert (
organisation.subscription_information_cache.api_calls_24h
- == organisation_usage["24h"]
+ == organisation_usage[day_1]
)
assert (
organisation.subscription_information_cache.api_calls_7d
- == organisation_usage["7d"]
+ == organisation_usage[day_7]
)
assert (
organisation.subscription_information_cache.api_calls_30d
- == organisation_usage["30d"]
+ == organisation_usage[day_30]
)
assert (
organisation.subscription_information_cache.allowed_seats
@@ -60,7 +73,7 @@ def test_update_caches(mocker, organisation, chargebee_subscription, settings):
assert mocked_get_top_organisations.call_count == 3
assert [call[0] for call in mocked_get_top_organisations.call_args_list] == [
- ("-30d", ""),
- ("-7d", ""),
- ("-24h", "100"),
+ (day_30, ""),
+ (day_7, ""),
+ (day_1, "100"),
]
diff --git a/api/tests/unit/organisations/test_unit_organisations_tasks.py b/api/tests/unit/organisations/test_unit_organisations_tasks.py
index 62708193a94a..4d1a38dd9520 100644
--- a/api/tests/unit/organisations/test_unit_organisations_tasks.py
+++ b/api/tests/unit/organisations/test_unit_organisations_tasks.py
@@ -362,11 +362,28 @@ def test_handle_api_usage_notifications_below_100(
organisation=organisation,
).exists()
+    # Create an OrganisationAPIUsageNotification object for another organisation
+    # to verify that only the correct organisation's notifications are taken into
+    # account.
+ another_organisation = Organisation.objects.create(name="Another Organisation")
+ OrganisationAPIUsageNotification.objects.create(
+ organisation=another_organisation,
+ percent_usage=100,
+ notified_at=now - timedelta(days=1),
+ )
+
# When
handle_api_usage_notifications()
# Then
- mock_api_usage.assert_called_once_with(organisation.id, "-14d")
+ assert len(mock_api_usage.call_args_list) == 2
+
+ # We only care about the call for the main organisation,
+ # not the call for 'another_organisation'
+ assert mock_api_usage.call_args_list[0].args == (
+ organisation.id,
+ now - timedelta(days=14),
+ )
assert len(mailoutbox) == 1
email = mailoutbox[0]
@@ -410,7 +427,12 @@ def test_handle_api_usage_notifications_below_100(
).count()
== 1
)
- assert OrganisationAPIUsageNotification.objects.first() == api_usage_notification
+ assert (
+ OrganisationAPIUsageNotification.objects.filter(
+ organisation=organisation
+ ).first()
+ == api_usage_notification
+ )
@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
@@ -452,7 +474,7 @@ def test_handle_api_usage_notifications_below_api_usage_alert_thresholds(
handle_api_usage_notifications()
# Then
- mock_api_usage.assert_called_once_with(organisation.id, "-14d")
+ mock_api_usage.assert_called_once_with(organisation.id, now - timedelta(days=14))
assert len(mailoutbox) == 0
@@ -502,7 +524,7 @@ def test_handle_api_usage_notifications_above_100(
handle_api_usage_notifications()
# Then
- mock_api_usage.assert_called_once_with(organisation.id, "-14d")
+ mock_api_usage.assert_called_once_with(organisation.id, now - timedelta(days=14))
assert len(mailoutbox) == 1
email = mailoutbox[0]
@@ -612,6 +634,7 @@ def test_handle_api_usage_notifications_for_free_accounts(
mailoutbox: list[EmailMultiAlternatives],
) -> None:
# Given
+ now = timezone.now()
assert organisation.is_paid is False
assert organisation.subscription.is_free_plan is True
assert organisation.subscription.max_api_calls == MAX_API_CALLS_IN_FREE_PLAN
@@ -634,14 +657,18 @@ def test_handle_api_usage_notifications_for_free_accounts(
handle_api_usage_notifications()
# Then
- mock_api_usage.assert_called_once_with(organisation.id, "-30d")
+ mock_api_usage.assert_called_once_with(organisation.id, now - timedelta(days=30))
assert len(mailoutbox) == 1
email = mailoutbox[0]
assert email.subject == "Flagsmith API use has reached 100%"
assert email.body == render_to_string(
"organisations/api_usage_notification_limit.txt",
- context={"organisation": organisation, "matched_threshold": 100},
+ context={
+ "organisation": organisation,
+ "matched_threshold": 100,
+ "grace_period": True,
+ },
)
assert len(email.alternatives) == 1
@@ -650,7 +677,11 @@ def test_handle_api_usage_notifications_for_free_accounts(
assert email.alternatives[0][0] == render_to_string(
"organisations/api_usage_notification_limit.html",
- context={"organisation": organisation, "matched_threshold": 100},
+ context={
+ "organisation": organisation,
+ "matched_threshold": 100,
+ "grace_period": True,
+ },
)
assert email.from_email == "noreply@flagsmith.com"
diff --git a/api/tests/unit/organisations/test_unit_organisations_views.py b/api/tests/unit/organisations/test_unit_organisations_views.py
index de129f3c9fc1..9ce266788f35 100644
--- a/api/tests/unit/organisations/test_unit_organisations_views.py
+++ b/api/tests/unit/organisations/test_unit_organisations_views.py
@@ -351,6 +351,7 @@ def test_user_can_get_projects_for_an_organisation(
assert response.data[0]["name"] == project.name
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
@mock.patch("app_analytics.influxdb_wrapper.influxdb_client")
def test_should_get_usage_for_organisation(
mock_influxdb_client: MagicMock,
@@ -365,7 +366,7 @@ def test_should_get_usage_for_organisation(
expected_query = (
(
f'from(bucket:"{read_bucket}") '
- "|> range(start: -30d, stop: now()) "
+ "|> range(start: 2022-12-20T09:09:47.325132+00:00, stop: 2023-01-19T09:09:47.325132+00:00) "
'|> filter(fn:(r) => r._measurement == "api_call") '
'|> filter(fn: (r) => r["_field"] == "request_count") '
f'|> filter(fn: (r) => r["organisation_id"] == "{organisation.id}") '
diff --git a/api/tests/unit/projects/test_unit_projects_permissions.py b/api/tests/unit/projects/test_unit_projects_permissions.py
index 8662d58bbd5f..f7b874ab0b81 100644
--- a/api/tests/unit/projects/test_unit_projects_permissions.py
+++ b/api/tests/unit/projects/test_unit_projects_permissions.py
@@ -1,3 +1,4 @@
+import os
from unittest import mock
import pytest
@@ -58,6 +59,31 @@ def test_create_project_has_permission(
assert response is True
+def test_create_project_has_permission_with_e2e_test_auth_token(
+ staff_user: FFAdminUser,
+ organisation: Organisation,
+    with_organisation_permissions: WithOrganisationPermissionsCallable,
+    settings: SettingsWrapper,
+) -> None:
+ # Given
+ with_organisation_permissions([CREATE_PROJECT])
+ mock_request = mock.MagicMock(
+ user=staff_user, data={"name": "Test", "organisation": organisation.id}
+ )
+ token = "test-token"
+ settings.ENABLE_FE_E2E = True
+ os.environ["E2E_TEST_AUTH_TOKEN"] = token
+
+ mock_request.META = {"E2E_TEST_AUTH_TOKEN": token}
+ mock_view = mock.MagicMock(action="create", detail=False)
+ project_permissions = ProjectPermissions()
+
+ # When
+ response = project_permissions.has_permission(mock_request, mock_view)
+
+ # Then
+ assert response is True
+
+
def test_admin_can_update_project_has_permission(
organisation: Organisation,
staff_user: FFAdminUser,
diff --git a/api/tests/unit/sales_dashboard/test_unit_sales_dashboard_views.py b/api/tests/unit/sales_dashboard/test_unit_sales_dashboard_views.py
index 1ab1dc97fcf0..2cbeacd723cf 100644
--- a/api/tests/unit/sales_dashboard/test_unit_sales_dashboard_views.py
+++ b/api/tests/unit/sales_dashboard/test_unit_sales_dashboard_views.py
@@ -1,6 +1,9 @@
+from datetime import timedelta
+
import pytest
from django.test import Client, RequestFactory
from django.urls import reverse
+from django.utils import timezone
from pytest_django.fixtures import SettingsWrapper
from pytest_mock import MockerFixture
from rest_framework.test import APIClient
@@ -39,6 +42,7 @@ def test_organisation_subscription_get_api_call_overage(
assert result.overage == expected_overage
+@pytest.mark.freeze_time("2023-01-19T09:09:47.325132+00:00")
def test_get_organisation_info__get_event_list_for_organisation(
organisation: Organisation,
superuser_client: APIClient,
@@ -65,7 +69,8 @@ def test_get_organisation_info__get_event_list_for_organisation(
# Then
assert "label1" in str(response.content)
assert "label2" in str(response.content)
- event_list_mock.assert_called_once_with(organisation.id, "-180d", "now()")
+ date_start = timezone.now() - timedelta(days=180)
+ event_list_mock.assert_called_once_with(organisation.id, date_start)
def test_list_organisations_search_by_name(
diff --git a/api/tests/unit/segments/test_migrations.py b/api/tests/unit/segments/test_migrations.py
deleted file mode 100644
index f2b181a81165..000000000000
--- a/api/tests/unit/segments/test_migrations.py
+++ /dev/null
@@ -1,104 +0,0 @@
-import pytest
-from django.conf import settings as test_settings
-from django_test_migrations.migrator import Migrator
-from flag_engine.segments import constants
-from pytest_django.fixtures import SettingsWrapper
-
-
-@pytest.mark.skipif(
- test_settings.SKIP_MIGRATION_TESTS is True,
- reason="Skip migration tests to speed up tests where necessary",
-)
-def test_create_whitelisted_segments_migration(
- migrator: Migrator,
- settings: SettingsWrapper,
-) -> None:
- # Given - The migration state is at 0020 (before the migration we want to test).
- old_state = migrator.apply_initial_migration(
- ("segments", "0020_detach_segment_from_project_cascade_delete")
- )
-
- Organisation = old_state.apps.get_model("organisations", "Organisation")
- Project = old_state.apps.get_model("projects", "Project")
- SegmentRule = old_state.apps.get_model("segments", "SegmentRule")
- Segment = old_state.apps.get_model("segments", "Segment")
- Condition = old_state.apps.get_model("segments", "Condition")
-
- # Set the limit lower to allow for a faster test.
- settings.SEGMENT_RULES_CONDITIONS_LIMIT = 3
-
- # Next, create the setup data.
- organisation = Organisation.objects.create(name="Big Corp Incorporated")
- project = Project.objects.create(name="Huge Project", organisation=organisation)
-
- segment_1 = Segment.objects.create(name="Segment1", project=project)
- segment_2 = Segment.objects.create(name="Segment1", project=project)
- segment_rule_1 = SegmentRule.objects.create(
- segment=segment_1,
- type="ALL",
- )
-
- # Subnested segment rules.
- segment_rule_2 = SegmentRule.objects.create(
- rule=segment_rule_1,
- type="ALL",
- )
- segment_rule_3 = SegmentRule.objects.create(
- rule=segment_rule_1,
- type="ALL",
- )
-
- # Lonely segment rules for pass criteria for segment_2.
- segment_rule_4 = SegmentRule.objects.create(
- segment=segment_2,
- type="ALL",
- )
- segment_rule_5 = SegmentRule.objects.create(
- rule=segment_rule_4,
- type="ALL",
- )
-
- Condition.objects.create(
- operator=constants.EQUAL,
- property="age",
- value="21",
- rule=segment_rule_2,
- )
- Condition.objects.create(
- operator=constants.GREATER_THAN,
- property="height",
- value="210",
- rule=segment_rule_2,
- )
- Condition.objects.create(
- operator=constants.GREATER_THAN,
- property="waist",
- value="36",
- rule=segment_rule_3,
- )
- Condition.objects.create(
- operator=constants.LESS_THAN,
- property="shoes",
- value="12",
- rule=segment_rule_3,
- )
-
- # Sole criteria for segment_2 conditions.
- Condition.objects.create(
- operator=constants.LESS_THAN,
- property="toy_count",
- value="7",
- rule=segment_rule_5,
- )
-
- # When we run the migration.
- new_state = migrator.apply_tested_migration(
- ("segments", "0021_create_whitelisted_segments")
- )
-
- # Then the first segment is in the whitelist while the second is not.
- NewSegment = new_state.apps.get_model("segments", "Segment")
- new_segment_1 = NewSegment.objects.get(id=segment_1.id)
- new_segment_2 = NewSegment.objects.get(id=segment_2.id)
- assert new_segment_1.whitelisted_segment
- assert getattr(new_segment_2, "whitelisted_segment", None) is None
diff --git a/api/tests/unit/segments/test_unit_segments_migrations.py b/api/tests/unit/segments/test_unit_segments_migrations.py
new file mode 100644
index 000000000000..b6f85808bba5
--- /dev/null
+++ b/api/tests/unit/segments/test_unit_segments_migrations.py
@@ -0,0 +1,239 @@
+import uuid
+
+import pytest
+from django.conf import settings as test_settings
+from django_test_migrations.migrator import Migrator
+from flag_engine.segments import constants
+from pytest_django.fixtures import SettingsWrapper
+
+
+@pytest.mark.skipif(
+ test_settings.SKIP_MIGRATION_TESTS is True,
+ reason="Skip migration tests to speed up tests where necessary",
+)
+def test_create_whitelisted_segments_migration(
+ migrator: Migrator,
+ settings: SettingsWrapper,
+) -> None:
+ # Given - The migration state is at 0020 (before the migration we want to test).
+ old_state = migrator.apply_initial_migration(
+ ("segments", "0020_detach_segment_from_project_cascade_delete")
+ )
+
+ Organisation = old_state.apps.get_model("organisations", "Organisation")
+ Project = old_state.apps.get_model("projects", "Project")
+ SegmentRule = old_state.apps.get_model("segments", "SegmentRule")
+ Segment = old_state.apps.get_model("segments", "Segment")
+ Condition = old_state.apps.get_model("segments", "Condition")
+
+ # Set the limit lower to allow for a faster test.
+ settings.SEGMENT_RULES_CONDITIONS_LIMIT = 3
+
+ # Next, create the setup data.
+ organisation = Organisation.objects.create(name="Big Corp Incorporated")
+ project = Project.objects.create(name="Huge Project", organisation=organisation)
+
+ segment_1 = Segment.objects.create(name="Segment1", project=project)
+ segment_2 = Segment.objects.create(name="Segment1", project=project)
+ segment_rule_1 = SegmentRule.objects.create(
+ segment=segment_1,
+ type="ALL",
+ )
+
+ # Subnested segment rules.
+ segment_rule_2 = SegmentRule.objects.create(
+ rule=segment_rule_1,
+ type="ALL",
+ )
+ segment_rule_3 = SegmentRule.objects.create(
+ rule=segment_rule_1,
+ type="ALL",
+ )
+
+ # Lonely segment rules for pass criteria for segment_2.
+ segment_rule_4 = SegmentRule.objects.create(
+ segment=segment_2,
+ type="ALL",
+ )
+ segment_rule_5 = SegmentRule.objects.create(
+ rule=segment_rule_4,
+ type="ALL",
+ )
+
+ Condition.objects.create(
+ operator=constants.EQUAL,
+ property="age",
+ value="21",
+ rule=segment_rule_2,
+ )
+ Condition.objects.create(
+ operator=constants.GREATER_THAN,
+ property="height",
+ value="210",
+ rule=segment_rule_2,
+ )
+ Condition.objects.create(
+ operator=constants.GREATER_THAN,
+ property="waist",
+ value="36",
+ rule=segment_rule_3,
+ )
+ Condition.objects.create(
+ operator=constants.LESS_THAN,
+ property="shoes",
+ value="12",
+ rule=segment_rule_3,
+ )
+
+ # Sole criteria for segment_2 conditions.
+ Condition.objects.create(
+ operator=constants.LESS_THAN,
+ property="toy_count",
+ value="7",
+ rule=segment_rule_5,
+ )
+
+ # When we run the migration.
+ new_state = migrator.apply_tested_migration(
+ ("segments", "0021_create_whitelisted_segments")
+ )
+
+ # Then the first segment is in the whitelist while the second is not.
+ NewSegment = new_state.apps.get_model("segments", "Segment")
+ new_segment_1 = NewSegment.objects.get(id=segment_1.id)
+ new_segment_2 = NewSegment.objects.get(id=segment_2.id)
+ assert new_segment_1.whitelisted_segment
+ assert getattr(new_segment_2, "whitelisted_segment", None) is None
+
+
+@pytest.mark.skipif(
+ test_settings.SKIP_MIGRATION_TESTS is True,
+ reason="Skip migration tests to speed up tests where necessary",
+)
+def test_add_versioning_to_segments_forwards(migrator: Migrator) -> None:
+    # Given - The migration state is at 0022 (before the migration we want to test).
+ old_state = migrator.apply_initial_migration(
+ ("segments", "0022_add_soft_delete_to_segment_rules_and_conditions")
+ )
+
+ Organisation = old_state.apps.get_model("organisations", "Organisation")
+ Project = old_state.apps.get_model("projects", "Project")
+ SegmentRule = old_state.apps.get_model("segments", "SegmentRule")
+ Segment = old_state.apps.get_model("segments", "Segment")
+ Condition = old_state.apps.get_model("segments", "Condition")
+
+ # Next, create the setup data.
+ organisation = Organisation.objects.create(name="Test Org")
+    project = Project.objects.create(
+        name="Test Project", organisation_id=organisation.id
+    )
+
+ segment = Segment.objects.create(name="Segment1", project_id=project.id)
+ segment_rule_1 = SegmentRule.objects.create(
+ segment_id=segment.id,
+ type="ALL",
+ )
+
+ # Subnested segment rules.
+ segment_rule_2 = SegmentRule.objects.create(
+ rule_id=segment_rule_1.id,
+ type="ALL",
+ )
+
+ Condition.objects.create(
+ operator=constants.EQUAL,
+ property="age",
+ value="21",
+ rule_id=segment_rule_2.id,
+ )
+
+ # When we run the migration.
+ new_state = migrator.apply_tested_migration(
+ ("segments", "0023_add_versioning_to_segments")
+ )
+
+ # Then the version_of attribute is correctly set.
+ NewSegment = new_state.apps.get_model("segments", "Segment")
+ new_segment = NewSegment.objects.get(id=segment.id)
+ assert new_segment.version_of == new_segment
+
+
+@pytest.mark.skipif(
+ test_settings.SKIP_MIGRATION_TESTS is True,
+ reason="Skip migration tests to speed up tests where necessary",
+)
+def test_add_versioning_to_segments_reverse(migrator: Migrator) -> None:
+ # Given - The migration state is at 0023 (after the migration we want to test).
+ old_state = migrator.apply_initial_migration(
+ ("segments", "0023_add_versioning_to_segments")
+ )
+
+ Organisation = old_state.apps.get_model("organisations", "Organisation")
+ Project = old_state.apps.get_model("projects", "Project")
+ SegmentRule = old_state.apps.get_model("segments", "SegmentRule")
+ Segment = old_state.apps.get_model("segments", "Segment")
+ Condition = old_state.apps.get_model("segments", "Condition")
+
+ # Next, create the setup data.
+ organisation = Organisation.objects.create(name="Test Org")
+ project = Project.objects.create(name="Test Project", organisation=organisation)
+
+ # Set the version manually since this is normally done via a lifecycle hook
+ # that doesn't run for models created in a migration state.
+ segment = Segment.objects.create(name="Segment1", project=project, version=1)
+ segment_rule_1 = SegmentRule.objects.create(
+ segment=segment,
+ type="ALL",
+ )
+
+    # Ideally we would call Segment.deep_clone, but that isn't
+    # possible when working in a migration state. Instead, we
+    # replicate the minimum behaviour of that method needed to
+    # test the migration.
+ def _deep_clone(segment: Segment) -> Segment:
+ cloned_segment = Segment.objects.create(
+ name=segment.name,
+ project_id=segment.project_id,
+ description=segment.description,
+ feature=segment.feature,
+ uuid=uuid.uuid4(),
+ version_of_id=segment.id,
+ )
+
+ segment.version += 1
+ segment.save()
+
+ return cloned_segment
+
+ version_1 = _deep_clone(segment)
+ version_2 = _deep_clone(segment)
+
+ version_3 = segment
+
+ # Subnested segment rules.
+ segment_rule_2 = SegmentRule.objects.create(
+ rule=segment_rule_1,
+ type="ALL",
+ )
+
+ Condition.objects.create(
+ operator=constants.EQUAL,
+ property="age",
+ value="21",
+ rule=segment_rule_2,
+ )
+
+ # When we run the migration in reverse.
+ new_state = migrator.apply_tested_migration(
+ ("segments", "0022_add_soft_delete_to_segment_rules_and_conditions")
+ )
+
+ # Then any historical versions of the segment are deleted.
+ NewSegment = new_state.apps.get_model("segments", "Segment")
+
+ new_segment_v1 = NewSegment.objects.get(id=version_1.id)
+ assert new_segment_v1.deleted_at is not None
+
+ new_segment_v2 = NewSegment.objects.get(id=version_2.id)
+ assert new_segment_v2.deleted_at is not None
+
+ new_segment_v3 = NewSegment.objects.get(id=version_3.id)
+ assert new_segment_v3.deleted_at is None
diff --git a/api/webhooks/webhooks.py b/api/webhooks/webhooks.py
index 75684e886614..e314b5976f9f 100644
--- a/api/webhooks/webhooks.py
+++ b/api/webhooks/webhooks.py
@@ -39,9 +39,6 @@ class WebhookEventType(enum.Enum):
FLAG_DELETED = "FLAG_DELETED"
AUDIT_LOG_CREATED = "AUDIT_LOG_CREATED"
NEW_VERSION_PUBLISHED = "NEW_VERSION_PUBLISHED"
- FEATURE_EXTERNAL_RESOURCE_ADDED = "FEATURE_EXTERNAL_RESOURCE_ADDED"
- FEATURE_EXTERNAL_RESOURCE_REMOVED = "FEATURE_EXTERNAL_RESOURCE_REMOVED"
- SEGMENT_OVERRIDE_DELETED = "SEGMENT_OVERRIDE_DELETED"
class WebhookType(enum.Enum):
diff --git a/docs/docs/deployment/hosting/aptible.md b/docs/docs/deployment/hosting/aptible.md
new file mode 100644
index 000000000000..92c0c135dd52
--- /dev/null
+++ b/docs/docs/deployment/hosting/aptible.md
@@ -0,0 +1,75 @@
+---
+title: Aptible
+---
+
+## Prerequisites
+
+The options and health check routes described in this document are available from Flagsmith 2.130.0.
+
+## Configuration
+
+Running Flagsmith on Aptible requires some configuration tweaks because of how Aptible's application lifecycle works:
+
+- Don't wait for the database to be available before the Flagsmith API starts. You can do this by setting the
+ `SKIP_WAIT_FOR_DB` environment variable.
+- Add `containers` as an allowed host to comply with Aptible's
+ [strict health checks](https://www.aptible.com/docs/core-concepts/apps/connecting-to-apps/app-endpoints/https-endpoints/health-checks#strict-health-checks).
+- Use `before_release` tasks in `.aptible.yml` to run database migrations.
+- Use a Procfile to start only the API, without running database migrations on startup.
+
+This configuration can be applied by adding the Procfile and `.aptible.yml` configuration files to a
+[Docker image](https://www.aptible.com/docs/core-concepts/apps/deploying-apps/image/deploying-with-docker-image/overview#how-do-i-deploy-from-docker-image)
+that you build starting from a Flagsmith base image:
+
+```text title="Procfile"
+cmd: serve
+```
+
+```yaml title=".aptible.yml"
+before_release:
+ - migrate
+ - bootstrap
+```
+
+```dockerfile title="Dockerfile"
+# Use flagsmith/flagsmith-private-cloud for the Enterprise image
+FROM --platform=linux/amd64 flagsmith/flagsmith
+
+# Don't wait for the database on startup, so that health checks succeed
+ENV SKIP_WAIT_FOR_DB=1
+
+# Use root user to add Aptible files to the container
+USER root
+RUN mkdir /.aptible/
+ADD Procfile /.aptible/Procfile
+ADD .aptible.yml /.aptible/.aptible.yml
+
+# Use non-root user at runtime
+USER nobody
+```
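+
+With the Procfile, `.aptible.yml`, and Dockerfile in place, build the image and push it to your registry. The image
+name below is a placeholder, matching the deploy command later in this document:
+
+```shell
+docker build -t example/my-flagsmith-aptible-image .
+docker push example/my-flagsmith-aptible-image
+```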
+
+Before deploying, set the environment variables for your database URL and allowed hosts from the Aptible dashboard, or
+using the Aptible CLI:
+
+```shell
+aptible config:set --app flagsmith \
+ DATABASE_URL=postgresql://aptible:...@...:23532/db \
+ DJANGO_ALLOWED_HOSTS='containers,YOUR_APTIBLE_HOSTNAME'
+```
+
+## Deployment
+
+After your image is built and pushed to a container registry that Aptible can access, you can deploy it using the
+Aptible CLI as you would any other application:
+
+```shell
+aptible deploy --app flagsmith --docker-image example/my-flagsmith-aptible-image
+```
+
+Once Flagsmith is running in Aptible, make sure to create the first admin user by visiting `/api/v1/users/config/init/`.
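+
+For example, assuming a hypothetical endpoint hostname of `flagsmith.example.aptible.app`, you can check that the
+route is reachable before completing the signup in your browser:
+
+```shell
+curl -I https://flagsmith.example.aptible.app/api/v1/users/config/init/
+```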
+
+## Limitations
+
+The steps described in this document do not deploy the
+[asynchronous task processor](/deployment/configuration/task-processor), which may affect performance in production
+workloads.
diff --git a/docs/docs/deployment/hosting/locally-edge-proxy.md b/docs/docs/deployment/hosting/locally-edge-proxy.md
index 5423e39c9525..29fa55191166 100644
--- a/docs/docs/deployment/hosting/locally-edge-proxy.md
+++ b/docs/docs/deployment/hosting/locally-edge-proxy.md
@@ -294,7 +294,7 @@ domain name and you're good to go. For example, lets say you had your proxy runn
above:
```bash
-curl "http://localhost:8000/api/v1/flags" -H "x-environment-key: 95DybY5oJoRNhxPZYLrxk4" | jq
+curl "http://localhost:8000/api/v1/flags/" -H "x-environment-key: 95DybY5oJoRNhxPZYLrxk4" | jq
[
{
diff --git a/docs/docs/system-administration/rbac.md b/docs/docs/system-administration/rbac.md
index f838fcf84355..b57fcc3d6e85 100644
--- a/docs/docs/system-administration/rbac.md
+++ b/docs/docs/system-administration/rbac.md
@@ -1,100 +1,173 @@
---
-title: Role Based Access Control
+title: Role-based access control
---
-Flagsmith provides fine-grained permissions to help larger teams manage access and roles across organisations, projects
-and environments.
-
:::info
-The Permissions/Role Based Access features of Flagsmith are _not_ part of the Open Source version. If you want to use
-these features as part of a self hosted/on premise solution, please [get in touch](https://flagsmith.com/contact-us/).
+Role-based access control requires an [Enterprise subscription](https://www.flagsmith.com/pricing).
:::
-## Users, Groups, and Roles
+Role-based access control (RBAC) provides fine-grained access management of Flagsmith resources. Using RBAC, you can
+ensure users only have the access they need within your Flagsmith organisation.
-Permissions can be assigned to Flagsmith individual users, groups, or roles.
+For example, RBAC allows you to achieve the following scenarios:
-### Users
+- Only allow certain users to modify your production environments.
+- Grant a default set of permissions to all users that join your Flagsmith organisation.
+- Lock down an [Admin API](/clients/rest/#private-admin-api-endpoints) key to a specific set of permissions.
+- Provide Flagsmith permissions based on your enterprise identity provider's groups when using
+ [SAML single sign-on](/system-administration/authentication/SAML/).
-Flagsmith Users can be defined as Organisation Administrators or Users. Organisation Administrator is effectively a
-super-user role, and gives full read/write access to every Project, Environment, Flag, Remote Config and Segment within
-that Organisation.
+To add users to your Flagsmith organisation or to manage user permissions, click on your organisation name in the top
+left and open the **Users and Permissions** tab.
-Users that are not Organisation Administrators must have permissions assigned to them manually at the relevant levels.
+## Core concepts
-### Groups
+The diagram below shows an overview of how permissions are assigned within your Flagsmith organisation:
-Groups are a convenient way to manage permissions for multiple Flagsmith users. Groups can contain any number of
-Flagsmith users. You can create groups with the Organisation Settings page.
+
+```mermaid
+graph LR;
+ R[Custom roles] -->|Assigned to| G[Groups];
+ B[Built-in role] -->|Assigned to| U[Users];
+ R -->|Assigned to| U;
+ R -->|Assigned to| A[Admin API keys];
+ G -->|Contains many| U;
+```
-Members of a group can be designated as an admin for that group. As a group admin, users can manage the membership for
-that group, but not the permissions the group has on other entities.
+