Release latest staging code (#591)
* build(deps): bump django from 3.2.24 to 3.2.25

Bumps [django](https://github.com/django/django) from 3.2.24 to 3.2.25.
- [Commits](django/django@3.2.24...3.2.25)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>

* stashing

* Target loader now accepts experiments marked as 'manual'

* build(deps): bump pillow from 10.2.0 to 10.3.0

Bumps [pillow](https://github.com/python-pillow/Pillow) from 10.2.0 to 10.3.0.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](python-pillow/Pillow@10.2.0...10.3.0)

---
updated-dependencies:
- dependency-name: pillow
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <[email protected]>

* Attempt to reduce pop-up "flicker" (1403) (#569)

* fix: Attempt to debug timeout errors

* fix: More logging on service_query

* fix: Various timeout adjustments

* fix: Removed exception during timeout

* fix: Explicit log on SSH connection error

* fix: Retry attempts for MySQL connections

* fix: Service timeout now 28 (was 17)

* fix: Add pymysql read and write timeouts

* fix: Quieter (expected) connection failure handling

* fix: TIMEOUT now DEGRADED

* fix: Fix while loop exit conditions

* fix: Better loop logic

* style: services logging reduced and back to debug

* fix: SSHTunnel logging now ERROR (was DEBUG)

* fix: Quieter security

* fix: More failures permitted (and debug tweaks)

* fix: Leaner logging

* fix: Leaner logging (only report when we're having trouble)

* fix: Better constant name

* fix: Reduced service logging

* docs: Doc tweak

* fix: Minor log tweak

* fix: Fixed duplicate log content

---------

Co-authored-by: Alan Christie <[email protected]>

* feat: endpoint to download first reference pdb from assemblies.yaml

* stashing

* stashing

Changes so far:
- removed endpoint FirstAssemblyview
- moved the functionality to template_protein field in
TargetSerializer
- removed TargetMoleculesserializer
- removed sequences field from TargetSerializer

This is a result of Boris' comment on
GitHub (m2ms/fragalysis-frontend#1373 (comment))
where he said the sequences field and the template_protein field are
not used. Looking at the code where they might be used revealed that
TargetMoleculesSerializer can be removed as well.

NB! They're not fully removed yet, only commented out. This
commit can be used to restore the code.

* fix: removed code mentioned in previous commit

* basic functionality

TODO: add and test PATCH method on PoseSerializer

* stashing

* build(deps): bump idna from 3.6 to 3.7

Bumps [idna](https://github.com/kjd/idna) from 3.6 to 3.7.
- [Release notes](https://github.com/kjd/idna/releases)
- [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
- [Commits](kjd/idna@v3.6...v3.7)

---
updated-dependencies:
- dependency-name: idna
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <[email protected]>

* stashing

* Added 'hidden' property to viewer.Tag model

Also, added exclude directives to .pre-commit-config.yaml to not touch
django migration files

* Better user/proposal cache (#576)

* refactor: Quieter ISPyB cache

* refactor: Security cache log now quieter

* fix: Better cache logic - and reduced log

---------

Co-authored-by: Alan Christie <[email protected]>

* stashing

Nested serializer for pose, but I don't think I can use it

* feat: added compound_code to pose serializer

* fix: deeper nesting level to site observation in meta_aligner.yaml

Data loads successfully but actual v2 upload has not been tested

* fix: renamed panddas_event_files to ligand_binding_events in meta_al

* fix: static resources are now loaded again

Fixes bare HTML API pages

* fix: fixed nginx config

* Revert "Fix static ressources not being loaded"

* stashing

Added sorting keys for versioned keys. V1 data is loading; waiting for
fixes in conf site presentation to continue with v2

* fix: more robust update method

Allows sending incomplete requests (no idea why the serializer isn't
populating fields).
Also fixed a bug where field updates on poses with multiple
observations were blocked.

* feat: added pose tags

* stashing

Working on allowing incomplete request payloads. Turns out to be
quite tricky, may have to go back on this.

* feat: fully functional versioned data

Reads and processes upload_2+ data where version numbers are given in the suffix

* fix: removed some dead code

* stashing

* build(deps): bump tqdm from 4.66.1 to 4.66.3

Bumps [tqdm](https://github.com/tqdm/tqdm) from 4.66.1 to 4.66.3.
- [Release notes](https://github.com/tqdm/tqdm/releases)
- [Commits](tqdm/tqdm@v4.66.1...v4.66.3)

---
updated-dependencies:
- dependency-name: tqdm
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <[email protected]>

* feat: only non-superseded sites available from the api

* build(deps-dev): bump black from 23.12.0 to 24.3.0

Bumps [black](https://github.com/psf/black) from 23.12.0 to 24.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](psf/black@23.12.0...24.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <[email protected]>

* chore: merged conflicting migrations from issue 1311 branch

* fix: pose instance attributes updating successfully

* fix: better check for already uploaded data

* fix: merge conflicting migrations

Not sure why the previous merge didn't work

* Adds basic metrics (#588)

* feat: Experiment with django prometheus

* feat: Fix build

* fix: Fix build (locking drf)

* feat: Fix lock file

* feat: Update to non-slim Python 3.11.9

* feat: Back to slim image

* fix: Some basic internal metrics

* fix: Removed rogue line

* fix: Fix lint issues

* fix: Removed custom metrics

---------

Co-authored-by: Alan Christie <[email protected]>

* Attempt to add prometheus to DB (#589)

Co-authored-by: Alan Christie <[email protected]>

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Kalev Takkis <[email protected]>
Co-authored-by: Alan Christie <[email protected]>
Co-authored-by: Kalev Takkis <[email protected]>
Co-authored-by: Warren Thompson <[email protected]>
6 people authored May 21, 2024
1 parent 3f59569 commit d4681de
Showing 25 changed files with 1,539 additions and 999 deletions.
4 changes: 4 additions & 0 deletions .pre-commit-config.yaml
@@ -28,6 +28,7 @@ repos:
rev: 5.13.1
hooks:
- id: isort
exclude: 'migrations/'
args:
- --profile
- black
@@ -38,6 +39,7 @@
rev: 23.12.0
hooks:
- id: black
exclude: 'migrations/'
args:
- --skip-string-normalization
- --target-version
@@ -51,6 +53,7 @@
rev: v1.7.1
hooks:
- id: mypy
exclude: 'migrations/'
additional_dependencies:
- types-PyYAML
- types-pymysql
@@ -63,6 +66,7 @@
rev: v3.0.3
hooks:
- id: pylint
exclude: 'migrations/'
additional_dependencies:
- pylint-django
args:
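The repeated `exclude: 'migrations/'` directives are Python regular expressions that pre-commit matches against each candidate file path, so anything under a `migrations/` directory is skipped by isort, black, mypy and pylint alike. A quick sketch of the matching behaviour (the helper name is illustrative, not pre-commit's real code):

```python
import re

# pre-commit treats 'exclude' as a Python regular expression and
# matches it against each file's path, so the pattern 'migrations/'
# skips any file under a migrations directory.
EXCLUDE = re.compile('migrations/')

def is_excluded(path: str) -> bool:
    """Mimic pre-commit's exclude check (a sketch, not its real code)."""
    return EXCLUDE.search(path) is not None

print(is_excluded('viewer/migrations/0001_initial.py'))  # True: skipped
print(is_excluded('viewer/models.py'))                   # False: still linted
```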
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1,4 +1,4 @@
FROM python:3.11.7-slim-bullseye AS python-base
FROM python:3.11.9-slim-bullseye AS python-base

ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1
6 changes: 2 additions & 4 deletions README.md
@@ -283,11 +283,9 @@ pre-commit documentation.

Ideally from a Python environment...

python -m venv venv
source venv/bin/activate
poetry shell
poetry install --only dev

pip install --upgrade pip
pip install -r build-requirements.txt
pre-commit install -t commit-msg -t pre-commit

Now the project's rules will run on every commit and you can check the
147 changes: 57 additions & 90 deletions api/security.py
@@ -1,9 +1,11 @@
# pylint: skip-file
import logging
import os
import threading
from datetime import datetime, timedelta
from functools import cache
from pathlib import Path
from typing import Any, Dict, Optional, Union
from typing import Any, Dict, List, Optional, Union
from wsgiref.util import FileWrapper

from django.conf import settings
@@ -19,32 +21,49 @@

logger: logging.Logger = logging.getLogger(__name__)

# Sets of cached query results, indexed by username.
# The cache uses the key 'RESULTS', the collection time uses the key 'TIMESTAMP',
# and the cache expiry time is in 'EXPIRES_AT'.
USER_PROPOSAL_CACHE: Dict[str, Dict[str, Any]] = {}
# Period to cache user lists (on successful reads from the connector)
USER_PROPOSAL_CACHE_MAX_AGE: timedelta = timedelta(
minutes=settings.SECURITY_CONNECTOR_CACHE_MINUTES
)
# A short period, used when caching of results fails.
# This ensures a rapid retry on failure.
USER_PROPOSAL_CACHE_RETRY_TIMEOUT: timedelta = timedelta(seconds=7)

# example test:
# from rest_framework.test import APIRequestFactory
#
# from rest_framework.test import force_authenticate
# from viewer.views import TargetView
# from django.contrib.auth.models import User
#
# factory = APIRequestFactory()
# view = TargetView.as_view({'get': 'list'})
# user = User.objects.get(username='uzw12877')
# # Make an authenticated request to the view...
# request = factory.get('/api/targets/')
# force_authenticate(request, user=user)
# response = view(request)

@cache
class CachedContent:
"""A static class managing cached proposals/visits for each user.
Proposals should be collected when has_expired() returns True.
Content can be written (when the cache for the user has expired)
and read using the set/get methods.
"""

_timers: Dict[str, datetime] = {}
_content: Dict[str, List[str]] = {}
_cache_period: timedelta = timedelta(
minutes=settings.SECURITY_CONNECTOR_CACHE_MINUTES
)
_cache_lock: threading.Lock = threading.Lock()

@staticmethod
def has_expired(username) -> bool:
assert username
with CachedContent._cache_lock:
has_expired = False
now = datetime.now()
if username not in CachedContent._timers:
# User's not known,
# initialise an entry that will automatically expire
CachedContent._timers[username] = now
if CachedContent._timers[username] <= now:
has_expired = True
# Expired, reset the expiry time
CachedContent._timers[username] = now + CachedContent._cache_period
return has_expired

@staticmethod
def get_content(username):
with CachedContent._cache_lock:
if username not in CachedContent._content:
CachedContent._content[username] = []
return CachedContent._content[username]

@staticmethod
def set_content(username, content) -> None:
with CachedContent._cache_lock:
CachedContent._content[username] = content.copy()


def get_remote_conn(force_error_display=False) -> Optional[SSHConnector]:
@@ -197,57 +216,6 @@ def _get_proposals_for_user_from_django(self, user):
)
return prop_ids

def _cache_needs_updating(self, user):
"""True if the data for a user now needs to be collected
(e.g. the cache is out of date). The response is also True for the first
call for each user. When data is successfully collected you need to
call '_populate_cache()' with the user and new cache content.
This will set the cache content and the cache timestamp.
"""
now = datetime.now()
if user.username not in USER_PROPOSAL_CACHE:
# Unknown user - initialise the entry for this user.
# And make sure it immediately expires!
USER_PROPOSAL_CACHE[user.username] = {
"RESULTS": set(),
"TIMESTAMP": None,
"EXPIRES_AT": now,
}

# Has the cache expired?
return now >= USER_PROPOSAL_CACHE[user.username]["EXPIRES_AT"]

def _populate_cache(self, user, new_content):
"""Called by code that collects content to replace the cache with new content,
this is typically from '_get_proposals_from_connector()'. The underlying map's
TIMESTAMP for the user will also be set (to 'now') to reflect the time the
cache was most recently populated.
"""
username = user.username
USER_PROPOSAL_CACHE[username]["RESULTS"] = new_content.copy()
# Set the timestamp (which records when the cache was populated with 'stuff')
# and ensure it will now expire after USER_PROPOSAL_CACHE_SECONDS.
now = datetime.now()
USER_PROPOSAL_CACHE[username]["TIMESTAMP"] = now
USER_PROPOSAL_CACHE[username]["EXPIRES_AT"] = now + USER_PROPOSAL_CACHE_MAX_AGE
logger.info(
"USER_PROPOSAL_CACHE populated for '%s' (expires at %s)",
username,
USER_PROPOSAL_CACHE[username]["EXPIRES_AT"],
)

def _mark_cache_collection_failure(self, user):
"""Called by code that collects content to indicate that although the cache
should have been collected it has not (through some other problem).
Under these circumstances the cache will not be updated but we have the opportunity
to set a new, short, 'expiry' time. In this way, cache collection will occur
again soon. The cache and its timestamp are left intact.
"""
now = datetime.now()
USER_PROPOSAL_CACHE[user.username]["EXPIRES_AT"] = (
now + USER_PROPOSAL_CACHE_RETRY_TIMEOUT
)

def _run_query_with_connector(self, conn, user):
core = conn.core
try:
@@ -262,8 +230,8 @@ def _run_query_with_connector(self, conn, user):
return rs

def _get_proposals_for_user_from_ispyb(self, user):
if self._cache_needs_updating(user):
logger.info("user='%s' (needs_updating)", user.username)
if CachedContent.has_expired(user.username):
logger.info("Cache has expired for '%s'", user.username)
if conn := get_configured_connector():
logger.debug("Got a connector for '%s'", user.username)
self._get_proposals_from_connector(user, conn)
@@ -272,15 +240,16 @@ def _get_proposals_for_user_from_ispyb(self, user):
self._mark_cache_collection_failure(user)

# The cache has either been updated, has not changed or is empty.
# Return what we have for the user. If required, public (open) proposals
# will be added to what we return.
cached_prop_ids = USER_PROPOSAL_CACHE[user.username]["RESULTS"]
logger.info(
"Got %s proposals for '%s': %s",
# Return what we have for the user. Public (open) proposals
# will be added to what we return if necessary.
cached_prop_ids = CachedContent.get_content(user.username)
logger.debug(
"Have %s cached Proposals for '%s': %s",
len(cached_prop_ids),
user.username,
cached_prop_ids,
)

return cached_prop_ids

def _get_proposals_from_connector(self, user, conn):
@@ -342,9 +311,7 @@ def _get_proposals_from_connector(self, user, conn):
user.username,
prop_id_set,
)

# Replace the cache with what we've collected
self._populate_cache(user, prop_id_set)
CachedContent.set_content(user.username, prop_id_set)

def get_proposals_for_user(self, user, restrict_to_membership=False):
"""Returns a list of proposals that the user has access to.
@@ -369,10 +336,10 @@ def get_proposals_for_user(self, user, restrict_to_membership=False):
)
if ispyb_user:
if user.is_authenticated:
logger.info("Getting proposals from ISPyB...")
logger.debug("Getting proposals from ISPyB...")
proposals = self._get_proposals_for_user_from_ispyb(user)
else:
logger.info("Getting proposals from Django...")
logger.debug("Getting proposals from Django...")
proposals = self._get_proposals_for_user_from_django(user)

# We have all the proposals where the user has authority.
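The new `CachedContent` class pairs a per-user expiry timer with a lock-guarded content map: `has_expired()` both reports expiry and resets the timer, so only the caller that sees `True` goes off to refresh the cache. A minimal, self-contained sketch of the same pattern (class and method names here are illustrative, not the project's API):

```python
import threading
from datetime import datetime, timedelta

class ExpiringCache:
    """Per-key cache with a 'check expiry, then refresh' protocol,
    mirroring the CachedContent pattern (an illustrative sketch)."""

    def __init__(self, period: timedelta):
        self._period = period
        self._timers: dict = {}
        self._content: dict = {}
        self._lock = threading.Lock()

    def has_expired(self, key) -> bool:
        now = datetime.now()
        with self._lock:
            if key not in self._timers:
                # Unknown key: seed a timer that expires immediately,
                # forcing the first caller to collect fresh content.
                self._timers[key] = now
            expired = self._timers[key] <= now
            if expired:
                # Reset the expiry so concurrent callers don't also refresh.
                self._timers[key] = now + self._period
            return expired

    def get(self, key):
        with self._lock:
            return self._content.setdefault(key, [])

    def set(self, key, content) -> None:
        with self._lock:
            self._content[key] = list(content)

cache = ExpiringCache(timedelta(minutes=2))
assert cache.has_expired('uzw12877')      # first call always expires
assert not cache.has_expired('uzw12877')  # fresh for the next 2 minutes
cache.set('uzw12877', ['lb00001', 'lb00002'])
assert cache.get('uzw12877') == ['lb00001', 'lb00002']
```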
4 changes: 1 addition & 3 deletions api/urls.py
@@ -88,9 +88,6 @@
viewer_views.SessionProjectTagView,
basename='session_project_tag',
)
router.register(
"target_molecules", viewer_views.TargetMoleculesView, basename='target_molecules'
)

# Download a zip file of the requested contents
router.register(
@@ -125,6 +122,7 @@
"canon_site_confs", viewer_views.CanonSiteConfs, basename='canon_site_confs'
)
router.register("xtalform_sites", viewer_views.XtalformSites, basename='xtalform_sites')
router.register("poses", viewer_views.PoseView, basename='poses')

# Squonk Jobs
router.register(
5 changes: 4 additions & 1 deletion fragalysis/settings.py
@@ -153,6 +153,7 @@
"webpack_loader",
"django_cleanup",
"simple_history",
"django_prometheus",
]

LANGUAGE_CODE = "en-us"
@@ -167,6 +168,7 @@
LOGOUT_REDIRECT_URL = "/viewer/react/landing"

MIDDLEWARE = [
"django_prometheus.middleware.PrometheusBeforeMiddleware",
"django.middleware.security.SecurityMiddleware",
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.common.CommonMiddleware",
@@ -175,6 +177,7 @@
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
"mozilla_django_oidc.middleware.SessionRefresh",
"django_prometheus.middleware.PrometheusAfterMiddleware",
]

PROJECT_ROOT = os.path.abspath(os.path.join(BASE_DIR, ".."))
@@ -331,7 +334,7 @@

DATABASES = {
"default": {
"ENGINE": "django.db.backends.postgresql_psycopg2",
"ENGINE": "django_prometheus.db.backends.postgresql",
"NAME": os.environ.get("POSTGRESQL_DATABASE", "frag"),
"USER": os.environ.get("POSTGRESQL_USER", "fragalysis"),
"PASSWORD": os.environ.get("POSTGRESQL_PASSWORD", "fragalysis"),
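The engine swap above routes queries through `django_prometheus`'s wrapper around the stock PostgreSQL backend, which exports per-query metrics without changing connection behaviour. The rest of the entry keeps the existing environment-variable-with-fallback pattern; a sketch of that pattern (the helper function is illustrative, not project code):

```python
def database_settings(env: dict) -> dict:
    """Build a DATABASES['default']-style entry the way settings.py does:
    each value comes from the environment, with development fallbacks."""
    return {
        # django_prometheus wraps the stock postgresql backend so
        # query metrics are exported alongside the application metrics.
        "ENGINE": "django_prometheus.db.backends.postgresql",
        "NAME": env.get("POSTGRESQL_DATABASE", "frag"),
        "USER": env.get("POSTGRESQL_USER", "fragalysis"),
        "PASSWORD": env.get("POSTGRESQL_PASSWORD", "fragalysis"),
    }

# With no overrides, the development defaults apply:
assert database_settings({})["NAME"] == "frag"
# A deployment overrides them via the environment:
assert database_settings({"POSTGRESQL_USER": "prod"})["USER"] == "prod"
```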
1 change: 1 addition & 0 deletions fragalysis/urls.py
@@ -49,4 +49,5 @@
mozilla_django_oidc.views.OIDCAuthenticationCallbackView.as_view(),
name="keycloak_callback",
),
path("", include("django_prometheus.urls")),
]
