Update to Channels 2

* Geiss does not work with Channels 2 and never will (the interface server has to be Python now)
* switched the test runner to pytest
* rewrote the cache system
* use the username instead of the pk for the admin user in tests
Oskar Hahn 2018-07-09 23:22:26 +02:00
parent dbd808c02b
commit 10b3bb6497
76 changed files with 2834 additions and 1943 deletions

.gitignore

@ -30,6 +30,7 @@ debug/*
# Unit test and coverage reports
.coverage
tests/file/*
.pytest_cache
# Plugin development
openslides_*


@ -22,15 +22,9 @@ install:
- node_modules/.bin/gulp --production
script:
- flake8 openslides tests
- isort --check-only --recursive openslides tests
- isort --check-only --diff --recursive openslides tests
- python -m mypy openslides/
- node_modules/.bin/gulp jshint
- node_modules/.bin/karma start --browsers PhantomJS tests/karma/karma.conf.js
- DJANGO_SETTINGS_MODULE='tests.settings' coverage run ./manage.py test tests.unit
- coverage report --fail-under=35
- DJANGO_SETTINGS_MODULE='tests.settings' coverage run ./manage.py test tests.integration
- coverage report --fail-under=73
- DJANGO_SETTINGS_MODULE='tests.settings' ./manage.py test tests.old
- pytest --cov --cov-fail-under=70
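The old matrix of coverage and test commands collapses into this single pytest call. pytest and coverage typically read their settings from a config file; a minimal sketch of such a configuration (assumed for illustration, the exact file in this commit is not shown)::

    [tool:pytest]
    DJANGO_SETTINGS_MODULE = tests.settings
    testpaths = tests

    [coverage:run]
    source = openslides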


@ -78,7 +78,7 @@ To get help on the command line options run::
Later you might want to restart the server with one of the following commands.
To start OpenSlides with Daphne and one worker and to avoid opening new browser
To start OpenSlides with Daphne and to avoid opening new browser
windows run::
$ python manage.py start --no-browser
@ -87,16 +87,10 @@ When debugging something email related change the email backend to console::
$ python manage.py start --debug-email
To start OpenSlides with Daphne and four workers (avoid concurrent write
requests or use PostgreSQL, see below) run::
To start OpenSlides with Daphne run::
$ python manage.py runserver
To start OpenSlides with Geiss and one worker and to avoid opening new browser
windows (download Geiss and setup Redis before, see below) run::
$ python manage.py start --no-browser --use-geiss
Use gulp watch in a second command-line interface::
$ node_modules/.bin/gulp watch
@ -152,8 +146,7 @@ OpenSlides in big mode
In the so called big mode you should use OpenSlides with Redis, PostgreSQL and a
webserver like Apache HTTP Server or nginx as proxy server in front of your
OpenSlides interface server. Optionally you can use `Geiss
<https://github.com/ostcar/geiss/>`_ as interface server instead of Daphne.
OpenSlides interface server.
1. Install and configure PostgreSQL and Redis
@ -200,23 +193,12 @@ Populate your new database::
4. Run OpenSlides
-----------------
First start e. g. four workers (do not use the `--threads` option, because the threads will not spawn across all cores)::
$ python manage.py runworker&
$ python manage.py runworker&
$ python manage.py runworker&
$ python manage.py runworker&
To start Daphne as protocol server run::
$ export DJANGO_SETTINGS_MODULE=settings
$ export PYTHONPATH=personal_data/var/
$ daphne openslides.asgi:channel_layer
To use Geiss instead of Daphne, just download Geiss and start it::
$ python manage.py getgeiss
$ ./personal_data/var/geiss
5. Use Nginx (optional)
@ -224,7 +206,7 @@ When using Nginx as a proxy for delivering staticfiles the performance of the se
$ python manage.py collectstatic
This is an example configuration for a single Daphne/Geiss listen on port 8000::
This is an example configuration for a single Daphne listening on port 8000::
server {
listen 80;
@ -259,7 +241,7 @@ This is an example configuration for a single Daphne/Geiss listen on port 8000::
}
}
Using Nginx as a load balancer is fairly easy. Just start multiple Daphnes/Geiss on different ports, change the `proxy_pass` to `http://openslides/` and add this on top of the Nginx configuration::
Using Nginx as a load balancer is fairly easy. Just start multiple Daphnes on different ports, change the `proxy_pass` to `http://openslides/` and add this on top of the Nginx configuration::
upstream openslides {
server localhost:2001;
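For several Daphne instances the upstream block just lists one `server` entry per port; an illustrative sketch (ports chosen arbitrarily)::

    upstream openslides {
        server localhost:2001;
        server localhost:2002;
        server localhost:2003;
    }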


@ -148,13 +148,12 @@ You should use a webserver like Apache HTTP Server or nginx to serve the
static and media files as proxy server in front of your OpenSlides
interface server. You also should use a database like PostgreSQL and Redis
as channels backend, cache backend and session engine. Finally you should
start some WSGI workers and one or more interface servers (Daphne or Geiss).
start one or more interface servers (Daphne).
Please see the respective section in the `DEVELOPMENT.rst
<https://github.com/OpenSlides/OpenSlides/blob/master/DEVELOPMENT.rst>`_ and:
* https://channels.readthedocs.io/en/latest/deploying.html
* https://github.com/ostcar/geiss
* https://docs.djangoproject.com/en/1.10/topics/cache/
* https://github.com/sebleier/django-redis-cache
* https://docs.djangoproject.com/en/1.10/ref/settings/#databases


@ -1,44 +1,12 @@
import re
from parser import command, argument, call
import yaml
import requirements
FAIL = '\033[91m'
SUCCESS = '\033[92m'
RESET = '\033[0m'
@argument('module', nargs='?', default='')
@command('test', help='runs the tests')
def test(args=None):
"""
Runs the tests.
"""
module = getattr(args, 'module', '')
if module == '':
module = 'tests'
else:
module = 'tests.{}'.format(module)
return call("DJANGO_SETTINGS_MODULE='tests.settings' coverage run "
"./manage.py test {}".format(module))
@argument('--plain', action='store_true')
@command('coverage', help='Runs all tests and builds the coverage html files')
def coverage(args=None, plain=None):
"""
Runs the tests and creates a coverage report.
By default it creates an HTML report. With the argument --plain, it creates
a plain report and fails if coverage is below a given threshold.
"""
if plain is None:
plain = getattr(args, 'plain', False)
if plain:
return call('coverage report -m --fail-under=80')
else:
return call('coverage html')
@command('check', help='Checks for pep8 errors in openslides and tests')
def check(args=None):
"""
@ -54,17 +22,10 @@ def travis(args=None):
"""
return_codes = []
with open('.travis.yml') as f:
script_lines = False
for line in (line.strip() for line in f.readlines()):
if line == 'script:':
script_lines = True
continue
if not script_lines or not line:
continue
match = re.search(r'"(.*)"', line)
print('Run: %s' % match.group(1))
return_code = call(match.group(1))
travis = yaml.load(f)
for line in travis['script']:
print('Run: {}'.format(line))
return_code = call(line)
return_codes.append(return_code)
if return_code:
print(FAIL + 'fail!\n' + RESET)
@ -76,7 +37,7 @@ def travis(args=None):
@argument('-r', '--requirements', nargs='?',
default='requirements_production.txt')
default='requirements.txt')
@command('min_requirements',
help='Prints a pip line to install the minimum supported versions of '
'the requirements.')
@ -85,23 +46,19 @@ def min_requirements(args=None):
Prints a pip install command to install the minimal supported versions of a
requirement file.
Uses requirements_production.txt by default.
Uses requirements.txt by default.
The following line will install the version:
pip install $(python make min_requirements)
"""
import pip
def get_lowest_versions(requirements_file):
for line in pip.req.parse_requirements(requirements_file, session=pip.download.PipSession()):
for specifier in line.req.specifier:
if specifier.operator == '>=':
min_version = specifier.version
break
else:
raise ValueError('Not supported line {}'.format(line))
yield '%s==%s' % (line.req.name, min_version)
with open(requirements_file) as f:
for req in requirements.parse(f):
if req.specifier:
for spec, version in req.specs:
if spec == ">=":
yield "{}=={}".format(req.name, version)
print(' '.join(get_lowest_versions(args.requirements)))
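The rewritten helper leans on the requirements-parser package, which exposes each requirement's name and its (operator, version) pairs. A quick illustration of the API as assumed here::

    import requirements

    for req in requirements.parse("Django>=1.11,<2.1"):
        # Prints: Django [('>=', '1.11'), ('<', '2.1')]
        print(req.name, req.specs)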

make/requirements.txt (new file)

@ -0,0 +1,4 @@
# Requirements for the make scripts
requirements-parser
PyYAML


@ -1,7 +1,6 @@
#!/usr/bin/env python
import os
import subprocess
import sys
from typing import Dict # noqa
@ -14,7 +13,6 @@ from openslides.utils.main import (
ExceptionArgumentParser,
UnknownCommand,
get_default_settings_dir,
get_geiss_path,
get_local_settings_dir,
is_local_installation,
open_browser,
@ -145,10 +143,6 @@ def get_parser():
'--local-installation',
action='store_true',
help='Store settings and user files in a local directory.')
subcommand_start.add_argument(
'--use-geiss',
action='store_true',
help='Use Geiss instead of Daphne as ASGI protocol server.')
# Subcommand createsettings
createsettings_help = 'Creates the settings file.'
@ -220,55 +214,22 @@ def start(args):
# Migrate database
call_command('migrate')
if args.use_geiss:
# Make sure Redis is used.
if settings.CHANNEL_LAYERS['default']['BACKEND'] != 'asgi_redis.RedisChannelLayer':
raise RuntimeError("You have to use the ASGI Redis backend in the settings to use Geiss.")
# Download Geiss and collect the static files.
call_command('getgeiss')
call_command('collectstatic', interactive=False)
# Open the browser
if not args.no_browser:
open_browser(args.host, args.port)
# Start Geiss in its own thread
subprocess.Popen([
get_geiss_path(),
'--host', args.host,
'--port', args.port,
'--static', '/static/:{}'.format(settings.STATIC_ROOT),
'--static', '/media/:{}'.format(settings.MEDIA_ROOT),
])
# Start one worker in this thread. There can be only one worker as
# long as SQLite3 is used.
call_command('runworker')
else:
# Open the browser
if not args.no_browser:
open_browser(args.host, args.port)
# Start Daphne and one worker
# Start Daphne
#
# Use flag --noreload to tell Django not to reload the server.
# Therefore we have to set the keyword noreload to False because Django
# parses this directly to the use_reloader keyword.
#
# Use flag --insecure to serve static files even if DEBUG is False.
#
# Use flag --nothreading to tell Django Channels to run in single
# thread mode with one worker only. Therefore we have to set the keyword
# nothreading to False because Django parses this directly to the
# use_threading keyword.
call_command(
'runserver',
'{}:{}'.format(args.host, args.port),
noreload=False, # Means True, see above.
insecure=True,
nothreading=False, # Means True, see above.
)
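Judging from the comments above, this call is the programmatic twin of running the management command from a shell, roughly (host and port are illustrative)::

    $ python manage.py runserver 0.0.0.0:8000 --noreload --insecure --nothreading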


@ -1,6 +1,5 @@
from django.apps import AppConfig
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -48,7 +47,7 @@ class AgendaAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
yield Collection(self.get_model('Item').get_collection_string())
yield self.get_model('Item')


@ -4,8 +4,9 @@ from __future__ import unicode_literals
from django.db import migrations
from openslides.utils.migrations import \
add_permission_to_groups_based_on_existing_permission
from openslides.utils.migrations import (
add_permission_to_groups_based_on_existing_permission,
)
class Migration(migrations.Migration):


@ -5,8 +5,9 @@ from __future__ import unicode_literals
from django.contrib.auth.models import Permission
from django.db import migrations, models
from openslides.utils.migrations import \
add_permission_to_groups_based_on_existing_permission
from openslides.utils.migrations import (
add_permission_to_groups_based_on_existing_permission,
)
def delete_old_can_see_hidden_permission(apps, schema_editor):


@ -7,8 +7,7 @@ from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.db import models, transaction
from django.utils import timezone
from django.utils.translation import ugettext as _
from django.utils.translation import ugettext_lazy
from django.utils.translation import ugettext as _, ugettext_lazy
from openslides.core.config import config
from openslides.core.models import Countdown, Projector


@ -2,6 +2,7 @@ from django.conf.urls import url
from . import views
urlpatterns = [
url(r'^docxtemplate/$',
views.AgendaDocxTemplateView.as_view(),


@ -1,12 +1,16 @@
from channels.asgi import get_channel_layer
"""
ASGI entrypoint. Configures Django and then runs the application
defined in the ASGI_APPLICATION setting.
"""
import django
from channels.routing import get_default_application
from .utils.main import setup_django_settings_module
# Loads the openslides settings. You can use your own settings by setting the
# environment variable DJANGO_SETTINGS_MODULE
setup_django_settings_module()
channel_layer = get_channel_layer()
# Use native twisted mode
channel_layer.extensions.append("twisted")
django.setup()
application = get_default_application()
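Since the entrypoint is now an application object instead of a channel layer, a protocol server is pointed directly at it, for example::

    $ daphne -b 0.0.0.0 -p 8000 openslides.asgi:application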


@ -3,7 +3,6 @@ from typing import Dict, List, Union # noqa
from django.apps import AppConfig
from mypy_extensions import TypedDict
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -41,10 +40,10 @@ class AssignmentsAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
yield Collection(self.get_model('Assignment').get_collection_string())
yield self.get_model('Assignment')
def get_angular_constants(self):
assignment = self.get_model('Assignment')


@ -4,8 +4,7 @@ from typing import Any, Dict, List, Optional # noqa
from django.conf import settings
from django.contrib.contenttypes.fields import GenericRelation
from django.db import models
from django.utils.translation import ugettext as _
from django.utils.translation import ugettext_noop
from django.utils.translation import ugettext as _, ugettext_noop
from openslides.agenda.models import Item, Speaker
from openslides.core.config import config


@ -6,7 +6,6 @@ from django.apps import AppConfig
from django.conf import settings
from django.db.models.signals import post_migrate
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -66,11 +65,11 @@ class CoreAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
for model in ('Projector', 'ChatMessage', 'Tag', 'ProjectorMessage', 'Countdown', 'ConfigStore'):
yield Collection(self.get_model(model).get_collection_string())
for model_name in ('Projector', 'ChatMessage', 'Tag', 'ProjectorMessage', 'Countdown', 'ConfigStore'):
yield self.get_model(model_name)
def get_angular_constants(self):
from .config import config


@ -1,13 +1,25 @@
from typing import Any, Callable, Dict, Iterable, Optional, TypeVar, Union
from typing import (
Any,
Callable,
Dict,
Iterable,
Optional,
TypeVar,
Union,
cast,
)
from asgiref.sync import async_to_sync
from django.core.exceptions import ValidationError as DjangoValidationError
from django.utils.translation import ugettext as _
from mypy_extensions import TypedDict
from ..utils.cache import element_cache
from ..utils.collection import CollectionElement
from .exceptions import ConfigError, ConfigNotFound
from .models import ConfigStore
INPUT_TYPE_MAPPING = {
'string': str,
'text': str,
@ -37,21 +49,42 @@ class ConfigHandler:
self.config_variables = {} # type: Dict[str, ConfigVariable]
# Index to get the database id from a given config key
self.key_to_id = {} # type: Dict[str, int]
self.key_to_id = None # type: Optional[Dict[str, int]]
def __getitem__(self, key: str) -> Any:
"""
Returns the value of the config variable.
"""
# Build the key_to_id dict
self.save_default_values()
if not self.exists(key):
raise ConfigNotFound(_('The config variable {} was not found.').format(key))
return CollectionElement.from_values(
self.get_collection_string(),
self.key_to_id[key]).get_full_data()['value']
self.get_key_to_id()[key]).get_full_data()['value']
def get_key_to_id(self) -> Dict[str, int]:
"""
Returns the key_to_id dict. Builds it, if it does not exist.
"""
if self.key_to_id is None:
async_to_sync(self.build_key_to_id)()
self.key_to_id = cast(Dict[str, int], self.key_to_id)
return self.key_to_id
async def build_key_to_id(self) -> None:
"""
Build the key_to_id dict.
Recreates it if it does not exist.
This uses the element_cache. It expects that the config values are in the
database before this is called.
"""
self.key_to_id = {}
all_data = await element_cache.get_all_full_data()
elements = all_data[self.get_collection_string()]
for element in elements:
self.key_to_id[element['key']] = element['id']
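`get_key_to_id` hops into the coroutine with `async_to_sync`, the bridge used throughout this migration whenever synchronous code needs a result from the new async cache. A self-contained illustration of that bridge (not OpenSlides code)::

    from asgiref.sync import async_to_sync

    async def answer() -> int:
        return 42

    # Runs the coroutine to completion from synchronous code.
    print(async_to_sync(answer)())  # prints 42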
def exists(self, key: str) -> bool:
"""
@ -183,10 +216,8 @@ class ConfigHandler:
Saves the default values to the database.
Also builds the dictionary key_to_id.
Does nothing on a second run.
"""
if not self.key_to_id:
self.key_to_id = {}
for item in self.config_variables.values():
try:
db_value = ConfigStore.objects.get(key=item.name)
@ -195,7 +226,7 @@ class ConfigHandler:
db_value.key = item.name
db_value.value = item.default_value
db_value.save(skip_autoupdate=True)
self.key_to_id[item.name] = db_value.pk
self.key_to_id[db_value.key] = db_value.id
def get_collection_string(self) -> str:
"""


@ -2,8 +2,9 @@ import os
from typing import Any, Dict
from django.conf import settings
from django.contrib.staticfiles.management.commands.collectstatic import \
Command as CollectStatic
from django.contrib.staticfiles.management.commands.collectstatic import (
Command as CollectStatic,
)
from django.core.management.base import CommandError
from django.db.utils import OperationalError


@ -1,91 +0,0 @@
import distutils
import json
import os
import stat
import sys
from urllib.request import urlopen, urlretrieve
from django.core.management.base import BaseCommand, CommandError
from openslides.utils.main import get_geiss_path
class Command(BaseCommand):
"""
Command to get the latest release of Geiss from GitHub.
"""
help = 'Get the latest Geiss release from GitHub.'
FIRST_NOT_SUPPORTED_VERSION = '1.0.0'
def handle(self, *args, **options):
geiss_github_name = self.get_geiss_github_name()
download_file = get_geiss_path()
if os.path.isfile(download_file):
# Geiss probably exists already. Do nothing.
# TODO: Add an update flag, that Geiss is downloaded anyway.
return
release = self.get_release()
download_url = None
for asset in release['assets']:
if asset['name'] == geiss_github_name:
download_url = asset['browser_download_url']
break
if download_url is None:
raise CommandError("Could not find download URL in release.")
urlretrieve(download_url, download_file)
# Set the executable bit on the file. This will do nothing on windows
st = os.stat(download_file)
os.chmod(download_file, st.st_mode | stat.S_IEXEC)
self.stdout.write(self.style.SUCCESS('Geiss {} successfully downloaded.'.format(release['tag_name'])))
def get_release(self):
"""
Returns API data for the latest supported Geiss release.
"""
response = urlopen(self.get_geiss_url()).read()
releases = json.loads(response.decode())
for release in releases:
version = distutils.version.StrictVersion(release['tag_name']) # type: ignore
if version < self.FIRST_NOT_SUPPORTED_VERSION:
break
else:
raise CommandError('Could not find Geiss release.')
return release
def get_geiss_url(self):
"""
Returns the URL to the API which gives the information which Geiss
binary has to be downloaded.
Currently this is a static GitHub URL to the repository where Geiss
is hosted at the moment.
"""
# TODO: Use a settings variable or a command line flag in the future.
return 'https://api.github.com/repos/ostcar/geiss/releases'
def get_geiss_github_name(self):
"""
Returns the name of the Geiss executable for the current operating
system.
For example geiss_windows_64 on a windows64 platform.
"""
# This will be 32 if the current python interpreter has only
# 32 bit, even if it is run on a 64 bit operating system.
bits = '64' if sys.maxsize > 2**32 else '32'
geiss_names = {
'linux': 'geiss_linux_{bits}',
'win32': 'geiss_windows_{bits}.exe', # Yes, it is win32, even on a win64 system!
'darwin': 'geiss_mac_{bits}'}
try:
return geiss_names[sys.platform].format(bits=bits)
except KeyError:
raise CommandError("Plattform {} is not supported by Geiss".format(sys.platform))


@ -4,8 +4,9 @@ from __future__ import unicode_literals
from django.db import migrations
from openslides.utils.migrations import \
add_permission_to_groups_based_on_existing_permission
from openslides.utils.migrations import (
add_permission_to_groups_based_on_existing_permission,
)
class Migration(migrations.Migration):


@ -8,6 +8,7 @@ from ..utils.auth import has_perm
from ..utils.collection import Collection
from .models import ChatMessage
# This signal is sent when the migrate command is done. That means it is sent
# after post_migrate sending and creating all Permission objects. Don't use it
# for anything other than dealing with Permission objects.
@ -38,18 +39,18 @@ def delete_django_app_permissions(sender, **kwargs):
def get_permission_change_data(sender, permissions, **kwargs):
"""
Yields all necessary collections if the respective permissions change.
Yields all necessary Cachables if the respective permissions change.
"""
core_app = apps.get_app_config(app_label='core')
for permission in permissions:
if permission.content_type.app_label == core_app.label:
if permission.codename == 'can_see_projector':
yield Collection(core_app.get_model('Projector').get_collection_string())
yield core_app.get_model('Projector')
elif permission.codename == 'can_manage_projector':
yield Collection(core_app.get_model('ProjectorMessage').get_collection_string())
yield Collection(core_app.get_model('Countdown').get_collection_string())
yield core_app.get_model('ProjectorMessage')
yield core_app.get_model('Countdown')
elif permission.codename == 'can_use_chat':
yield Collection(core_app.get_model('ChatMessage').get_collection_string())
yield core_app.get_model('ChatMessage')
def required_users(sender, request_user, **kwargs):


@ -2,6 +2,7 @@ from django.conf.urls import url
from . import views
urlpatterns = [
url(r'^core/servertime/$',
views.ServerTime.as_view(),


@ -11,9 +11,7 @@ from django.utils.timezone import now
from django.utils.translation import ugettext as _
from mypy_extensions import TypedDict
from .. import __license__ as license
from .. import __url__ as url
from .. import __version__ as version
from .. import __license__ as license, __url__ as url, __version__ as version
from ..utils import views as utils_views
from ..utils.auth import anonymous_is_enabled, has_perm
from ..utils.autoupdate import inform_changed_data, inform_deleted_data


@ -2,6 +2,7 @@ import os
from openslides.utils.plugins import collect_plugins
MODULE_DIR = os.path.realpath(os.path.dirname(os.path.abspath(__file__)))
@ -121,31 +122,14 @@ PASSWORD_HASHERS = [
MEDIA_URL = '/media/'
# Cache
# https://docs.djangoproject.com/en/1.10/topics/cache/
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
'LOCATION': 'openslides-cache',
'OPTIONS': {
'MAX_ENTRIES': 10000
}
}
}
# Django Channels
# http://channels.readthedocs.io/en/latest/
# https://github.com/ostcar/geiss
ASGI_APPLICATION = 'openslides.routing.application'
CHANNEL_LAYERS = {
'default': {
'BACKEND': 'asgiref.inmemory.ChannelLayer',
'ROUTING': 'openslides.routing.channel_routing',
'CONFIG': {
'capacity': 1000,
},
'BACKEND': 'channels.layers.InMemoryChannelLayer',
},
}
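The in-memory layer only works within a single process. For the big mode described above it would be swapped for the Redis layer from the channels_redis package, along these lines (host and port are illustrative)::

    CHANNEL_LAYERS = {
        'default': {
            'BACKEND': 'channels_redis.core.RedisChannelLayer',
            'CONFIG': {
                'hosts': [('localhost', 6379)],
            },
        },
    }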


@ -1,6 +1,5 @@
from django.apps import AppConfig
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -34,7 +33,7 @@ class MediafilesAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
yield Collection(self.get_model('Mediafile').get_collection_string())
yield self.get_model('Mediafile')


@ -1,7 +1,6 @@
from django.apps import AppConfig
from django.db.models.signals import post_migrate
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -60,8 +59,8 @@ class MotionsAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
for model in ('Category', 'Motion', 'MotionBlock', 'Workflow', 'MotionChangeRecommendation'):
yield Collection(self.get_model(model).get_collection_string())
for model_name in ('Category', 'Motion', 'MotionBlock', 'Workflow', 'MotionChangeRecommendation'):
yield self.get_model(model_name)


@ -7,8 +7,11 @@ from django.core.exceptions import ImproperlyConfigured, ValidationError
from django.db import IntegrityError, models, transaction
from django.db.models import Max
from django.utils import formats, timezone
from django.utils.translation import ugettext as _
from django.utils.translation import ugettext_lazy, ugettext_noop
from django.utils.translation import (
ugettext as _,
ugettext_lazy,
ugettext_noop,
)
from jsonfield import JSONField
from openslides.agenda.models import Item


@ -2,6 +2,7 @@ from django.conf.urls import url
from . import views
urlpatterns = [
url(r'^docxtemplate/$',
views.MotionDocxTemplateView.as_view(),


@ -8,8 +8,7 @@ from django.db import IntegrityError, transaction
from django.db.models.deletion import ProtectedError
from django.http import Http404
from django.http.request import QueryDict
from django.utils.translation import ugettext as _
from django.utils.translation import ugettext_noop
from django.utils.translation import ugettext as _, ugettext_noop
from rest_framework import status
from ..core.config import config


@ -1,31 +1,16 @@
from channels.routing import include, route
from channels.routing import ProtocolTypeRouter, URLRouter
from django.conf.urls import url
from openslides.utils.autoupdate import (
send_data_projector,
send_data_site,
ws_add_projector,
ws_add_site,
ws_disconnect_projector,
ws_disconnect_site,
ws_receive_projector,
ws_receive_site,
from openslides.utils.consumers import ProjectorConsumer, SiteConsumer
from openslides.utils.middleware import AuthMiddlewareStack
application = ProtocolTypeRouter({
# WebSocket chat handler
"websocket": AuthMiddlewareStack(
URLRouter([
url(r"^ws/site/$", SiteConsumer),
url(r"^ws/projector/(?P<projector_id>\d+)/$", ProjectorConsumer),
])
)
projector_routing = [
route("websocket.connect", ws_add_projector),
route("websocket.disconnect", ws_disconnect_projector),
route("websocket.receive", ws_receive_projector),
]
site_routing = [
route("websocket.connect", ws_add_site),
route("websocket.disconnect", ws_disconnect_site),
route("websocket.receive", ws_receive_site),
]
channel_routing = [
include(projector_routing, path=r'^/ws/projector/(?P<projector_id>\d+)/$'),
include(site_routing, path=r'^/ws/site/$'),
route("autoupdate.send_data_projector", send_data_projector),
route("autoupdate.send_data_site", send_data_site),
]
})
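`SiteConsumer` and `ProjectorConsumer` come from `openslides.utils.consumers` and are not shown in this commit excerpt. A minimal Channels 2 consumer of the kind such a router dispatches to could look like this (illustrative only, not the actual OpenSlides classes)::

    from channels.generic.websocket import AsyncJsonWebsocketConsumer

    class EchoConsumer(AsyncJsonWebsocketConsumer):
        async def connect(self):
            # Accept the websocket handshake.
            await self.accept()

        async def receive_json(self, content, **kwargs):
            # Echo incoming JSON back to this client.
            await self.send_json(content)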


@ -1,6 +1,5 @@
from django.apps import AppConfig
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -31,7 +30,7 @@ class TopicsAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
yield Collection(self.get_model('Topic').get_collection_string())
yield self.get_model('Topic')


@ -6,6 +6,7 @@ from openslides.mediafiles.views import protected_serve
from openslides.utils.plugins import get_all_plugin_urlpatterns
from openslides.utils.rest_api import router
urlpatterns = get_all_plugin_urlpatterns()
urlpatterns += [


@ -43,16 +43,20 @@ class UserAccessPermissions(BaseAccessPermissions):
"""
return {key: full_data[key] for key in whitelist}
# We have four sets of data to be sent:
# * full data i. e. all fields,
# * many data i. e. all fields but not the default password,
# * little data i. e. all fields but not the default password, comments and active status,
# We have five sets of data to be sent:
# * full data i. e. all fields (including session_auth_hash),
# * all data i. e. all fields but not session_auth_hash,
# * many data i. e. all fields but not the default password and session_auth_hash,
# * little data i. e. all fields but not the default password, session_auth_hash, comments and active status,
# * no data.
# Prepare field set for users with "many" data and with "little" data.
many_data_fields = set(USERCANSEEEXTRASERIALIZER_FIELDS)
many_data_fields.add('groups_id')
many_data_fields.discard('groups')
# Prepare field set for users with "all" data, "many" data and with "little" data.
all_data_fields = set(USERCANSEEEXTRASERIALIZER_FIELDS)
all_data_fields.add('groups_id')
all_data_fields.discard('groups')
all_data_fields.add('default_password')
many_data_fields = all_data_fields.copy()
many_data_fields.discard('default_password')
litte_data_fields = set(USERCANSEESERIALIZER_FIELDS)
litte_data_fields.add('groups_id')
litte_data_fields.discard('groups')
@ -61,7 +65,7 @@ class UserAccessPermissions(BaseAccessPermissions):
if has_perm(user, 'users.can_see_name'):
if has_perm(user, 'users.can_see_extra_data'):
if has_perm(user, 'users.can_manage'):
data = full_data
data = [filtered_data(full, all_data_fields) for full in full_data]
else:
data = [filtered_data(full, many_data_fields) for full in full_data]
else:


@ -2,7 +2,6 @@ from django.apps import AppConfig
from django.conf import settings
from django.contrib.auth.signals import user_logged_in
from ..utils.collection import Collection
from ..utils.projector import register_projector_elements
@ -45,11 +44,11 @@ class UsersAppConfig(AppConfig):
def get_startup_elements(self):
"""
Yields all collections required on startup i. e. opening the websocket
Yields all Cachables required on startup i. e. opening the websocket
connection.
"""
for model in ('User', 'Group', 'PersonalNote'):
yield Collection(self.get_model(model).get_collection_string())
for model_name in ('User', 'Group', 'PersonalNote'):
yield self.get_model(model_name)
def get_angular_constants(self):
from django.contrib.auth.models import Permission


@ -2,11 +2,11 @@ import smtplib
from random import choice
from django.contrib.auth.hashers import make_password
from django.contrib.auth.models import Group as DjangoGroup
from django.contrib.auth.models import GroupManager as _GroupManager
from django.contrib.auth.models import (
AbstractBaseUser,
BaseUserManager,
Group as DjangoGroup,
GroupManager as _GroupManager,
Permission,
PermissionsMixin,
)
@ -286,6 +286,15 @@ class User(RESTModelMixin, PermissionsMixin, AbstractBaseUser):
return False
@property
def session_auth_hash(self):
"""
Returns the session auth hash of a user as attribute.
Needed for the Django REST framework.
"""
return self.get_session_auth_hash()
class GroupManager(_GroupManager):
"""


@ -1,7 +1,6 @@
from django.contrib.auth.hashers import make_password
from django.contrib.auth.models import Permission
from django.utils.translation import ugettext as _
from django.utils.translation import ugettext_lazy
from django.utils.translation import ugettext as _, ugettext_lazy
from ..utils.autoupdate import inform_changed_data
from ..utils.rest_api import (
@ -13,6 +12,7 @@ from ..utils.rest_api import (
)
from .models import Group, PersonalNote, User
USERCANSEESERIALIZER_FIELDS = (
'id',
'username',
@ -52,7 +52,7 @@ class UserFullSerializer(ModelSerializer):
class Meta:
model = User
fields = USERCANSEEEXTRASERIALIZER_FIELDS + ('default_password',)
fields = USERCANSEEEXTRASERIALIZER_FIELDS + ('default_password', 'session_auth_hash')
read_only_fields = ('last_email_send',)
def validate(self, data):


@ -81,7 +81,7 @@ def create_builtin_groups_and_admin(**kwargs):
permission_dict['mediafiles.can_see'],
permission_dict['motions.can_see'],
permission_dict['users.can_see_name'], )
group_default = Group.objects.create(name='Default')
group_default = Group.objects.create(pk=1, name='Default')
group_default.permissions.add(*base_permissions)
# Delegates (pk 2)
@ -99,7 +99,7 @@ def create_builtin_groups_and_admin(**kwargs):
permission_dict['motions.can_create'],
permission_dict['motions.can_support'],
permission_dict['users.can_see_name'], )
group_delegates = Group.objects.create(name='Delegates')
group_delegates = Group.objects.create(pk=2, name='Delegates')
group_delegates.permissions.add(*delegates_permissions)
# Staff (pk 3)
@ -130,7 +130,7 @@ def create_builtin_groups_and_admin(**kwargs):
permission_dict['users.can_manage'],
permission_dict['users.can_see_extra_data'],
permission_dict['mediafiles.can_see_hidden'],)
group_staff = Group.objects.create(name='Staff')
group_staff = Group.objects.create(pk=3, name='Staff')
group_staff.permissions.add(*staff_permissions)
# Admin (pk 4)
@ -164,7 +164,7 @@ def create_builtin_groups_and_admin(**kwargs):
permission_dict['users.can_manage'],
permission_dict['users.can_see_extra_data'],
permission_dict['mediafiles.can_see_hidden'],)
group_admin = Group.objects.create(name='Admin')
group_admin = Group.objects.create(pk=4, name='Admin')
group_admin.permissions.add(*admin_permissions)
# Add users.can_see_name permission to staff/admin


@ -2,6 +2,7 @@ from django.conf.urls import url
from . import views
urlpatterns = [
# Auth
url(r'^login/$',


@ -1,10 +1,13 @@
import smtplib
from typing import List # noqa
from asgiref.sync import async_to_sync
from django.conf import settings
from django.contrib.auth import login as auth_login
from django.contrib.auth import logout as auth_logout
from django.contrib.auth import update_session_auth_hash
from django.contrib.auth import (
login as auth_login,
logout as auth_logout,
update_session_auth_hash,
)
from django.contrib.auth.forms import AuthenticationForm
from django.contrib.auth.password_validation import validate_password
from django.core import mail
@ -15,13 +18,17 @@ from django.utils.translation import ugettext as _
from ..core.config import config
from ..core.signals import permission_change
from ..utils.auth import anonymous_is_enabled, has_perm
from ..utils.auth import (
anonymous_is_enabled,
has_perm,
user_to_collection_user,
)
from ..utils.autoupdate import (
inform_changed_data,
inform_data_collection_element_list,
)
from ..utils.cache import restricted_data_cache
from ..utils.collection import CollectionElement
from ..utils.cache import element_cache
from ..utils.collection import Collection, CollectionElement
from ..utils.rest_api import (
ModelViewSet,
Response,
@ -103,7 +110,7 @@ class UserViewSet(ModelViewSet):
del request.data[key]
response = super().update(request, *args, **kwargs)
# Maybe some group assignments have changed. Better delete the restricted user cache
restricted_data_cache.del_user(user.id)
async_to_sync(element_cache.del_user)(user_to_collection_user(user))
return response
def destroy(self, request, *args, **kwargs):
@ -303,7 +310,7 @@ class GroupViewSet(ModelViewSet):
# Delete the user caches of all affected users
for user in group.user_set.all():
restricted_data_cache.del_user(user.id)
async_to_sync(element_cache.del_user)(user_to_collection_user(user))
def diff(full, part):
"""
@ -321,8 +328,8 @@ class GroupViewSet(ModelViewSet):
collection_elements = [] # type: List[CollectionElement]
signal_results = permission_change.send(None, permissions=new_permissions, action='added')
for receiver, signal_collections in signal_results:
for collection in signal_collections:
collection_elements.extend(collection.element_generator())
for cachable in signal_collections:
collection_elements.extend(Collection(cachable.get_collection_string()).element_generator())
inform_data_collection_element_list(collection_elements)
# TODO: Some permissions are deleted.


@ -1,9 +1,10 @@
from typing import Optional, Union
from typing import Dict, Optional, Union, cast
from django.contrib.auth import get_user_model
from django.contrib.auth.models import AnonymousUser
from django.db.models import Model
from .cache import element_cache
from .collection import CollectionElement
@ -46,6 +47,18 @@ def anonymous_is_enabled() -> bool:
return config['general_system_enable_anonymous']
async def async_anonymous_is_enabled() -> bool:
"""
Like anonymous_is_enabled but async.
"""
from ..core.config import config
if config.key_to_id is None:
await config.build_key_to_id()
config.key_to_id = cast(Dict[str, int], config.key_to_id)
element = await element_cache.get_element_full_data(config.get_collection_string(), config.key_to_id['general_system_enable_anonymous'])
return False if element is None else element['value']
AnyUser = Union[Model, CollectionElement, int, AnonymousUser, None]
@ -75,6 +88,10 @@ def user_to_collection_user(user: AnyUser) -> Optional[CollectionElement]:
"Unsupported type for user. Only CollectionElements for users can be"
"used. Not {}".format(user.collection_string))
elif isinstance(user, int):
# user 0 means anonymous
if user == 0:
user = None
else:
user = CollectionElement.from_values(User.get_collection_string(), user)
elif isinstance(user, AnonymousUser):
user = None


@ -1,366 +1,14 @@
import json
import threading
import time
import warnings
from collections import OrderedDict, defaultdict
from typing import Any, Dict, Generator, Iterable, List, Optional, Tuple, Union
from collections import OrderedDict
from typing import Any, Dict, Iterable, List, Optional, Tuple, Union
from channels import Channel, Group
from channels.asgi import get_channel_layer
from channels.auth import channel_session_user, channel_session_user_from_http
from django.apps import apps
from django.core.exceptions import ObjectDoesNotExist
from django.db import transaction
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.conf import settings
from django.db.models import Model
from ..core.config import config
from ..core.models import Projector
from .auth import anonymous_is_enabled, has_perm, user_to_collection_user
from .cache import restricted_data_cache, websocket_user_cache
from .collection import AutoupdateFormat # noqa
from .collection import (
ChannelMessageFormat,
Collection,
CollectionElement,
format_for_autoupdate,
from_channel_message,
to_channel_message,
)
def send_or_wait(send_func: Any, *args: Any, **kwargs: Any) -> None:
"""
Wrapper for channels' send() method.
If the method send() raises ChannelFull exception the worker waits for 20
milliseconds and tries again. After 5 seconds it gives up, drops the
channel message and writes a warning to stderr.
Django channels' consumer atomicity feature is disabled.
"""
kwargs['immediately'] = True
for i in range(250):
try:
send_func(*args, **kwargs)
except get_channel_layer().ChannelFull:
time.sleep(0.02)
else:
break
else:
warnings.warn(
'Channel layer is full. Channel message dropped.',
RuntimeWarning
)
@channel_session_user_from_http
def ws_add_site(message: Any) -> None:
"""
Adds the websocket connection to a group specific to the connecting user.
The group with the name 'user-None' stands for all anonymous users.
Send all "startup-data" through the connection.
"""
if not anonymous_is_enabled() and not message.user.id:
send_or_wait(message.reply_channel.send, {'accept': False})
return
Group('site').add(message.reply_channel)
message.channel_session['user_id'] = message.user.id
# Saves the reply channel to the user. Uses 0 for anonymous users.
websocket_user_cache.add(message.user.id or 0, message.reply_channel.name)
# Open the websocket connection.
send_or_wait(message.reply_channel.send, {'accept': True})
# Collect all elements that should be sent to the client when the websocket
# connection is established.
user = user_to_collection_user(message.user.id)
user_id = user.id if user is not None else 0
if restricted_data_cache.exists_for_user(user_id):
output = restricted_data_cache.get_data(user_id)
else:
output = []
for collection in get_startup_collections():
access_permissions = collection.get_access_permissions()
restricted_data = access_permissions.get_restricted_data(collection.get_full_data(), user)
for data in restricted_data:
if data is None:
# We do not want to send 'deleted' objects on startup.
# That's why we skip such data.
continue
formatted_data = format_for_autoupdate(
collection_string=collection.collection_string,
id=data['id'],
action='changed',
data=data)
output.append(formatted_data)
# Cache restricted data for user
restricted_data_cache.add_element(
user_id,
collection.collection_string,
data['id'],
formatted_data)
# Send all data.
if output:
send_or_wait(message.reply_channel.send, {'text': json.dumps(output)})
@channel_session_user
def ws_disconnect_site(message: Any) -> None:
"""
This function is called when a client on the site disconnects.
"""
Group('site').discard(message.reply_channel)
websocket_user_cache.remove(message.user.id or 0, message.reply_channel.name)
@channel_session_user
def ws_receive_site(message: Any) -> None:
"""
If we receive something from the client we currently just interpret this
as a notify message.
The server adds the sender's user id (0 for anonymous) and reply
channel name so that a receiver client may reply to the sender or to all
sender's instances.
"""
try:
incomming = json.loads(message.content['text'])
except ValueError:
# Message content is invalid. Just do nothing.
pass
else:
if isinstance(incomming, list):
notify(
incomming,
senderReplyChannelName=message.reply_channel.name,
senderUserId=message.user.id or 0)
def notify(incomming: List[Dict[str, Any]], **attributes: Any) -> None:
"""
The incomming should be a list of notify elements. Every item is broadcasted
to the given users, channels or projectors. If none is given, the message is
send to each site client.
"""
# Parse all items
receivers_users = defaultdict(list) # type: Dict[int, List[Any]]
receivers_projectors = defaultdict(list) # type: Dict[int, List[Any]]
receivers_reply_channels = defaultdict(list) # type: Dict[str, List[Any]]
items_for_all = []
for item in incomming:
if item.get('collection') == 'notify':
use_receivers_dict = False
for key, value in attributes.items():
item[key] = value
# Force the params to be a dict
if not isinstance(item.get('params'), dict):
item['params'] = {}
users = item.get('users')
if isinstance(users, list):
# Send this item only to all reply channels of some site users.
for user_id in users:
receivers_users[user_id].append(item)
use_receivers_dict = True
projectors = item.get('projectors')
if isinstance(projectors, list):
# Send this item only to all reply channels of some projectors.
for projector_id in projectors:
receivers_projectors[projector_id].append(item)
use_receivers_dict = True
reply_channels = item.get('replyChannels')
if isinstance(reply_channels, list):
# Send this item only to some reply channels.
for reply_channel_name in reply_channels:
receivers_reply_channels[reply_channel_name].append(item)
use_receivers_dict = True
if not use_receivers_dict:
# Send this item to all reply channels.
items_for_all.append(item)
# Send all items
for user_id, channel_names in websocket_user_cache.get_all().items():
output = receivers_users[user_id]
if len(output) > 0:
for channel_name in channel_names:
send_or_wait(Channel(channel_name).send, {'text': json.dumps(output)})
for channel_name, output in receivers_reply_channels.items():
if len(output) > 0:
send_or_wait(Channel(channel_name).send, {'text': json.dumps(output)})
for projector_id, output in receivers_projectors.items():
if len(output) > 0:
send_or_wait(Group('projector-{}'.format(projector_id)).send, {'text': json.dumps(output)})
if len(items_for_all) > 0:
send_or_wait(Group('site').send, {'text': json.dumps(items_for_all)})
@channel_session_user_from_http
def ws_add_projector(message: Any, projector_id: int) -> None:
"""
Adds the websocket connection to a group specific to the projector with the given id.
Also sends all data that are shown on the projector.
"""
user = user_to_collection_user(message.user.id)
if not has_perm(user, 'core.can_see_projector'):
send_or_wait(message.reply_channel.send, {'text': 'No permissions to see this projector.'})
else:
try:
projector = Projector.objects.get(pk=projector_id)
except Projector.DoesNotExist:
send_or_wait(message.reply_channel.send, {'text': 'The projector {} does not exist.'.format(projector_id)})
else:
# At first, the client is added to the projector group, so it is
# informed if the data change.
Group('projector-{}'.format(projector_id)).add(message.reply_channel)
# Then it is also added to the global projector group which is
# used for broadcasting data.
Group('projector-all').add(message.reply_channel)
# Now check whether broadcast is active at the moment. If yes,
# change the local projector variable.
if config['projector_broadcast'] > 0:
projector = Projector.objects.get(pk=config['projector_broadcast'])
# Collect all elements that are on the projector.
output = [] # type: List[AutoupdateFormat]
for requirement in projector.get_all_requirements():
required_collection_element = CollectionElement.from_instance(requirement)
output.append(required_collection_element.as_autoupdate_for_projector())
# Collect all config elements.
config_collection = Collection(config.get_collection_string())
projector_data = (config_collection.get_access_permissions()
.get_projector_data(config_collection.get_full_data()))
for data in projector_data:
output.append(format_for_autoupdate(
config_collection.collection_string,
data['id'],
'changed',
data))
# Collect the projector instance.
collection_element = CollectionElement.from_instance(projector)
output.append(collection_element.as_autoupdate_for_projector())
# Send all the data that were only collected before.
send_or_wait(message.reply_channel.send, {'text': json.dumps(output)})
def ws_disconnect_projector(message: Any, projector_id: int) -> None:
"""
This function is called when a client on the projector disconnects.
"""
Group('projector-{}'.format(projector_id)).discard(message.reply_channel)
Group('projector-all').discard(message.reply_channel)
def ws_receive_projector(message: Any, projector_id: int) -> None:
"""
If we receive something from the client we currently just interpret this
as a notify message.
The server adds the sender's projector id and reply channel name so that
a receiver client may reply to the sender or to all sender's instances.
"""
try:
incomming = json.loads(message.content['text'])
except ValueError:
# Message content is invalid. Just do nothing.
pass
else:
if isinstance(incomming, list):
notify(
incomming,
senderReplyChannelName=message.reply_channel.name,
senderProjectorId=projector_id)
def send_data_projector(message: ChannelMessageFormat) -> None:
"""
Informs all projector clients about changed data.
"""
collection_elements = from_channel_message(message)
# Check whether broadcast is active at the moment and set the local
# projector queryset.
if config['projector_broadcast'] > 0:
queryset = Projector.objects.filter(pk=config['projector_broadcast'])
else:
queryset = Projector.objects.all()
# Loop over all projectors and send data that they need.
for projector in queryset:
output = []
for collection_element in collection_elements:
if collection_element.is_deleted():
output.append(collection_element.as_autoupdate_for_projector())
else:
for element in projector.get_collection_elements_required_for_this(collection_element):
output.append(element.as_autoupdate_for_projector())
if output:
if config['projector_broadcast'] > 0:
send_or_wait(
Group('projector-all').send,
{'text': json.dumps(output)})
else:
send_or_wait(
Group('projector-{}'.format(projector.pk)).send,
{'text': json.dumps(output)})
def send_data_site(message: ChannelMessageFormat) -> None:
"""
Informs all site users about changed data.
"""
collection_elements = from_channel_message(message)
# Send data to site users.
for user_id, channel_names in websocket_user_cache.get_all().items():
if not user_id:
# Anonymous user
user = None
else:
try:
user = user_to_collection_user(user_id)
except ObjectDoesNotExist:
# The user does not exist. Skip him/her.
continue
output = []
for collection_element in collection_elements:
formatted_data = collection_element.as_autoupdate_for_user(user)
if formatted_data['action'] == 'changed':
restricted_data_cache.update_element(
user_id or 0,
collection_element.collection_string,
collection_element.id,
formatted_data)
else:
restricted_data_cache.del_element(
user_id or 0,
collection_element.collection_string,
collection_element.id)
output.append(formatted_data)
for channel_name in channel_names:
send_or_wait(Channel(channel_name).send, {'text': json.dumps(output)})
from .cache import element_cache, get_element_id
from .collection import CollectionElement, to_channel_message
def to_ordered_dict(d: Optional[Dict]) -> Optional[OrderedDict]:
@ -377,7 +25,7 @@ def to_ordered_dict(d: Optional[Dict]) -> Optional[OrderedDict]:
def inform_changed_data(instances: Union[Iterable[Model], Model], information: Dict[str, Any] = None) -> None:
"""
Informs the autoupdate system and the caching system about the creation or
update of an element. This is done via the AutoupdateBundleMiddleware.
update of an element.
The argument instances can be one instance or an iterable over instances.
"""
@ -392,29 +40,31 @@ def inform_changed_data(instances: Union[Iterable[Model], Model], information: D
# Instance has no method get_root_rest_element. Just ignore it.
pass
# Put all collection elements into the autoupdate_bundle.
bundle = autoupdate_bundle.get(threading.get_ident())
if bundle is not None:
# Run autoupdate only if the bundle exists because we are in a request-response-cycle.
collection_elements = {}
for root_instance in root_instances:
collection_element = CollectionElement.from_instance(
root_instance,
information=information)
key = root_instance.get_collection_string() + str(root_instance.get_rest_pk()) + str(to_ordered_dict(information))
bundle[key] = collection_element
collection_elements[key] = collection_element
bundle = autoupdate_bundle.get(threading.get_ident())
if bundle is not None:
# Put all collection elements into the autoupdate_bundle.
bundle.update(collection_elements)
else:
# Send autoupdate directly
async_to_sync(send_autoupdate)(collection_elements.values())
def inform_deleted_data(elements: Iterable[Tuple[str, int]], information: Dict[str, Any] = None) -> None:
"""
Informs the autoupdate system and the caching system about the deletion of
elements. This is done via the AutoupdateBundleMiddleware.
elements.
The argument information is added to each collection element.
"""
# Put all stuff to be deleted into the autoupdate_bundle.
bundle = autoupdate_bundle.get(threading.get_ident())
if bundle is not None:
# Run autoupdate only if the bundle exists because we are in a request-response-cycle.
collection_elements = {} # type: Dict[str, Any]
for element in elements:
collection_element = CollectionElement.from_values(
collection_string=element[0],
@ -422,7 +72,15 @@ def inform_deleted_data(elements: Iterable[Tuple[str, int]], information: Dict[s
deleted=True,
information=information)
key = element[0] + str(element[1]) + str(to_ordered_dict(information))
bundle[key] = collection_element
collection_elements[key] = collection_element
bundle = autoupdate_bundle.get(threading.get_ident())
if bundle is not None:
# Put all collection elements into the autoupdate_bundle.
bundle.update(collection_elements)
else:
# Send autoupdate directly
async_to_sync(send_autoupdate)(collection_elements.values())
def inform_data_collection_element_list(collection_elements: List[CollectionElement],
@ -431,13 +89,18 @@ def inform_data_collection_element_list(collection_elements: List[CollectionElem
Informs the autoupdate system about some collection elements. This is
used just to send some data to all users.
"""
# Put all stuff into the autoupdate_bundle.
bundle = autoupdate_bundle.get(threading.get_ident())
if bundle is not None:
# Run autoupdate only if the bundle exists because we are in a request-response-cycle.
elements = {}
for collection_element in collection_elements:
key = collection_element.collection_string + str(collection_element.id) + str(to_ordered_dict(information))
bundle[key] = collection_element
elements[key] = collection_element
bundle = autoupdate_bundle.get(threading.get_ident())
if bundle is not None:
# Put all collection elements into the autoupdate_bundle.
bundle.update(elements)
else:
# Send autoupdate directly
async_to_sync(send_autoupdate)(elements.values())
"""
@ -461,41 +124,48 @@ class AutoupdateBundleMiddleware:
response = self.get_response(request)
bundle = autoupdate_bundle.pop(thread_id) # type: Dict[str, CollectionElement]
# If currently there is an open database transaction, then the
# send_autoupdate function is only called, when the transaction is
# commited. If there is currently no transaction, then the function
# is called immediately.
transaction.on_commit(lambda: send_autoupdate(bundle.values()))
async_to_sync(send_autoupdate)(bundle.values())
return response
def send_autoupdate(collection_elements: Iterable[CollectionElement]) -> None:
async def send_autoupdate(collection_elements: Iterable[CollectionElement]) -> None:
"""
Helper function that sends collection_elements through a channel to the
autoupdate system.
Also updates the redis cache.
Does nothing if collection_elements is empty.
"""
if collection_elements:
send_or_wait(
Channel('autoupdate.send_data_projector').send,
to_channel_message(collection_elements))
send_or_wait(
Channel('autoupdate.send_data_site').send,
to_channel_message(collection_elements))
cache_elements = {} # type: Dict[str, Optional[Dict[str, Any]]]
for element in collection_elements:
element_id = get_element_id(element.collection_string, element.id)
if element.is_deleted():
cache_elements[element_id] = None
else:
cache_elements[element_id] = element.get_full_data()
if not getattr(settings, 'SKIP_CACHE', False):
# Hack for django 2.0 and channels 2.1 to stay in the same thread.
# This is needed for the tests.
change_id = await element_cache.change_elements(cache_elements)
else:
change_id = 1
def get_startup_collections() -> Generator[Collection, None, None]:
"""
Returns all Collections that should be sent to the user at startup
"""
for app in apps.get_app_configs():
try:
# Get the method get_startup_elements() from an app.
# This method has to return an iterable of Collection objects.
get_startup_elements = app.get_startup_elements
except AttributeError:
# Skip apps that do not implement get_startup_elements.
continue
yield from get_startup_elements()
channel_layer = get_channel_layer()
# TODO: don't await. They can be sent in parallel
await channel_layer.group_send(
"projector",
{
"type": "send_data",
"message": to_channel_message(collection_elements),
},
)
await channel_layer.group_send(
"site",
{
"type": "send_data",
"change_id": change_id,
},
)
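`group_send` delivers the message to every consumer in the named group and dispatches it to the handler method matching the `type` key. A sketch of the receiving side (hypothetical; the real handlers live in `openslides.utils.consumers`)::

    from channels.generic.websocket import AsyncJsonWebsocketConsumer

    class SiteConsumer(AsyncJsonWebsocketConsumer):
        async def send_data(self, event):
            # Invoked for group messages with {"type": "send_data", ...}.
            await self.send_json({'change_id': event['change_id']})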


@ -1,506 +1,417 @@
import asyncio
import json
from collections import defaultdict
from typing import ( # noqa
from datetime import datetime
from typing import (
TYPE_CHECKING,
Any,
Callable,
Dict,
Generator,
Iterable,
List,
Optional,
Set,
Tuple,
Type,
Union,
)
from channels import Group
from channels.sessions import session_for_reply_channel
from asgiref.sync import sync_to_async
from channels.db import database_sync_to_async
from django.conf import settings
from django.core.cache import cache, caches
from .cache_providers import (
BaseCacheProvider,
Cachable,
MemmoryCacheProvider,
RedisCacheProvider,
get_all_cachables,
no_redis_dependency,
)
from .utils import get_element_id, get_user_id, split_element_id
if TYPE_CHECKING:
# Dummy import Collection for mypy
from .collection import Collection # noqa
UserCacheDataType = Dict[int, Set[str]]
# Dummy import Collection for mypy, can be fixed with python 3.7
from .collection import CollectionElement # noqa
class BaseWebsocketUserCache:
class ElementCache:
"""
Caches the reply channel names of all open websocket connections. The id of
the user that that opened the connection is used as reference.
Cache for the CollectionElements.
This is the base cache that has to be overridden.
"""
cache_key = 'current_websocket_users'
Saves the full_data and, if enabled, the restricted data.
def add(self, user_id: int, channel_name: str) -> None:
"""
Adds a channel name to an user id.
"""
raise NotImplementedError()
There is one redis Hash (similar to a python dict) for the full_data and one
Hash for every user.
def remove(self, user_id: int, channel_name: str) -> None:
"""
Removes one channel name from the cache.
"""
raise NotImplementedError()
The key of the Hashes is COLLECTIONSTRING:ID, where COLLECTIONSTRING is the
collection_string of a collection and ID is the id of an element.
def get_all(self) -> UserCacheDataType:
"""
Returns all data using a dict where the key is a user id and the value
is a set of channel_names.
"""
raise NotImplementedError()
All elements have to be in the cache. If one element is missing, the cache
is invalid, but this cannot be detected. When a plugin with a new
collection is added to OpenSlides, the cache has to be rebuilt manually.
def save_data(self, data: UserCacheDataType) -> None:
"""
Saves the full data set (like created with build_data) to the cache.
"""
raise NotImplementedError()
There is a sorted set in redis with the change id as score. The values are
COLLECTIONSTRING:ID for the elements that have been changed with that change
id. With this key it is possible to get all elements as full_data or as
restricted_data that are newer than a specific change id.
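The element_id scheme and its split helper live in .utils; the bodies below are only a sketch of the assumed COLLECTIONSTRING:ID format (the real split_element_id also accepts bytes):

    from typing import Tuple

    def get_element_id(collection_string: str, id: int) -> str:
        # 'motions/motion', 42 -> 'motions/motion:42'
        return '{}:{}'.format(collection_string, id)

    def split_element_id(element_id: str) -> Tuple[str, int]:
        # 'motions/motion:42' -> ('motions/motion', 42)
        collection_string, id = element_id.rsplit(':', 1)
        return (collection_string, int(id))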
def build_data(self) -> UserCacheDataType:
"""
Creates all the data, saves it to the cache and returns it.
"""
websocket_user_ids = defaultdict(set) # type: UserCacheDataType
for channel_name in Group('site').channel_layer.group_channels('site'):
session = session_for_reply_channel(channel_name)
user_id = session.get('user_id', None)
websocket_user_ids[user_id or 0].add(channel_name)
self.save_data(websocket_user_ids)
return websocket_user_ids
def get_cache_key(self) -> str:
"""
Returns the cache key.
"""
return self.cache_key
class RedisWebsocketUserCache(BaseWebsocketUserCache):
"""
Implementation of the WebsocketUserCache that uses redis.
This uses one cache key to store all connected user ids in a set and
for each user another set to save the channel names.
All methods of this class are async. You either have to call them with
await in an async environment or use asgiref.sync.async_to_sync().
"""
def add(self, user_id: int, channel_name: str) -> None:
def __init__(
self,
redis: str,
use_restricted_data_cache: bool = False,
cache_provider_class: Type[BaseCacheProvider] = RedisCacheProvider,
cachable_provider: Callable[[], List[Cachable]] = get_all_cachables,
start_time: int = None) -> None:
"""
Adds a channel name to a user id.
"""
redis = get_redis_connection()
pipe = redis.pipeline()
pipe.sadd(self.get_cache_key(), user_id)
pipe.sadd(self.get_user_cache_key(user_id), channel_name)
pipe.execute()
Initializes the cache.
def remove(self, user_id: int, channel_name: str) -> None:
When restricted_data_cache is false, no restricted data is saved.
"""
Removes one channel name from the cache.
"""
redis = get_redis_connection()
redis.srem(self.get_user_cache_key(user_id), channel_name)
self.use_restricted_data_cache = use_restricted_data_cache
self.cache_provider = cache_provider_class(redis)
self.cachable_provider = cachable_provider
self._cachables = None # type: Optional[Dict[str, Cachable]]
def get_all(self) -> UserCacheDataType:
# Start time is used as first change_id if there is none in redis
if start_time is None:
start_time = int((datetime.utcnow() - datetime(1970, 1, 1)).total_seconds())
self.start_time = start_time
# Contains Futures to ensure that only one client updates the restricted_data.
self.restricted_data_cache_updater = {} # type: Dict[int, asyncio.Future]
@property
def cachables(self) -> Dict[str, Cachable]:
"""
Returns all data using a dict where the key is a user id and the value
is a set of channel_names.
Returns all Cachables as a dict where the key is the collection_string of the cachable.
"""
redis = get_redis_connection()
user_ids = redis.smembers(self.get_cache_key()) # type: Optional[List[str]]
if user_ids is None:
websocket_user_ids = self.build_data()
# This method is necessary to lazy-load the cachables
if self._cachables is None:
self._cachables = {cachable.get_collection_string(): cachable for cachable in self.cachable_provider()}
return self._cachables
async def save_full_data(self, db_data: Dict[str, List[Dict[str, Any]]]) -> None:
"""
Saves the full data.
"""
mapping = {}
for collection_string, elements in db_data.items():
for element in elements:
mapping.update(
{get_element_id(collection_string, element['id']):
json.dumps(element)})
await self.cache_provider.reset_full_cache(mapping)
async def build_full_data(self) -> Dict[str, List[Dict[str, Any]]]:
"""
Build or rebuild the full_data cache.
"""
db_data = {} # type: Dict[str, List[Dict[str, Any]]]
for collection_string, cachable in self.cachables.items():
db_data[collection_string] = await database_sync_to_async(cachable.get_elements)()
await self.save_full_data(db_data)
return db_data
async def exists_full_data(self) -> bool:
"""
Returns True, if the full_data_cache exists.
"""
return await self.cache_provider.data_exists()
async def change_elements(
self, elements: Dict[str, Optional[Dict[str, Any]]]) -> int:
"""
Changes elements in the cache.
elements is a dict mapping element_ids to the changed elements. When a value
is None, the element is interpreted as deleted.
Returns the newly generated change_id.
"""
if not await self.exists_full_data():
await self.build_full_data()
deleted_elements = []
changed_elements = []
for element_id, data in elements.items():
if data:
# The arguments for redis.hset are pairs of key and value
changed_elements.append(element_id)
changed_elements.append(json.dumps(data))
else:
websocket_user_ids = dict()
for redis_user_id in user_ids:
# Redis returns the id as string. So we have to convert it
user_id = int(redis_user_id)
channel_names = redis.smembers(self.get_user_cache_key(user_id)) # type: Optional[List[str]]
if channel_names is not None:
# If channel_names is empty, then we can assume that the user
# has no active connection.
websocket_user_ids[user_id] = set(channel_names)
return websocket_user_ids
deleted_elements.append(element_id)
def save_data(self, data: UserCacheDataType) -> None:
if changed_elements:
await self.cache_provider.add_elements(changed_elements)
if deleted_elements:
await self.cache_provider.del_elements(deleted_elements)
# TODO: The provider has to define the new change_id with lua. Otherwise
# it is possible that two changes get the same id (which
# would not be a big problem).
change_id = await self.get_next_change_id()
await self.cache_provider.add_changed_elements(change_id, elements.keys())
return change_id
async def get_all_full_data(self) -> Dict[str, List[Dict[str, Any]]]:
"""
Saves the full data set (like created with the method build_data()) to
the cache.
Returns all full_data. If it does not exist, it is created.
The returned value is a dict where the key is the collection_string and
the value is a list of data.
"""
redis = get_redis_connection()
pipe = redis.pipeline()
# Save all user ids
pipe.delete(self.get_cache_key())
pipe.sadd(self.get_cache_key(), *data.keys())
for user_id, channel_names in data.items():
pipe.delete(self.get_user_cache_key(user_id))
pipe.sadd(self.get_user_cache_key(user_id), *channel_names)
pipe.execute()
def get_cache_key(self) -> str:
"""
Returns the cache key.
"""
return cache.make_key(self.cache_key)
def get_user_cache_key(self, user_id: int) -> str:
"""
Returns a cache key to save the channel names for a specific user.
"""
return cache.make_key('{}:{}'.format(self.cache_key, user_id))
class DjangoCacheWebsocketUserCache(BaseWebsocketUserCache):
"""
Implementation of the WebsocketUserCache that uses the django cache.
If you use this with the inmemory cache, then you should only use one
worker.
This uses only one cache key to save a dict where the key is the user id and
the value is a set of channel names.
"""
def add(self, user_id: int, channel_name: str) -> None:
"""
Adds a channel name for a user using the django cache.
"""
websocket_user_ids = cache.get(self.get_cache_key())
if websocket_user_ids is None:
websocket_user_ids = dict()
if user_id in websocket_user_ids:
websocket_user_ids[user_id].add(channel_name)
if not await self.exists_full_data():
out = await self.build_full_data()
else:
websocket_user_ids[user_id] = set([channel_name])
cache.set(self.get_cache_key(), websocket_user_ids)
out = defaultdict(list)
full_data = await self.cache_provider.get_all_data()
for element_id, data in full_data.items():
collection_string, __ = split_element_id(element_id)
out[collection_string].append(json.loads(data.decode()))
return dict(out)
def remove(self, user_id: int, channel_name: str) -> None:
async def get_full_data(
self, change_id: int = 0) -> Tuple[Dict[str, List[Dict[str, Any]]], List[str]]:
"""
Removes one channel name from the django cache.
Returns all full_data since change_id. If it does not exist, it is created.
Returns two values inside a tuple. The first value is a dict where the
key is the collection_string and the value is a list of data. The second
is a list of element_ids with deleted elements.
Only returns elements with the change_id or newer. When change_id is 0,
all elements are returned.
Raises a RuntimeError when the lowest change_id in redis is higher than
the requested change_id. In this case the method has to be rerun with
change_id=0. This is important because there could be deleted elements
that the cache does not know about.
"""
websocket_user_ids = cache.get(self.get_cache_key())
if websocket_user_ids is not None and user_id in websocket_user_ids:
websocket_user_ids[user_id].discard(channel_name)
cache.set(self.get_cache_key(), websocket_user_ids)
if change_id == 0:
return (await self.get_all_full_data(), [])
def get_all(self) -> UserCacheDataType:
lowest_change_id = await self.get_lowest_change_id()
if change_id < lowest_change_id:
# When change_id is lower than the lowest change_id in redis, we can
# not inform the user about deleted elements.
raise RuntimeError(
"change_id {} is lower than the lowest change_id in redis {}. "
"Catch this exception and rerun the method with change_id=0."
.format(change_id, lowest_change_id))
if not await self.exists_full_data():
# If the cache does not exist, create it.
await self.build_full_data()
raw_changed_elements, deleted_elements = await self.cache_provider.get_data_since(change_id)
return (
{collection_string: [json.loads(value.decode()) for value in value_list]
for collection_string, value_list in raw_changed_elements.items()},
deleted_elements)
async def get_element_full_data(self, collection_string: str, id: int) -> Optional[Dict[str, Any]]:
"""
Returns the data using the django cache.
Returns one element as full data.
If the cache is empty, it is created.
Returns None if the element does not exist.
"""
websocket_user_ids = cache.get(self.get_cache_key())
if websocket_user_ids is None:
return self.build_data()
return websocket_user_ids
if not await self.exists_full_data():
await self.build_full_data()
def save_data(self, data: UserCacheDataType) -> None:
"""
Saves the data using the django cache.
"""
cache.set(self.get_cache_key(), data)
element = await self.cache_provider.get_element(get_element_id(collection_string, id))
class FullDataCache:
"""
Caches all data as full data.
Helps to get all data from one collection.
"""
base_cache_key = 'full_data_cache'
def build_for_collection(self, collection_string: str) -> None:
"""
Build the cache for collection from a django model.
Rebuilds the cache for that collection, if it already exists.
"""
redis = get_redis_connection()
pipe = redis.pipeline()
# Clear the cache for collection
pipe.delete(self.get_cache_key(collection_string))
# Save all elements
from .collection import get_model_from_collection_string
model = get_model_from_collection_string(collection_string)
try:
query = model.objects.get_full_queryset()
except AttributeError:
# If the model does not have the method get_full_queryset(), then use
# the default queryset from django.
query = model.objects
# Build a dict from the instance id to the full_data
mapping = {instance.pk: json.dumps(model.get_access_permissions().get_full_data(instance))
for instance in query.all()}
if mapping:
# Save the dict into a redis map, if there is at least one value
pipe.hmset(
self.get_cache_key(collection_string),
mapping)
pipe.execute()
def add_element(self, collection_string: str, id: int, data: Dict[str, Any]) -> None:
"""
Adds one element to the cache. If the cache does not exists for the collection,
it is created.
"""
redis = get_redis_connection()
# If the cache does not exist for the collection, then create it first.
if not self.exists_for_collection(collection_string):
self.build_for_collection(collection_string)
redis.hset(
self.get_cache_key(collection_string),
id,
json.dumps(data))
def del_element(self, collection_string: str, id: int) -> None:
"""
Removes one element from the cache.
Does nothing if the cache does not exist.
"""
redis = get_redis_connection()
redis.hdel(
self.get_cache_key(collection_string),
id)
def exists_for_collection(self, collection_string: str) -> bool:
"""
Returns True if the cache for the collection exists, else False.
"""
redis = get_redis_connection()
return redis.exists(self.get_cache_key(collection_string))
def get_data(self, collection_string: str) -> List[Dict[str, Any]]:
"""
Returns all data for the collection.
"""
redis = get_redis_connection()
return [json.loads(element.decode()) for element in redis.hvals(self.get_cache_key(collection_string))]
def get_element(self, collection_string: str, id: int) -> Dict[str, Any]:
"""
Returns one element from the collection.
Raises model.DoesNotExist if the element is not in the cache.
"""
redis = get_redis_connection()
element = redis.hget(self.get_cache_key(collection_string), id)
if element is None:
from .collection import get_model_from_collection_string
model = get_model_from_collection_string(collection_string)
raise model.DoesNotExist(collection_string, id)
return None
return json.loads(element.decode())
def get_cache_key(self, collection_string: str) -> str:
async def exists_restricted_data(self, user: Optional['CollectionElement']) -> bool:
"""
Returns the cache key for a collection.
Returns True, if the restricted_data exists for the user.
"""
return cache.make_key('{}:{}'.format(self.base_cache_key, collection_string))
class DummyFullDataCache:
"""
Dummy FullDataCache that does nothing.
"""
def build_for_collection(self, collection_string: str) -> None:
pass
def add_element(self, collection_string: str, id: int, data: Dict[str, Any]) -> None:
pass
def del_element(self, collection_string: str, id: int) -> None:
pass
def exists_for_collection(self, collection_string: str) -> bool:
if not self.use_restricted_data_cache:
return False
def get_data(self, collection_string: str) -> List[Dict[str, Any]]:
from .collection import get_model_from_collection_string
model = get_model_from_collection_string(collection_string)
try:
query = model.objects.get_full_queryset()
except AttributeError:
# If the model does not have the method get_full_queryset(), then use
# the default queryset from django.
query = model.objects
return await self.cache_provider.data_exists(get_user_id(user))
return [model.get_access_permissions().get_full_data(instance)
for instance in query.all()]
def get_element(self, collection_string: str, id: int) -> Dict[str, Any]:
from .collection import get_model_from_collection_string
model = get_model_from_collection_string(collection_string)
try:
query = model.objects.get_full_queryset()
except AttributeError:
# If the model does not have the method get_full_queryset(), then use
# the default queryset from django.
query = model.objects
return model.get_access_permissions().get_full_data(query.get(pk=id))
class RestrictedDataCache:
async def del_user(self, user: Optional['CollectionElement']) -> None:
"""
Caches all data for a specific users.
Helps to get all data from all collections for a specific user.
The cached values are expected to be formatted for output via websocket.
Removes one user from the restricted_data_cache.
"""
await self.cache_provider.del_restricted_data(get_user_id(user))
base_cache_key = 'restricted_user_cache'
def update_element(self, user_id: int, collection_string: str, id: int, data: object) -> None:
async def update_restricted_data(
self, user: Optional['CollectionElement']) -> None:
"""
Adds one element to the cache only if the cache exists for the user.
Note: This method is not atomic. So in very rare cases it is possible
that the restricted data cache can become corrupt. The best solution would be to
use a lua script instead. See also #3427.
Updates the restricted data for a user from the full_data_cache.
"""
if self.exists_for_user(user_id):
self.add_element(user_id, collection_string, id, data)
def add_element(self, user_id: int, collection_string: str, id: int, data: object) -> None:
"""
Adds one element to the cache. If the cache does not exists for the user,
it is created.
"""
redis = get_redis_connection()
redis.hset(
self.get_cache_key(user_id),
"{}/{}".format(collection_string, id),
json.dumps(data))
def del_element(self, user_id: int, collection_string: str, id: int) -> None:
"""
Removes one element from the cache.
Does nothing if the cache does not exist.
"""
redis = get_redis_connection()
redis.hdel(
self.get_cache_key(user_id),
"{}/{}".format(collection_string, id))
def del_user(self, user_id: int) -> None:
"""
Removes all elements for one user from the cache.
"""
redis = get_redis_connection()
redis.delete(self.get_cache_key(user_id))
def del_all(self) -> None:
"""
Deletes all elements from the cache.
This method uses the redis command SCAN. See
https://redis.io/commands/scan#scan-guarantees for its limitations. If
an element is added to the cache while del_all() is in progress, it is
possible that it is not deleted.
"""
redis = get_redis_connection()
# Get all keys that start with self.base_cache_key and delete them
match = cache.make_key('{}:*'.format(self.base_cache_key))
cursor = 0
while True:
cursor, keys = redis.scan(cursor, match)
for key in keys:
redis.delete(key)
if cursor == 0:
# TODO: When elements are changed at the same time this method runs,
# the cache could become invalid.
# This could be fixed if get_full_data were used with a
# max change_id.
if not self.use_restricted_data_cache:
# If the restricted_data_cache is not used, there is nothing to do
return
def exists_for_user(self, user_id: int) -> bool:
"""
Returns True if the cache for the user exists, else False.
"""
redis = get_redis_connection()
return redis.exists(self.get_cache_key(user_id))
def get_data(self, user_id: int) -> List[object]:
"""
Returns all data for the user.
The returned value is a list of the elements.
"""
redis = get_redis_connection()
return [json.loads(element.decode()) for element in redis.hvals(self.get_cache_key(user_id))]
def get_cache_key(self, user_id: int) -> str:
"""
Returns the cache key for a user.
"""
return cache.make_key('{}:{}'.format(self.base_cache_key, user_id))
class DummyRestrictedDataCache:
"""
Dummy RestrictedDataCache that does nothing.
"""
def update_element(self, user_id: int, collection_string: str, id: int, data: object) -> None:
pass
def add_element(self, user_id: int, collection_string: str, id: int, data: object) -> None:
pass
def del_element(self, user_id: int, collection_string: str, id: int) -> None:
pass
def del_user(self, user_id: int) -> None:
pass
def del_all(self) -> None:
pass
def exists_for_user(self, user_id: int) -> bool:
return False
def get_data(self, user_id: int) -> List[object]:
pass
def use_redis_cache() -> bool:
"""
Returns True if Redis is used as caching backend.
"""
# Try to write a special key.
# If this succeeds, there is no one else currently updating the cache.
# TODO: Add a timeout. Else this could block forever
if await self.cache_provider.set_lock_restricted_data(get_user_id(user)):
future = asyncio.Future() # type: asyncio.Future
self.restricted_data_cache_updater[get_user_id(user)] = future
# Get change_id for this user
value = await self.cache_provider.get_change_id_user(get_user_id(user))
# If the change id is not in the cache yet, use -1 to get all data since 0
user_change_id = int(value) if value else -1
change_id = await self.get_current_change_id()
if change_id > user_change_id:
try:
from django_redis.cache import RedisCache
except ImportError:
return False
return isinstance(caches['default'], RedisCache)
def get_redis_connection() -> Any:
"""
Returns an object that can be used to talk directly to redis.
"""
from django_redis import get_redis_connection
return get_redis_connection("default")
if use_redis_cache():
websocket_user_cache = RedisWebsocketUserCache() # type: BaseWebsocketUserCache
if settings.DISABLE_USER_CACHE:
restricted_data_cache = DummyRestrictedDataCache() # type: Union[RestrictedDataCache, DummyRestrictedDataCache]
full_data_elements, deleted_elements = await self.get_full_data(user_change_id + 1)
except RuntimeError:
# The user_change_id is lower than the lowest change_id in the cache.
# The whole restricted_data for that user has to be recreated.
full_data_elements = await self.get_all_full_data()
await self.cache_provider.del_restricted_data(get_user_id(user))
else:
restricted_data_cache = RestrictedDataCache()
full_data_cache = FullDataCache() # type: Union[FullDataCache, DummyFullDataCache]
# Remove deleted elements
if deleted_elements:
await self.cache_provider.del_elements(deleted_elements, get_user_id(user))
mapping = {}
for collection_string, full_data in full_data_elements.items():
restricter = self.cachables[collection_string].restrict_elements
elements = await sync_to_async(restricter)(user, full_data)
for element in elements:
mapping.update(
{get_element_id(collection_string, element['id']):
json.dumps(element)})
mapping['_config:change_id'] = str(change_id)
await self.cache_provider.update_restricted_data(get_user_id(user), mapping)
# Unset the lock
await self.cache_provider.del_lock_restricted_data(get_user_id(user))
future.set_result(1)
else:
websocket_user_cache = DjangoCacheWebsocketUserCache()
restricted_data_cache = DummyRestrictedDataCache()
full_data_cache = DummyFullDataCache()
# Wait until the update is finished
if get_user_id(user) in self.restricted_data_cache_updater:
# The active worker is on the same asgi server, we can use the future
await self.restricted_data_cache_updater[get_user_id(user)]
else:
while await self.cache_provider.get_lock_restricted_data(get_user_id(user)):
await asyncio.sleep(0.01)
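The coordination above, reduced to a sketch: the first coroutine to win the lock computes the data, peers in the same process wait on the stored Future, peers in other processes poll the shared lock. All names below are illustrative stand-ins for the cache provider calls:

    import asyncio
    from typing import Dict

    updaters = {}  # type: Dict[int, asyncio.Future]

    async def update_once(user_id: int, provider) -> None:
        # provider.try_lock/is_locked/unlock stand in for the
        # set_lock_restricted_data/get_lock_restricted_data/
        # del_lock_restricted_data methods of the cache provider.
        if await provider.try_lock(user_id):
            future = asyncio.Future()  # type: asyncio.Future
            updaters[user_id] = future
            await compute_restricted_data(user_id)  # hypothetical helper
            await provider.unlock(user_id)
            future.set_result(1)
        elif user_id in updaters:
            # The updater runs on this asgi server: await its Future.
            await updaters[user_id]
        else:
            # The updater runs elsewhere: poll the shared lock.
            while await provider.is_locked(user_id):
                await asyncio.sleep(0.01)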
async def get_all_restricted_data(self, user: Optional['CollectionElement']) -> Dict[str, List[Dict[str, Any]]]:
"""
Like get_all_full_data but with restricted_data for a user.
"""
if not self.use_restricted_data_cache:
all_restricted_data = {}
for collection_string, full_data in (await self.get_all_full_data()).items():
restricter = self.cachables[collection_string].restrict_elements
elements = await sync_to_async(restricter)(user, full_data)
all_restricted_data[collection_string] = elements
return all_restricted_data
await self.update_restricted_data(user)
out = defaultdict(list) # type: Dict[str, List[Dict[str, Any]]]
restricted_data = await self.cache_provider.get_all_data(get_user_id(user))
for element_id, data in restricted_data.items():
if element_id.decode().startswith('_config'):
continue
collection_string, __ = split_element_id(element_id)
out[collection_string].append(json.loads(data.decode()))
return dict(out)
async def get_restricted_data(
self,
user: Optional['CollectionElement'],
change_id: int = 0) -> Tuple[Dict[str, List[Dict[str, Any]]], List[str]]:
"""
Like get_full_data but with restricted_data for a user.
"""
if change_id == 0:
# Return all data
return (await self.get_all_restricted_data(user), [])
if not self.use_restricted_data_cache:
changed_elements, deleted_elements = await self.get_full_data(change_id)
restricted_data = {}
for collection_string, full_data in changed_elements.items():
restricter = self.cachables[collection_string].restrict_elements
elements = await sync_to_async(restricter)(user, full_data)
restricted_data[collection_string] = elements
return restricted_data, deleted_elements
lowest_change_id = await self.get_lowest_change_id()
if change_id < lowest_change_id:
# When change_id is lower than the lowest change_id in redis, we can
# not inform the user about deleted elements.
raise RuntimeError(
"change_id {} is lower than the lowest change_id in redis {}. "
"Catch this exception and rerun the method with change_id=0."
.format(change_id, lowest_change_id))
# If another coroutine or another daphne server also updates the restricted
# data, this waits until it is done.
await self.update_restricted_data(user)
raw_changed_elements, deleted_elements = await self.cache_provider.get_data_since(change_id, get_user_id(user))
return (
{collection_string: [json.loads(value.decode()) for value in value_list]
for collection_string, value_list in raw_changed_elements.items()},
deleted_elements)
async def get_current_change_id(self) -> int:
"""
Returns the current change id.
Returns start_time if there is no change id yet.
"""
value = await self.cache_provider.get_current_change_id()
if not value:
return self.start_time
# Return the score (second element) of the first (and only) element
return value[0][1]
async def get_next_change_id(self) -> int:
"""
Returns the next change_id.
Returns the start time in seconds + 1, if there is no change_id yet.
"""
current_id = await self.get_current_change_id()
return current_id + 1
async def get_lowest_change_id(self) -> int:
"""
Returns the lowest change id.
Raises a RuntimeError if there is no change_id.
"""
value = await self.cache_provider.get_lowest_change_id()
if not value:
raise RuntimeError('There is no known change_id.')
# Return the score (second element) of the first (and only) element
return value
def load_element_cache(redis_addr: str = '', restricted_data: bool = True) -> ElementCache:
"""
Generates an element cache instance.
"""
if not redis_addr:
return ElementCache(redis='', cache_provider_class=MemmoryCacheProvider)
if no_redis_dependency:
raise ImportError("OpenSlides is configured to use redis as cache backend, but aioredis is not installed.")
return ElementCache(redis=redis_addr, use_restricted_data_cache=restricted_data)
redis_address = getattr(settings, 'REDIS_ADDRESS', '')
use_restricted_data = getattr(settings, 'RESTRICTED_DATA_CACHE', True)
element_cache = load_element_cache(redis_addr=redis_address, restricted_data=use_restricted_data)
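element_cache is a module-level singleton with an async-only API; synchronous Django code wraps calls with asgiref, as collection.py below does. A minimal usage sketch (collection string and id are examples):

    from asgiref.sync import async_to_sync

    # From synchronous code (e.g. a view or a test):
    all_data = async_to_sync(element_cache.get_all_full_data)()
    motion = async_to_sync(element_cache.get_element_full_data)('motions/motion', 1)

    # From asynchronous code (e.g. a consumer):
    #     changed, deleted = await element_cache.get_restricted_data(user, change_id)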

View File

@ -0,0 +1,508 @@
from collections import defaultdict
from typing import Set # noqa
from typing import (
TYPE_CHECKING,
Any,
Dict,
Generator,
Iterable,
List,
Optional,
Tuple,
Union,
)
from django.apps import apps
from .utils import split_element_id, str_dict_to_bytes
if TYPE_CHECKING:
# Dummy import Collection for mypy, can be fixed with python 3.7
from .collection import CollectionElement # noqa
try:
import aioredis
except ImportError:
no_redis_dependency = True
else:
no_redis_dependency = False
class BaseCacheProvider:
"""
Base class for cache provider.
See RedisCacheProvider as reference implementation.
"""
full_data_cache_key = 'full_data_cache'
restricted_user_cache_key = 'restricted_data_cache:{user_id}'
change_id_cache_key = 'change_id_cache'
lock_key = '_config:updating'
def __init__(self, *args: Any) -> None:
pass
def get_full_data_cache_key(self) -> str:
return self.full_data_cache_key
def get_restricted_data_cache_key(self, user_id: int) -> str:
return self.restricted_user_cache_key.format(user_id=user_id)
def get_change_id_cache_key(self) -> str:
return self.change_id_cache_key
def clear_cache(self) -> None:
raise NotImplementedError("CacheProvider has to implement the method clear_cache().")
async def reset_full_cache(self, data: Dict[str, str]) -> None:
raise NotImplementedError("CacheProvider has to implement the method reset_full_cache().")
async def data_exists(self, user_id: Optional[int] = None) -> bool:
raise NotImplementedError("CacheProvider has to implement the method exists_full_data().")
async def add_elements(self, elements: List[str]) -> None:
raise NotImplementedError("CacheProvider has to implement the method add_elements().")
async def del_elements(self, elements: List[str], user_id: Optional[int] = None) -> None:
raise NotImplementedError("CacheProvider has to implement the method del_elements().")
async def add_changed_elements(self, change_id: int, element_ids: Iterable[str]) -> None:
raise NotImplementedError("CacheProvider has to implement the method add_changed_elements().")
async def get_all_data(self, user_id: Optional[int] = None) -> Dict[bytes, bytes]:
raise NotImplementedError("CacheProvider has to implement the method get_all_data().")
async def get_data_since(self, change_id: int, user_id: Optional[int] = None) -> Tuple[Dict[str, List[bytes]], List[str]]:
raise NotImplementedError("CacheProvider has to implement the method get_data_since().")
async def get_element(self, element_id: str) -> Optional[bytes]:
raise NotImplementedError("CacheProvider has to implement the method get_element().")
async def del_restricted_data(self, user_id: int) -> None:
raise NotImplementedError("CacheProvider has to implement the method del_restricted_data().")
async def set_lock_restricted_data(self, user_id: int) -> bool:
raise NotImplementedError("CacheProvider has to implement the method set_lock_restricted_data().")
async def get_lock_restricted_data(self, user_id: int) -> bool:
raise NotImplementedError("CacheProvider has to implement the method get_lock_restricted_data().")
async def del_lock_restricted_data(self, user_id: int) -> None:
raise NotImplementedError("CacheProvider has to implement the method del_lock_restricted_data().")
async def get_change_id_user(self, user_id: int) -> Optional[int]:
raise NotImplementedError("CacheProvider has to implement the method get_change_id_user().")
async def update_restricted_data(self, user_id: int, data: Dict[str, str]) -> None:
raise NotImplementedError("CacheProvider has to implement the method update_restricted_data().")
async def get_current_change_id(self) -> List[Tuple[str, int]]:
raise NotImplementedError("CacheProvider has to implement the method get_current_change_id().")
async def get_lowest_change_id(self) -> Optional[int]:
raise NotImplementedError("CacheProvider has to implement the method get_lowest_change_id().")
class RedisCacheProvider(BaseCacheProvider):
"""
Cache provider that loads and saves the data to redis.
"""
redis_pool = None # type: Optional[aioredis.RedisConnection]
def __init__(self, redis: str) -> None:
self.redis_address = redis
async def get_connection(self) -> 'aioredis.RedisConnection':
"""
Returns a redis connection.
"""
if self.redis_pool is None:
self.redis_pool = await aioredis.create_redis_pool(self.redis_address)
return self.redis_pool
async def reset_full_cache(self, data: Dict[str, str]) -> None:
"""
Deletes the cache and writes new data into it.
"""
# TODO: lua or transaction
redis = await self.get_connection()
await redis.delete(self.get_full_data_cache_key())
await redis.hmset_dict(self.get_full_data_cache_key(), data)
async def data_exists(self, user_id: Optional[int] = None) -> bool:
"""
Returns True, when there is data in the cache.
If user_id is None, the method tests for full_data. If user_id is an int, it tests
for the restricted_data_cache for the user with the user_id. 0 is for anonymous.
"""
redis = await self.get_connection()
if user_id is None:
cache_key = self.get_full_data_cache_key()
else:
cache_key = self.get_restricted_data_cache_key(user_id)
return await redis.exists(cache_key)
async def add_elements(self, elements: List[str]) -> None:
"""
Adds or changes elements in the cache.
elements is a list with an even length. The odd values are the element_ids and the even
values are the elements. The elements have to be encoded, for example with json.
"""
redis = await self.get_connection()
await redis.hmset(
self.get_full_data_cache_key(),
*elements)
async def del_elements(self, elements: List[str], user_id: Optional[int] = None) -> None:
"""
Deletes elements from the cache.
elements has to be a list of element_ids.
If user_id is None, the elements are deleted from the full_data cache. If user_id is an
int, the elements are deleted from the restricted_data_cache of that user. 0 is for anonymous.
"""
redis = await self.get_connection()
if user_id is None:
cache_key = self.get_full_data_cache_key()
else:
cache_key = self.get_restricted_data_cache_key(user_id)
await redis.hdel(
cache_key,
*elements)
async def add_changed_elements(self, change_id: int, element_ids: Iterable[str]) -> None:
"""
Saves which elements are changed with a change_id.
element_ids has to be an iterable of element_ids that have been changed
with the given change_id.
"""
def zadd_args(change_id: int) -> Generator[Union[int, str], None, None]:
"""
Small helper to generate the arguments for the redis command zadd.
"""
for element_id in element_ids:
yield change_id
yield element_id
redis = await self.get_connection()
await redis.zadd(self.get_change_id_cache_key(), *zadd_args(change_id))
# Saves the lowest_change_id if it does not exist
await redis.zadd(self.get_change_id_cache_key(), change_id, '_config:lowest_change_id', exist='ZSET_IF_NOT_EXIST')
async def get_all_data(self, user_id: Optional[int] = None) -> Dict[bytes, bytes]:
"""
Returns all data from a cache.
If user_id is None, then the data is returned from the full_data_cache. If it is an
int, it is returned from a restricted_data_cache. 0 is for anonymous.
"""
if user_id is None:
cache_key = self.get_full_data_cache_key()
else:
cache_key = self.get_restricted_data_cache_key(user_id)
redis = await self.get_connection()
return await redis.hgetall(cache_key)
async def get_element(self, element_id: str) -> Optional[bytes]:
"""
Returns one element from the full_data_cache.
Returns None, when the element does not exist.
"""
redis = await self.get_connection()
return await redis.hget(
self.get_full_data_cache_key(),
element_id)
async def get_data_since(self, change_id: int, user_id: Optional[int] = None) -> Tuple[Dict[str, List[bytes]], List[str]]:
"""
Returns all elements since a change_id.
The returned value is a two-element tuple. The first value is a dict of the elements where
the key is the collection_string and the value is a list of (json-) encoded elements. The
second element is a list of element_ids that have been deleted since the change_id.
If user_id is None, the full_data is returned. If user_id is an int, the restricted_data
for a user is used. 0 is for the anonymous user.
"""
# TODO: rewrite with lua to get all elements with one request
redis = await self.get_connection()
changed_elements = defaultdict(list) # type: Dict[str, List[bytes]]
deleted_elements = [] # type: List[str]
for element_id in await redis.zrangebyscore(self.get_change_id_cache_key(), min=change_id):
if element_id.startswith(b'_config'):
continue
element_json = await redis.hget(self.get_full_data_cache_key(), element_id) # Optional[bytes]
if element_json is None:
# The element is not in the cache. It has to be deleted.
deleted_elements.append(element_id)
else:
collection_string, id = split_element_id(element_id)
changed_elements[collection_string].append(element_json)
return changed_elements, deleted_elements
async def del_restricted_data(self, user_id: int) -> None:
"""
Deletes all restricted_data for a user. 0 is for the anonymous user.
"""
redis = await self.get_connection()
await redis.delete(self.get_restricted_data_cache_key(user_id))
async def set_lock_restricted_data(self, user_id: int) -> bool:
"""
Tries to set a lock for the restricted_data of a user.
Returns True when the lock could be set.
Returns False when the lock was already set.
"""
redis = await self.get_connection()
return await redis.hsetnx(self.get_restricted_data_cache_key(user_id), self.lock_key, 1)
async def get_lock_restricted_data(self, user_id: int) -> bool:
"""
Returns True, when the lock for the restricted_data of a user is set. Else False.
"""
redis = await self.get_connection()
return await redis.hget(self.get_restricted_data_cache_key(user_id), self.lock_key)
async def del_lock_restricted_data(self, user_id: int) -> None:
"""
Deletes the lock for the restricted_data of a user. Does nothing when the
lock is not set.
"""
redis = await self.get_connection()
await redis.hdel(self.get_restricted_data_cache_key(user_id), self.lock_key)
async def get_change_id_user(self, user_id: int) -> Optional[int]:
"""
Get the change_id for the restricted_data of a user.
This is the change_id where the restricted_data was last calculated.
"""
redis = await self.get_connection()
return await redis.hget(self.get_restricted_data_cache_key(user_id), '_config:change_id')
async def update_restricted_data(self, user_id: int, data: Dict[str, str]) -> None:
"""
Updates the restricted_data for a user.
data has to be a dict where the key is an element_id and the value the (json-) encoded
element.
"""
redis = await self.get_connection()
await redis.hmset_dict(self.get_restricted_data_cache_key(user_id), data)
async def get_current_change_id(self) -> List[Tuple[str, int]]:
"""
Get the highest change_id from redis.
"""
redis = await self.get_connection()
return await redis.zrevrangebyscore(
self.get_change_id_cache_key(),
withscores=True,
count=1,
offset=0)
async def get_lowest_change_id(self) -> Optional[int]:
"""
Get the lowest change_id from redis.
Returns None if lowest score does not exist.
"""
redis = await self.get_connection()
return await redis.zscore(
self.get_change_id_cache_key(),
'_config:lowest_change_id')
class MemmoryCacheProvider(BaseCacheProvider):
"""
CacheProvider for the ElementCache that uses only the memory.
See the RedisCacheProvider for a description of the methods.
This provider supports only one process. It saves the data in memory.
When you use different processes, they will see different data.
"""
def __init__(self, *args: Any, **kwargs: Any) -> None:
self.clear_cache()
def clear_cache(self) -> None:
self.full_data = {} # type: Dict[str, str]
self.restricted_data = {} # type: Dict[int, Dict[str, str]]
self.change_id_data = {} # type: Dict[int, Set[str]]
async def reset_full_cache(self, data: Dict[str, str]) -> None:
self.full_data = data
async def data_exists(self, user_id: Optional[int] = None) -> bool:
if user_id is None:
cache_dict = self.full_data
else:
cache_dict = self.restricted_data.get(user_id, {})
return bool(cache_dict)
async def add_elements(self, elements: List[str]) -> None:
if len(elements) % 2:
raise ValueError("The argument elements of add_elements has to be a list with an even number of elements.")
for i in range(0, len(elements), 2):
self.full_data[elements[i]] = elements[i+1]
async def del_elements(self, elements: List[str], user_id: Optional[int] = None) -> None:
if user_id is None:
cache_dict = self.full_data
else:
cache_dict = self.restricted_data.get(user_id, {})
for element in elements:
try:
del cache_dict[element]
except KeyError:
pass
async def add_changed_elements(self, change_id: int, element_ids: Iterable[str]) -> None:
element_ids = list(element_ids)
for element_id in element_ids:
if change_id in self.change_id_data:
self.change_id_data[change_id].add(element_id)
else:
self.change_id_data[change_id] = {element_id}
async def get_all_data(self, user_id: Optional[int] = None) -> Dict[bytes, bytes]:
if user_id is None:
cache_dict = self.full_data
else:
cache_dict = self.restricted_data.get(user_id, {})
return str_dict_to_bytes(cache_dict)
async def get_element(self, element_id: str) -> Optional[bytes]:
value = self.full_data.get(element_id, None)
return value.encode() if value is not None else None
async def get_data_since(
self, change_id: int, user_id: Optional[int] = None) -> Tuple[Dict[str, List[bytes]], List[str]]:
changed_elements = defaultdict(list) # type: Dict[str, List[bytes]]
deleted_elements = [] # type: List[str]
if user_id is None:
cache_dict = self.full_data
else:
cache_dict = self.restricted_data.get(user_id, {})
for data_change_id, element_ids in self.change_id_data.items():
if data_change_id < change_id:
continue
for element_id in element_ids:
element_json = cache_dict.get(element_id, None)
if element_json is None:
deleted_elements.append(element_id)
else:
collection_string, id = split_element_id(element_id)
changed_elements[collection_string].append(element_json.encode())
return changed_elements, deleted_elements
async def del_restricted_data(self, user_id: int) -> None:
try:
del self.restricted_data[user_id]
except KeyError:
pass
async def set_lock_restricted_data(self, user_id: int) -> bool:
data = self.restricted_data.setdefault(user_id, {})
if self.lock_key in data:
return False
data[self.lock_key] = "1"
return True
async def get_lock_restricted_data(self, user_id: int) -> bool:
data = self.restricted_data.get(user_id, {})
return self.lock_key in data
async def del_lock_restricted_data(self, user_id: int) -> None:
data = self.restricted_data.get(user_id, {})
try:
del data[self.lock_key]
except KeyError:
pass
async def get_change_id_user(self, user_id: int) -> Optional[int]:
data = self.restricted_data.get(user_id, {})
change_id = data.get('_config:change_id', None)
return int(change_id) if change_id is not None else None
async def update_restricted_data(self, user_id: int, data: Dict[str, str]) -> None:
redis_data = self.restricted_data.setdefault(user_id, {})
redis_data.update(data)
async def get_current_change_id(self) -> List[Tuple[str, int]]:
change_data = self.change_id_data
if change_data:
return [('no_usefull_value', max(change_data.keys()))]
return []
async def get_lowest_change_id(self) -> Optional[int]:
change_data = self.change_id_data
if change_data:
return min(change_data.keys())
return None
class Cachable:
"""
A Cachable is an object that returns elements that can be cached.
It needs at least the methods defined here.
"""
def get_collection_string(self) -> str:
"""
Returns the string representing the name of the cachable.
"""
raise NotImplementedError("Cachable has to implement the method get_collection_string().")
def get_elements(self) -> List[Dict[str, Any]]:
"""
Returns all elements of the cachable.
"""
raise NotImplementedError("Cachable has to implement the method get_collection_string().")
def restrict_elements(
self,
user: Optional['CollectionElement'],
elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""
Converts full_data to restricted_data.
elements can be an empty list, a list with some elements of the cachable or with all
elements of the cachable.
The default implementation returns the full_data.
"""
return elements
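A minimal Cachable, assuming a hypothetical collection 'example/item' with static data:

    class ItemCachable(Cachable):
        # Hypothetical example. Real apps usually return Collection objects,
        # which implement this interface (see collection.py).
        def get_collection_string(self) -> str:
            return 'example/item'

        def get_elements(self) -> List[Dict[str, Any]]:
            return [{'id': 1, 'title': 'first'}, {'id': 2, 'title': 'second'}]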
def get_all_cachables() -> List[Cachable]:
"""
Returns all cachables of OpenSlides.
"""
out = [] # type: List[Cachable]
for app in apps.get_app_configs():
try:
# Get the method get_startup_elements() from an app.
# This method has to return an iterable of Collection objects.
get_startup_elements = app.get_startup_elements
except AttributeError:
# Skip apps that do not implement get_startup_elements.
continue
out.extend(get_startup_elements())
return out
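So an app opts into the cache simply by exposing the hook on its AppConfig. A sketch with hypothetical names:

    from django.apps import AppConfig

    class ExampleAppConfig(AppConfig):
        name = 'example'

        def get_startup_elements(self):
            # Has to return an iterable of Cachables, e.g. Collection objects.
            from openslides.utils.collection import Collection
            yield Collection('example/item')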

View File

@ -10,11 +10,15 @@ from typing import (
cast,
)
from asgiref.sync import async_to_sync
from django.apps import apps
from django.conf import settings
from django.db.models import Model
from mypy_extensions import TypedDict
from .cache import full_data_cache
from .cache import element_cache
from .cache_providers import Cachable
if TYPE_CHECKING:
from .access_permissions import BaseAccessPermissions # noqa
@ -74,19 +78,12 @@ class CollectionElement:
'CollectionElement.from_values() but not CollectionElement() '
'directly.')
if self.is_deleted():
# Delete the element from the cache, if self.is_deleted() is True:
full_data_cache.del_element(self.collection_string, self.id)
else:
# The call to get_full_data() has some side effects. When the object
# was created with from_instance() or the object is not in the cache
# then get_full_data() will save the object into the cache.
# This will also raise a DoesNotExist error, if the object does
# neither exist in the cache nor in the database.
self.get_full_data()
if not self.deleted:
self.get_full_data() # This raises DoesNotExist, if the element does not exist.
@classmethod
def from_instance(cls, instance: Model, deleted: bool = False, information: Dict[str, Any] = None) -> 'CollectionElement':
def from_instance(
cls, instance: Model, deleted: bool = False, information: Dict[str, Any] = None) -> 'CollectionElement':
"""
Returns a collection element from a database instance.
@ -175,6 +172,20 @@ class CollectionElement:
"""
return self.get_model().get_access_permissions()
def get_element_from_db(self) -> Optional[Dict[str, Any]]:
# Hack for django 2.0 and channels 2.1 to stay in the same thread.
# This is needed for the tests.
try:
query = self.get_model().objects.get_full_queryset()
except AttributeError:
# If the model does not have the method get_full_queryset(), then use
# the default queryset from django.
query = self.get_model().objects
try:
return self.get_access_permissions().get_full_data(query.get(pk=self.id))
except self.get_model().DoesNotExist:
return None
def get_full_data(self) -> Dict[str, Any]:
"""
Returns the full_data of this collection_element with all other
@ -188,14 +199,20 @@ class CollectionElement:
# else: use the cache.
if self.full_data is None:
if self.instance is None:
# Make sure the cache exists
if not full_data_cache.exists_for_collection(self.collection_string):
# Build the cache if it does not exists.
full_data_cache.build_for_collection(self.collection_string)
self.full_data = full_data_cache.get_element(self.collection_string, self.id)
# The type of data has to be set for mypy
data = None # type: Optional[Dict[str, Any]]
if getattr(settings, 'SKIP_CACHE', False):
# Hack for django 2.0 and channels 2.1 to stay in the same thread.
# This is needed for the tests.
data = self.get_element_from_db()
else:
data = async_to_sync(element_cache.get_element_full_data)(self.collection_string, self.id)
if data is None:
raise self.get_model().DoesNotExist(
"Collection {} with id {} does not exist".format(self.collection_string, self.id))
self.full_data = data
else:
self.full_data = self.get_access_permissions().get_full_data(self.instance)
full_data_cache.add_element(self.collection_string, self.id, self.full_data)
return self.full_data
def is_deleted(self) -> bool:
@ -205,7 +222,7 @@ class CollectionElement:
return self.deleted
class Collection:
class Collection(Cachable):
"""
Represents all elements of one collection.
"""
@ -242,17 +259,32 @@ class Collection:
full_data['id'],
full_data=full_data)
def get_elements_from_db(self) -> Dict[str, List[Dict[str, Any]]]:
# Hack for django 2.0 and channels 2.1 to stay in the same thread.
# This is needed for the tests.
try:
query = self.get_model().objects.get_full_queryset()
except AttributeError:
# If the model does not have the method get_full_queryset(), then use
# the default queryset from django.
query = self.get_model().objects
return {self.collection_string: [self.get_model().get_access_permissions().get_full_data(instance) for instance in query.all()]}
def get_full_data(self) -> List[Dict[str, Any]]:
"""
Returns a list of dictionaries with full_data of all collection
elements.
"""
if self.full_data is None:
# Build the cache, if it does not exist.
if not full_data_cache.exists_for_collection(self.collection_string):
full_data_cache.build_for_collection(self.collection_string)
self.full_data = full_data_cache.get_data(self.collection_string)
# The type of all_full_data has to be set for mypy
all_full_data = {} # type: Dict[str, List[Dict[str, Any]]]
if getattr(settings, 'SKIP_CACHE', False):
# Hack for django 2.0 and channels 2.1 to stay in the same thread.
# This is needed for the tests.
all_full_data = self.get_elements_from_db()
else:
all_full_data = async_to_sync(element_cache.get_all_full_data)()
self.full_data = all_full_data[self.collection_string]
return self.full_data
def as_list_for_user(self, user: Optional[CollectionElement]) -> List[Dict[str, Any]]:
@ -262,6 +294,27 @@ class Collection:
"""
return self.get_access_permissions().get_restricted_data(self.get_full_data(), user)
def get_collection_string(self) -> str:
"""
Returns the collection_string.
"""
return self.collection_string
def get_elements(self) -> List[Dict[str, Any]]:
"""
Returns all elements of the Collection as full_data.
"""
return self.get_full_data()
def restrict_elements(
self,
user: Optional['CollectionElement'],
elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""
Converts the full_data to restricted data.
"""
return self.get_model().get_access_permissions().get_restricted_data(user, elements)
_models_to_collection_string = {} # type: Dict[str, Type[Model]]
@ -295,7 +348,8 @@ def get_model_from_collection_string(collection_string: str) -> Type[Model]:
return model
def format_for_autoupdate(collection_string: str, id: int, action: str, data: Dict[str, Any] = None) -> AutoupdateFormat:
def format_for_autoupdate(
collection_string: str, id: int, action: str, data: Dict[str, Any] = None) -> AutoupdateFormat:
"""
Returns a dict that can be used for autoupdate.
"""

View File

@ -0,0 +1,306 @@
from typing import Any, Dict, List, Optional
from asgiref.sync import sync_to_async
from channels.db import database_sync_to_async
from channels.generic.websocket import AsyncJsonWebsocketConsumer
from ..core.config import config
from ..core.models import Projector
from .auth import async_anonymous_is_enabled, has_perm
from .cache import element_cache, split_element_id
from .collection import AutoupdateFormat # noqa
from .collection import (
Collection,
CollectionElement,
format_for_autoupdate,
from_channel_message,
)
class SiteConsumer(AsyncJsonWebsocketConsumer):
"""
Websocket Consumer for the site.
"""
groups = ['site']
async def connect(self) -> None:
"""
A user connects to the site.
If it is an anonymous user and anonymous is disabled, the connection is closed.
Sends the startup data to the user.
"""
# TODO: add a way to ask for the data since a change_id and send only data that is newer
if not await async_anonymous_is_enabled() and self.scope['user'].id is None:
await self.close()
else:
await self.accept()
data = await startup_data(self.scope['user'])
await self.send_json(data)
async def receive_json(self, content: Any) -> None:
"""
If we receive something from the client we currently just interpret this
as a notify message.
The server adds the sender's user id (0 for anonymous) and reply
channel name so that a receiver client may reply to the sender or to all
sender's instances.
"""
if notify_message_is_valid(content):
await self.channel_layer.group_send(
"projector",
{
"type": "send_notify",
"incomming": content,
"senderReplyChannelName": self.channel_name,
"senderUserId": self.scope['user'].id or 0,
},
)
await self.channel_layer.group_send(
"site",
{
"type": "send_notify",
"incomming": content,
"senderReplyChannelName": self.channel_name,
"senderUserId": self.scope['user'].id or 0,
},
)
else:
await self.send_json({'error': 'invalid message'})
async def send_notify(self, event: Dict[str, Any]) -> None:
"""
Send a notify message to the user.
"""
user_id = self.scope['user'].id or 0
out = []
for item in event['incomming']:
users = item.get('users')
reply_channels = item.get('replyChannels')
projectors = item.get('projectors')
if ((isinstance(users, list) and user_id in users)
or (isinstance(reply_channels, list) and self.channel_name in reply_channels)
or (users is None and reply_channels is None and projectors is None)):
item['senderReplyChannelName'] = event.get('senderReplyChannelName')
item['senderUserId'] = event.get('senderUserId')
item['senderProjectorId'] = event.get('senderProjectorId')
out.append(item)
if out:
await self.send_json(out)
async def send_data(self, event: Dict[str, Any]) -> None:
"""
Send changed or deleted elements to the user.
"""
change_id = event['change_id']
output = []
changed_elements, deleted_elements = await element_cache.get_restricted_data(self.scope['user'], change_id)
for collection_string, elements in changed_elements.items():
for element in elements:
output.append(format_for_autoupdate(
collection_string=collection_string,
id=element['id'],
action='changed',
data=element))
for element_id in deleted_elements:
collection_string, id = split_element_id(element_id)
output.append(format_for_autoupdate(
collection_string=collection_string,
id=id,
action='deleted'))
await self.send_json(output)
class ProjectorConsumer(AsyncJsonWebsocketConsumer):
"""
Websocket Consumer for the projector.
"""
groups = ['projector']
async def connect(self) -> None:
"""
Adds the websocket connection to a group specific to the projector with the given id.
Also sends all data that are shown on the projector.
"""
user = self.scope['user']
projector_id = self.scope["url_route"]["kwargs"]["projector_id"]
await self.accept()
if not await database_sync_to_async(has_perm)(user, 'core.can_see_projector'):
await self.send_json({'text': 'No permissions to see this projector.'})
# TODO: Shouldn't we just close the websocket connection with an error message?
# self.close(code=4403)
else:
out = await sync_to_async(projector_startup_data)(projector_id)
await self.send_json(out)
async def receive_json(self, content: Any) -> None:
"""
If we receive something from the client we currently just interpret this
as a notify message.
The server adds the sender's user id (0 for anonymous) and reply
channel name so that a receiver client may reply to the sender or to all
sender's instances.
"""
projector_id = self.scope["url_route"]["kwargs"]["projector_id"]
await self.channel_layer.group_send(
"projector",
{
"type": "send_notify",
"incomming": content,
"senderReplyChannelName": self.channel_name,
"senderProjectorId": projector_id,
},
)
await self.channel_layer.group_send(
"site",
{
"type": "send_notify",
"incomming": content,
"senderReplyChannelName": self.channel_name,
"senderProjectorId": projector_id,
},
)
async def send_notify(self, event: Dict[str, Any]) -> None:
"""
Send a notify message to the projector.
"""
projector_id = self.scope["url_route"]["kwargs"]["projector_id"]
out = []
for item in event['incomming']:
users = item.get('users')
reply_channels = item.get('replyChannels')
projectors = item.get('projectors')
if ((isinstance(projectors, list) and projector_id in projectors)
or (isinstance(reply_channels, list) and self.channel_name in reply_channels)
or (users is None and reply_channels is None and projectors is None)):
item['senderReplyChannelName'] = event.get('senderReplyChannelName')
item['senderUserId'] = event.get('senderUserId')
item['senderProjectorId'] = event.get('senderProjectorId')
out.append(item)
if out:
await self.send_json(out)
async def send_data(self, event: Dict[str, Any]) -> None:
"""
Informs all projector clients about changed data.
"""
projector_id = self.scope["url_route"]["kwargs"]["projector_id"]
collection_elements = from_channel_message(event['message'])
output = await projector_sync_send_data(projector_id, collection_elements)
if output:
await self.send_json(output)
async def startup_data(user: Optional[CollectionElement], change_id: int = 0) -> List[Any]:
"""
Returns all data for startup.
"""
# TODO: use the change_id argument
output = []
restricted_data = await element_cache.get_all_restricted_data(user)
for collection_string, elements in restricted_data.items():
for element in elements:
formatted_data = format_for_autoupdate(
collection_string=collection_string,
id=element['id'],
action='changed',
data=element)
output.append(formatted_data)
return output
def projector_startup_data(projector_id: int) -> Any:
"""
Generate the startup data for a projector.
"""
try:
projector = Projector.objects.get(pk=projector_id)
except Projector.DoesNotExist:
return {'text': 'The projector {} does not exist.'.format(projector_id)}
else:
# Now check whether broadcast is active at the moment. If yes,
# change the local projector variable.
if config['projector_broadcast'] > 0:
projector = Projector.objects.get(pk=config['projector_broadcast'])
# Collect all elements that are on the projector.
output = [] # type: List[AutoupdateFormat]
for requirement in projector.get_all_requirements():
required_collection_element = CollectionElement.from_instance(requirement)
output.append(required_collection_element.as_autoupdate_for_projector())
# Collect all config elements.
config_collection = Collection(config.get_collection_string())
projector_data = (config_collection.get_access_permissions()
.get_projector_data(config_collection.get_full_data()))
for data in projector_data:
output.append(format_for_autoupdate(
config_collection.collection_string,
data['id'],
'changed',
data))
# Collect the projector instance.
collection_element = CollectionElement.from_instance(projector)
output.append(collection_element.as_autoupdate_for_projector())
# Send all the data that was collected above.
return output
@sync_to_async
def projector_sync_send_data(projector_id: int, collection_elements: List[CollectionElement]) -> List[Any]:
"""
Sync function that generates the elements for a projector.
"""
# Load the projector object. If broadcast is on, use the broadcast projector
# instead.
if config['projector_broadcast'] > 0:
projector_id = config['projector_broadcast']
projector = Projector.objects.get(pk=projector_id)
# TODO: This runs once for every open projector tab. Either use
# caching or something else, so this is only called once
output = []
for collection_element in collection_elements:
if collection_element.is_deleted():
output.append(collection_element.as_autoupdate_for_projector())
else:
for element in projector.get_collection_elements_required_for_this(collection_element):
output.append(element.as_autoupdate_for_projector())
return output
def notify_message_is_valid(message: object) -> bool:
"""
    Returns True if the message is a valid notify message.
"""
if not isinstance(message, list):
# message has to be a list
return False
if not message:
# message must contain at least one element
return False
for element in message:
if not isinstance(element, dict):
            # All elements have to be dicts
return False
    # TODO: There could be more checks. For example, 'users' has to be a list of ints.
    # The check could be done with json-schema:
# https://pypi.org/project/jsonschema/
return True
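# A minimal sketch of the json-schema based check suggested in the TODO above.
# This is an assumption, not part of this commit: it needs the jsonschema
# package, and the schema shown here only covers the 'users' field as an example.
import jsonschema

NOTIFY_MESSAGE_SCHEMA = {
    "type": "array",
    "minItems": 1,
    "items": {
        "type": "object",
        "properties": {
            "users": {"type": "array", "items": {"type": "integer"}},
        },
    },
}


def notify_message_is_valid_by_schema(message: object) -> bool:
    """
    Like notify_message_is_valid(), but validated with jsonschema.
    """
    try:
        jsonschema.validate(message, NOTIFY_MESSAGE_SCHEMA)
    except jsonschema.ValidationError:
        return False
    return True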

View File

@ -13,6 +13,7 @@ from django.core.exceptions import ImproperlyConfigured
from django.utils.crypto import get_random_string
from mypy_extensions import NoReturn
DEVELOPMENT_VERSION = 'Development Version'
UNIX_VERSION = 'Unix Version'
WINDOWS_VERSION = 'Windows Version'
@ -327,16 +328,6 @@ def is_local_installation() -> bool:
return True if '--local-installation' in sys.argv or 'manage.py' in sys.argv[0] else False
def get_geiss_path() -> str:
"""
Returns the path and file to the Geiss binary.
"""
from django.conf import settings
download_dir = getattr(settings, 'OPENSLIDES_USER_DATA_PATH', '')
bin_name = 'geiss.exe' if is_windows() else 'geiss'
return os.path.join(download_dir, bin_name)
def is_windows() -> bool:
"""
Returns True if the current system is Windows. Returns False otherwise.

View File

@ -0,0 +1,63 @@
from typing import Any, Dict, Union
from channels.auth import (
AuthMiddleware,
CookieMiddleware,
SessionMiddleware,
_get_user_session_key,
)
from django.conf import settings
from django.contrib.auth import BACKEND_SESSION_KEY, HASH_SESSION_KEY
from django.contrib.auth.models import AnonymousUser
from django.utils.crypto import constant_time_compare
from .cache import element_cache
from .collection import CollectionElement
class CollectionAuthMiddleware(AuthMiddleware):
"""
Like the channels AuthMiddleware but returns a CollectionElement instead of
a django Model as user.
"""
async def resolve_scope(self, scope: Dict[str, Any]) -> None:
scope["user"]._wrapped = await get_user(scope)
async def get_user(scope: Dict[str, Any]) -> Union[CollectionElement, AnonymousUser]:
"""
Returns a User-CollectionElement from a channels-scope-session.
If no user is retrieved, return AnonymousUser.
"""
    # This cannot return None because a LazyObject cannot become None
    # This code is basically from channels.auth:
# https://github.com/django/channels/blob/d5e81a78e96770127da79248349808b6ee6ec2a7/channels/auth.py#L16
if "session" not in scope:
raise ValueError("Cannot find session in scope. You should wrap your consumer in SessionMiddleware.")
session = scope["session"]
user = None
try:
user_id = _get_user_session_key(session)
backend_path = session[BACKEND_SESSION_KEY]
except KeyError:
pass
else:
if backend_path in settings.AUTHENTICATION_BACKENDS:
user = await element_cache.get_element_full_data("users/user", user_id)
if user is not None:
# Verify the session
session_hash = session.get(HASH_SESSION_KEY)
session_hash_verified = session_hash and constant_time_compare(
session_hash,
user['session_auth_hash'])
if not session_hash_verified:
session.flush()
user = None
return CollectionElement.from_values("users/user", user_id, full_data=user) if user else AnonymousUser()
# Handy shortcut for applying all three layers at once
AuthMiddlewareStack = lambda inner: CookieMiddleware(SessionMiddleware(CollectionAuthMiddleware(inner))) # noqa
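# A minimal usage sketch (an assumption, not part of this commit): in an ASGI
# routing module the stack wraps the URL router, so every consumer sees
# scope["user"] as a CollectionElement or AnonymousUser.
#
# from channels.routing import ProtocolTypeRouter, URLRouter
# from openslides.routing import websocket_urlpatterns  # hypothetical name
#
# application = ProtocolTypeRouter({
#     "websocket": AuthMiddlewareStack(URLRouter(websocket_urlpatterns)),
# })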

View File

@ -1,4 +1,4 @@
from typing import Any, Callable # noqa
from typing import Any, Callable
from django.contrib.auth.models import Permission
from django.contrib.contenttypes.models import ContentType

View File

@ -1,12 +1,17 @@
from typing import Any, Dict
from typing import TYPE_CHECKING, Any, Dict, List, Optional
from django.core.exceptions import ImproperlyConfigured
from django.db import models
from .access_permissions import BaseAccessPermissions # noqa
from .access_permissions import BaseAccessPermissions
from .utils import convert_camel_case_to_pseudo_snake_case
if TYPE_CHECKING:
# Dummy import Collection for mypy, can be fixed with python 3.7
from .collection import Collection, CollectionElement # noqa
class MinMaxIntegerField(models.IntegerField):
"""
IntegerField with options to set a min- and a max-value.
@ -117,3 +122,29 @@ class RESTModelMixin:
else:
inform_deleted_data([(self.get_collection_string(), instance_pk)], information=information)
return return_value
@classmethod
def get_elements(cls) -> List[Dict[str, Any]]:
"""
Returns all elements as full_data.
"""
# Get the query to receive all data from the database.
try:
query = cls.objects.get_full_queryset() # type: ignore
except AttributeError:
            # If the model does not have the method get_full_queryset(), use
            # the default queryset from django.
query = cls.objects # type: ignore
        # Build the list of full_data dicts from all instances.
return [cls.get_access_permissions().get_full_data(instance) for instance in query.all()]
@classmethod
def restrict_elements(
cls,
user: Optional['CollectionElement'],
elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""
Converts a list of elements from full_data to restricted_data.
"""
return cls.get_access_permissions().get_restricted_data(elements, user)
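# A hypothetical usage sketch of the two classmethods above (an assumption,
# not part of this commit): build the full data of a collection once and then
# restrict it per user; None stands for the anonymous user.
#
# full_data = Motion.get_elements()
# restricted = Motion.restrict_elements(None, full_data)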

View File

@ -5,17 +5,16 @@ from django.http import Http404
from rest_framework import status # noqa
from rest_framework.decorators import detail_route, list_route # noqa
from rest_framework.metadata import SimpleMetadata # noqa
from rest_framework.mixins import ListModelMixin as _ListModelMixin
from rest_framework.mixins import RetrieveModelMixin as _RetrieveModelMixin
from rest_framework.mixins import ( # noqa
CreateModelMixin,
DestroyModelMixin,
ListModelMixin as _ListModelMixin,
RetrieveModelMixin as _RetrieveModelMixin,
UpdateModelMixin,
)
from rest_framework.relations import MANY_RELATION_KWARGS
from rest_framework.response import Response
from rest_framework.routers import DefaultRouter
from rest_framework.serializers import ModelSerializer as _ModelSerializer
from rest_framework.serializers import ( # noqa
CharField,
DictField,
@ -26,20 +25,24 @@ from rest_framework.serializers import ( # noqa
ListField,
ListSerializer,
ManyRelatedField,
ModelSerializer as _ModelSerializer,
PrimaryKeyRelatedField,
RelatedField,
Serializer,
SerializerMethodField,
ValidationError,
)
from rest_framework.viewsets import GenericViewSet as _GenericViewSet # noqa
from rest_framework.viewsets import ModelViewSet as _ModelViewSet # noqa
from rest_framework.viewsets import ViewSet as _ViewSet # noqa
from rest_framework.viewsets import ( # noqa
GenericViewSet as _GenericViewSet,
ModelViewSet as _ModelViewSet,
ViewSet as _ViewSet,
)
from .access_permissions import BaseAccessPermissions
from .auth import user_to_collection_user
from .collection import Collection, CollectionElement
router = DefaultRouter()

View File

@ -79,59 +79,38 @@ DATABASES = {
use_redis = False
if use_redis:
    # Redis configuration for django-redis-sessions. Keep this synchronized to
# the caching settings
# Django Channels
# https://channels.readthedocs.io/en/latest/topics/channel_layers.html#configuration
CHANNEL_LAYERS['default']['BACKEND'] = 'channels_redis.core.RedisChannelLayer'
CHANNEL_LAYERS['default']['CONFIG'] = {"capacity": 100000}
# Collection Cache
    # Can be:
    # a Redis URI like "redis://host:6379/0?encoding=utf-8";
    # a (host, port) tuple like ('localhost', 6379);
    # or a unix domain socket path string like "/path/to/redis.sock".
REDIS_ADDRESS = "redis://127.0.0.1"
    # When use_redis is True, the restricted data cache caches the data
    # individually for each user. This requires a lot of memory if there are a
    # lot of active users.
RESTRICTED_DATA_CACHE = True
# Session backend
# Redis configuration for django-redis-sessions.
# https://github.com/martinrusev/django-redis-sessions
SESSION_ENGINE = 'redis_sessions.session'
SESSION_REDIS = {
'host': '127.0.0.1',
        'port': 6379,
'db': 0,
}
# Django Channels
# Unless you have only a small assembly uncomment the following lines to
# activate Redis as backend for Django Channels and Cache. You have to install
# a Redis server and the python packages asgi_redis and django-redis.
# https://channels.readthedocs.io/en/latest/backends.html#redis
CHANNEL_LAYERS['default']['BACKEND'] = 'asgi_redis.RedisChannelLayer'
CHANNEL_LAYERS['default']['CONFIG']['prefix'] = 'asgi:'
# Caching
    # Django uses an in-memory cache by default. This supports only one thread.
    # If you use more than one thread, another caching backend is required. We
    # recommend django-redis: https://niwinz.github.io/django-redis/latest/#_user_guide
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": "redis://127.0.0.1:6379/0",
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
"KEY_PREFIX": "openslides-cache",
}
}
# Session backend
    # By default Django uses the database as session backend. This can be slow.
    # One possibility is to use the cache session backend with redis as cache
    # backend. Another possibility is to use a native redis session backend. For example:
# https://github.com/martinrusev/django-redis-sessions
# SESSION_ENGINE = 'django.contrib.sessions.backends.cache'
SESSION_ENGINE = 'redis_sessions.session'
    # When use_redis is True, the restricted data cache caches the data
    # individually for each user. This requires a lot of memory if there are a
    # lot of active users. If use_redis is False, this setting has no effect.
DISABLE_USER_CACHE = False
# Internationalization
# https://docs.djangoproject.com/en/1.10/topics/i18n/
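# Taken together, a sketch of the resulting Redis-backed channel layer. The
# 'hosts' key is an assumption following the channels_redis documentation and
# is not part of this commit:
#
# CHANNEL_LAYERS = {
#     'default': {
#         'BACKEND': 'channels_redis.core.RedisChannelLayer',
#         'CONFIG': {
#             'capacity': 100000,
#             'hosts': [('127.0.0.1', 6379)],
#         },
#     },
# }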

View File

@ -1,37 +1,12 @@
from django.test import TestCase as _TestCase
from django.test.runner import DiscoverRunner
from ..core.config import config
class OpenSlidesDiscoverRunner(DiscoverRunner):
def run_tests(self, test_labels, extra_tests=None, **kwargs): # type: ignore
"""
        Test runner which does not create a database if only unit tests are run.
"""
if len(test_labels) == 1 and test_labels[0].startswith('tests.unit'):
            # Do not create a test database if only unit tests are run
create_database = False
else:
create_database = True
self.setup_test_environment()
suite = self.build_suite(test_labels, extra_tests)
if create_database:
old_config = self.setup_databases()
result = self.run_suite(suite)
if create_database:
self.teardown_databases(old_config)
self.teardown_test_environment()
return self.suite_result(suite, result)
class TestCase(_TestCase):
"""
Resets the config object after each test.
"""
def tearDown(self) -> None:
from django_redis import get_redis_connection
config.key_to_id = {}
get_redis_connection("default").flushall()
config.save_default_values()

View File

@ -1,7 +1,13 @@
import re
from typing import TYPE_CHECKING, Dict, Optional, Tuple, Union
import roman
if TYPE_CHECKING:
# Dummy import Collection for mypy, can be fixed with python 3.7
from .collection import Collection, CollectionElement # noqa
CAMEL_CASE_TO_PSEUDO_SNAKE_CASE_CONVERSION_REGEX_1 = re.compile('(.)([A-Z][a-z]+)')
CAMEL_CASE_TO_PSEUDO_SNAKE_CASE_CONVERSION_REGEX_2 = re.compile('([a-z0-9])([A-Z])')
@ -29,3 +35,43 @@ def to_roman(number: int) -> str:
return roman.toRoman(number)
except (roman.NotIntegerError, roman.OutOfRangeError):
return str(number)
def get_element_id(collection_string: str, id: int) -> str:
"""
Returns a combined string from the collection_string and an id.
"""
return "{}:{}".format(collection_string, id)
def split_element_id(element_id: Union[str, bytes]) -> Tuple[str, int]:
"""
Splits a combined element_id into the collection_string and the id.
"""
if isinstance(element_id, bytes):
element_id = element_id.decode()
collection_str, id = element_id.rsplit(":", 1)
return (collection_str, int(id))
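# Round trip of the two helpers above, e.g.:
#
# assert get_element_id('users/user', 5) == 'users/user:5'
# assert split_element_id(b'users/user:5') == ('users/user', 5)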
def get_user_id(user: Optional['CollectionElement']) -> int:
"""
    Returns the user id for a CollectionElement user.
Returns 0 for anonymous.
"""
if user is None:
user_id = 0
else:
user_id = user.id
return user_id
def str_dict_to_bytes(str_dict: Dict[str, str]) -> Dict[bytes, bytes]:
"""
    Converts the keys and values of a dict from str to bytes.
"""
out = {}
for key, value in str_dict.items():
out[key.encode()] = value.encode()
return out

View File

@ -1,5 +1,6 @@
import bleach
allowed_tags = [
'a', 'img', # links and images
'br', 'p', 'span', 'blockquote', # text layout

View File

@ -6,6 +6,8 @@ coverage
#flake8
# Use flake8 master until flake8 3.6, which supports python 3.7, is released
git+https://gitlab.com/pycqa/flake8.git
isort==4.2.5
isort
mypy
fakeredis
pytest-django
pytest-asyncio
pytest-cov

View File

@ -2,8 +2,7 @@
-r requirements_production.txt
# Requirements for Redis and PostgreSQL support
asgi-redis>=1.3,<1.5
django-redis>=4.7.0,<4.10
channels-redis>=2.2,<2.3
django-redis-sessions>=0.6.1,<0.7
psycopg2-binary>=2.7,<2.8
txredisapi==1.4.4
psycopg2-binary>=2.7.3.2,<2.8
aioredis>=1.1.0,<1.2

View File

@ -1,8 +1,8 @@
# Requirements for OpenSlides in production in alphabetical order
bleach>=1.5.0,<2.2
channels>=1.1,<1.2
daphne<2
Django>=1.10.4,<2.2
channels>=2.1.2,<2.2
daphne>=2.2,<2.3
Django>=1.11,<2.2
djangorestframework>=3.4,<3.9
jsonfield2>=3.0,<3.1
mypy_extensions>=0.3,<0.4

View File

@ -13,6 +13,8 @@ max_line_length = 150
[isort]
include_trailing_comma = true
multi_line_output = 3
lines_after_imports = 2
combine_as_imports = true
[mypy]
ignore_missing_imports = true
@ -25,5 +27,6 @@ disallow_untyped_defs = true
[mypy-openslides.core.config]
disallow_untyped_defs = true
[mypy-tests.*]
ignore_errors = true
[tool:pytest]
DJANGO_SETTINGS_MODULE = tests.settings
testpaths = tests/

40
tests/conftest.py Normal file
View File

@ -0,0 +1,40 @@
from django.test import TestCase, TransactionTestCase
from pytest_django.plugin import validate_django_db
def pytest_collection_modifyitems(items):
"""
Helper until https://github.com/pytest-dev/pytest-django/issues/214 is fixed.
"""
def get_marker_transaction(test):
marker = test.get_marker('django_db')
if marker:
validate_django_db(marker)
return marker.kwargs['transaction']
return None
def has_fixture(test, fixture):
funcargnames = getattr(test, 'funcargnames', None)
return funcargnames and fixture in funcargnames
def weight_test_case(test):
"""
Key function for ordering test cases like the Django test runner.
"""
is_test_case_subclass = test.cls and issubclass(test.cls, TestCase)
is_transaction_test_case_subclass = test.cls and issubclass(test.cls, TransactionTestCase)
if is_test_case_subclass or get_marker_transaction(test) is False:
return 0
elif has_fixture(test, 'db'):
return 0
if is_transaction_test_case_subclass or get_marker_transaction(test) is True:
return 1
elif has_fixture(test, 'transactional_db'):
return 1
return 0
items.sort(key=weight_test_case)

View File

@ -12,6 +12,7 @@ from openslides.motions.models import Motion
from openslides.topics.models import Topic
from openslides.users.models import Group, User
MOTION_NUMBER_OF_PARAGRAPHS = 4
LOREM_IPSUM = [

View File

@ -1,8 +1,8 @@
import pytest
from django.contrib.auth import get_user_model
from django.contrib.auth.models import Permission
from django.urls import reverse
from django.utils.translation import ugettext
from django_redis import get_redis_connection
from rest_framework import status
from rest_framework.test import APIClient
@ -12,10 +12,11 @@ from openslides.core.config import config
from openslides.core.models import Countdown
from openslides.motions.models import Motion
from openslides.topics.models import Topic
from openslides.users.models import User
from openslides.utils.collection import CollectionElement
from openslides.utils.test import TestCase
from ..helpers import count_queries
class RetrieveItem(TestCase):
"""
@ -89,17 +90,18 @@ class RetrieveItem(TestCase):
self.assertTrue(response.data.get('comment') is None)
class TestDBQueries(TestCase):
@pytest.mark.django_db(transaction=False)
def test_agenda_item_db_queries():
"""
    Tests that receiving elements only need the required db queries.
    Therefore in setup some agenda items are created and received with different
    user accounts.
    Tests that only the following db queries are done:
    * 1 request to get the list of all agenda items,
    * 1 request to get all speakers,
    * 3 requests to get the assignments, motions and topics and
    * 1 request to get an agenda item (why?)
    * 2 requests for the motion versions.
    TODO: The last three requests are a bug.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
for index in range(10):
Topic.objects.create(title='topic{}'.format(index))
parent = Topic.objects.create(title='parent').agenda_item
@ -110,42 +112,7 @@ class TestDBQueries(TestCase):
Motion.objects.create(title='motion2')
Assignment.objects.create(title='assignment', open_posts=5)
def test_admin(self):
"""
Tests that only the following db queries are done:
        * 7 requests to get the session and the request user with its permissions,
        * 1 request to get the list of all agenda items,
        * 1 request to get all speakers,
        * 3 requests to get the assignments, motions and topics and
        * 1 request to get an agenda item (why?)
        * 2 requests for the motion versions.
        TODO: The last two requests for the motion versions are a bug.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection("default").flushall()
with self.assertNumQueries(15):
self.client.get(reverse('item-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
        * 1 request to get the list of all agenda items,
        * 1 request to get all speakers,
        * 3 requests to get the assignments, motions and topics and
        * 1 request to get an agenda item (why?)
        * 2 requests for the motion versions.
        TODO: The last two requests for the motion versions are a bug.
"""
get_redis_connection("default").flushall()
with self.assertNumQueries(11):
self.client.get(reverse('item-list'))
assert count_queries(Item.get_elements) == 8
class ManageSpeaker(TestCase):

View File

@ -1,34 +1,19 @@
import pytest
from django.contrib.auth import get_user_model
from django.urls import reverse
from django_redis import get_redis_connection
from rest_framework import status
from rest_framework.test import APIClient
from openslides.assignments.models import Assignment
from openslides.core.config import config
from openslides.users.models import User
from openslides.utils.test import TestCase
from ..helpers import count_queries
class TestDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some assignments are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Assignment.objects.create(title='motion{}'.format(index), open_posts=1)
def test_admin(self):
@pytest.mark.django_db(transaction=False)
def test_assignment_db_queries():
"""
Tests that only the following db queries are done:
    * 7 requests to get the session and the request user with its permissions,
    * 1 request to get the list of all assignments,
* 1 request to get all related users,
* 1 request to get the agenda item,
@ -39,28 +24,10 @@ class TestDBQueries(TestCase):
    TODO: The last requests are a bug.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection("default").flushall()
with self.assertNumQueries(22):
self.client.get(reverse('assignment-list'))
for index in range(10):
Assignment.objects.create(title='assignment{}'.format(index), open_posts=1)
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
        * 1 request to get the list of all assignments,
        * 1 request to get all related users,
        * 1 request to get the agenda item,
        * 1 request to get the polls,
        * 1 request to get the tags and
        * 10 requests to fetch each related user again.
        TODO: The last 10 requests are a bug.
"""
get_redis_connection("default").flushall()
with self.assertNumQueries(18):
self.client.get(reverse('assignment-list'))
assert count_queries(Assignment.get_elements) == 15
class CanidatureSelf(TestCase):
@ -110,7 +77,6 @@ class CanidatureSelf(TestCase):
group_delegates = type(group_admin).objects.get(name='Delegates')
admin.groups.add(group_delegates)
admin.groups.remove(group_admin)
get_redis_connection('default').flushall()
response = self.client.post(reverse('assignment-candidature-self', args=[self.assignment.pk]))
@ -157,7 +123,6 @@ class CanidatureSelf(TestCase):
group_delegates = type(group_admin).objects.get(name='Delegates')
admin.groups.add(group_delegates)
admin.groups.remove(group_admin)
get_redis_connection('default').flushall()
response = self.client.delete(reverse('assignment-candidature-self', args=[self.assignment.pk]))
@ -238,7 +203,6 @@ class CandidatureOther(TestCase):
group_delegates = type(group_admin).objects.get(name='Delegates')
admin.groups.add(group_delegates)
admin.groups.remove(group_admin)
get_redis_connection('default').flushall()
response = self.client.post(
reverse('assignment-candidature-other', args=[self.assignment.pk]),
@ -294,7 +258,6 @@ class CandidatureOther(TestCase):
group_delegates = type(group_admin).objects.get(name='Delegates')
admin.groups.add(group_delegates)
admin.groups.remove(group_admin)
get_redis_connection('default').flushall()
response = self.client.delete(
reverse('assignment-candidature-other', args=[self.assignment.pk]),

View File

@ -5,9 +5,11 @@ from django.urls import reverse
from rest_framework import status
from rest_framework.test import APIClient
from openslides import __license__ as license
from openslides import __url__ as url
from openslides import __version__ as version
from openslides import (
__license__ as license,
__url__ as url,
__version__ as version,
)
from openslides.core.config import ConfigVariable, config
from openslides.core.models import Projector
from openslides.topics.models import Topic
@ -114,7 +116,7 @@ class ConfigViewSet(TestCase):
# Save the old value of the config object and add the test values
# TODO: Can be changed to setUpClass when Django 1.8 is no longer supported
self._config_values = config.config_variables.copy()
config.key_to_id = {}
config.key_to_id = None
config.update_config_variables(set_simple_config_view_integration_config_test())
config.save_default_values()

View File

@ -1,150 +1,62 @@
import pytest
from django.urls import reverse
from django_redis import get_redis_connection
from rest_framework import status
from rest_framework.test import APIClient
from openslides.core.config import config
from openslides.core.models import ChatMessage, Projector, Tag
from openslides.users.models import User
from openslides.utils.test import TestCase
from ..helpers import count_queries
class TestProjectorDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Projector.objects.create(name="Projector{}".format(index))
def test_admin(self):
@pytest.mark.django_db(transaction=False)
def test_projector_db_queries():
"""
Tests that only the following db queries are done:
    * 7 requests to get the session and the request user with its permissions,
    * 1 request to get the list of all projectors,
* 1 request to get the list of the projector defaults.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection("default").flushall()
with self.assertNumQueries(9):
self.client.get(reverse('projector-list'))
for index in range(10):
Projector.objects.create(name="Projector{}".format(index))
def test_anonymous(self):
assert count_queries(Projector.get_elements) == 2
@pytest.mark.django_db(transaction=False)
def test_chat_message_db_queries():
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
    * 1 request to get the list of all projectors,
    * 1 request to get the list of the projector defaults and
    * 1 request to get the list of all chatmessages.
"""
get_redis_connection("default").flushall()
with self.assertNumQueries(5):
self.client.get(reverse('projector-list'))
class TestCharmessageDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
user = User.objects.get(pk=1)
user = User.objects.get(username='admin')
for index in range(10):
ChatMessage.objects.create(user=user)
def test_admin(self):
assert count_queries(ChatMessage.get_elements) == 1
@pytest.mark.django_db(transaction=False)
def test_tag_db_queries():
"""
Tests that only the following db queries are done:
    * 7 requests to get the session and the request user with its permissions,
    * 1 request to get the list of all chatmessages,
    * 1 request to get the list of all tags.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection("default").flushall()
with self.assertNumQueries(8):
self.client.get(reverse('chatmessage-list'))
class TestTagDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Tag.objects.create(name='tag{}'.format(index))
def test_admin(self):
assert count_queries(Tag.get_elements) == 1
@pytest.mark.django_db(transaction=False)
def test_config_db_queries():
"""
Tests that only the following db queries are done:
    * 5 requests to get the session and the request user with its permissions,
    * 1 request to get the list of all tags,
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection("default").flushall()
with self.assertNumQueries(6):
self.client.get(reverse('tag-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
        * 1 request to see if anonymous is enabled
        * 1 request to get the list of all projectors,
"""
get_redis_connection("default").flushall()
with self.assertNumQueries(2):
self.client.get(reverse('tag-list'))
class TestConfigDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
def test_admin(self):
"""
Tests that only the following db queries are done:
        * 5 requests to get the session and the request user with its permissions and
        * 1 request to get the list of all config values
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection("default").flushall()
with self.assertNumQueries(6):
self.client.get(reverse('config-list'))
config.save_default_values()
def test_anonymous(self):
"""
Tests that only the following db queries are done:
        * 1 request to see if anonymous is enabled and get all config values
"""
get_redis_connection("default").flushall()
with self.assertNumQueries(1):
self.client.get(reverse('config-list'))
assert count_queries(Tag.get_elements) == 1
class ChatMessageViewSet(TestCase):
@ -152,7 +64,7 @@ class ChatMessageViewSet(TestCase):
Tests requests to deal with chat messages.
"""
def setUp(self):
admin = User.objects.get(pk=1)
admin = User.objects.get(username='admin')
self.client.force_login(admin)
ChatMessage.objects.create(message='test_message_peechiel8IeZoohaem9e', user=admin)

View File

@ -0,0 +1,73 @@
from typing import Any, Dict, List
from asgiref.sync import sync_to_async
from django.db import DEFAULT_DB_ALIAS, connections
from django.test.utils import CaptureQueriesContext
from openslides.core.config import config
from openslides.users.models import User
from openslides.utils.autoupdate import inform_data_collection_element_list
from openslides.utils.cache import element_cache, get_element_id
from openslides.utils.cache_providers import Cachable
from openslides.utils.collection import CollectionElement
class TConfig(Cachable):
"""
Cachable, that fills the cache with the default values of the config variables.
"""
def get_collection_string(self) -> str:
return config.get_collection_string()
def get_elements(self) -> List[Dict[str, Any]]:
elements = []
config.key_to_id = {}
for id, item in enumerate(config.config_variables.values()):
elements.append({'id': id+1, 'key': item.name, 'value': item.default_value})
config.key_to_id[item.name] = id+1
return elements
class TUser(Cachable):
"""
    Cachable, that fills the cache with an example admin user.
"""
def get_collection_string(self) -> str:
return User.get_collection_string()
def get_elements(self) -> List[Dict[str, Any]]:
return [
{'id': 1, 'username': 'admin', 'title': '', 'first_name': '',
'last_name': 'Administrator', 'structure_level': '', 'number': '', 'about_me': '',
'groups_id': [4], 'is_present': False, 'is_committee': False, 'email': '',
'last_email_send': None, 'comment': '', 'is_active': True, 'default_password': 'admin',
'session_auth_hash': '362d4f2de1463293cb3aaba7727c967c35de43ee'}]
async def set_config(key, value):
"""
Set a config variable in the element_cache without hitting the database.
"""
if not await element_cache.exists_full_data():
        # Ensure that the cache exists and the default values of the config are in it.
await element_cache.build_full_data()
collection_string = config.get_collection_string()
config_id = config.key_to_id[key] # type: ignore
full_data = {'id': config_id, 'key': key, 'value': value}
await element_cache.change_elements({get_element_id(collection_string, config_id): full_data})
await sync_to_async(inform_data_collection_element_list)([
CollectionElement.from_values(collection_string, config_id, full_data=full_data)])
def count_queries(func, *args, **kwargs) -> int:
context = CaptureQueriesContext(connections[DEFAULT_DB_ALIAS])
with context:
func(*args, **kwargs)
print("%d queries executed\nCaptured queries were:\n%s" % (
len(context),
'\n'.join(
'%d. %s' % (i, query['sql']) for i, query in enumerate(context.captured_queries, start=1))))
return len(context)
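# Usage sketch: the integration tests in this commit assert on the return
# value, for example:
#
# assert count_queries(Topic.get_elements) == 3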

View File

@ -1,26 +1,17 @@
import pytest
from django.core.files.uploadedfile import SimpleUploadedFile
from django.urls import reverse
from django_redis import get_redis_connection
from rest_framework.test import APIClient
from openslides.core.config import config
from openslides.mediafiles.models import Mediafile
from openslides.users.models import User
from openslides.utils.test import TestCase
from ..helpers import count_queries
class TestDBQueries(TestCase):
@pytest.mark.django_db(transaction=False)
def test_mediafiles_db_queries():
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
Tests that only the following db queries are done:
    * 1 request to get the list of all files.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Mediafile.objects.create(
title='some_file{}'.format(index),
@ -28,23 +19,4 @@ class TestDBQueries(TestCase):
'some_file{}'.format(index),
b'some content.'))
def test_admin(self):
"""
Tests that only the following db queries are done:
        * 7 requests to get the session and the request user with its permissions and
        * 1 request to get the list of all files.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(8):
self.client.get(reverse('mediafile-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous and
        * 1 request to get the list of all projectors.
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(4):
self.client.get(reverse('mediafile-list'))
assert count_queries(Mediafile.get_elements) == 1

View File

@ -1,9 +1,9 @@
import json
import pytest
from django.contrib.auth import get_user_model
from django.contrib.auth.models import Permission
from django.urls import reverse
from django_redis import get_redis_connection
from rest_framework import status
from rest_framework.test import APIClient
@ -22,19 +22,22 @@ from openslides.users.models import Group
from openslides.utils.collection import CollectionElement
from openslides.utils.test import TestCase
from ..helpers import count_queries
class TestMotionDBQueries(TestCase):
@pytest.mark.django_db(transaction=False)
def test_motion_db_queries():
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
Tests that only the following db queries are done:
    * 1 request to get the list of all motions,
* 1 request to get the motion versions,
* 1 request to get the agenda item,
* 1 request to get the motion log,
* 1 request to get the polls,
* 1 request to get the attachments,
* 1 request to get the tags,
* 2 requests to get the submitters and supporters.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Motion.objects.create(title='motion{}'.format(index))
get_user_model().objects.create_user(
@ -42,114 +45,31 @@ class TestMotionDBQueries(TestCase):
password='password')
# TODO: Create some polls etc.
def test_admin(self):
assert count_queries(Motion.get_elements) == 9
@pytest.mark.django_db(transaction=False)
def test_category_db_queries():
"""
Tests that only the following db queries are done:
    * 7 requests to get the session and the request user with its permissions,
    * 1 request to get the list of all motions,
* 1 request to get the motion versions,
* 1 request to get the agenda item,
* 1 request to get the motion log,
* 1 request to get the polls,
* 1 request to get the attachments,
* 1 request to get the tags,
* 2 requests to get the submitters and supporters.
    * 1 request to get the list of all categories.
"""
self.client.force_login(get_user_model().objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(16):
self.client.get(reverse('motion-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
        * 1 request to get the list of all motions,
* 1 request to get the motion versions,
* 1 request to get the agenda item,
* 1 request to get the motion log,
* 1 request to get the polls,
* 1 request to get the attachments,
* 1 request to get the tags,
* 2 requests to get the submitters and supporters.
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(12):
self.client.get(reverse('motion-list'))
class TestCategoryDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config.save_default_values()
config['general_system_enable_anonymous'] = True
for index in range(10):
Category.objects.create(name='category{}'.format(index))
def test_admin(self):
assert count_queries(Category.get_elements) == 1
@pytest.mark.django_db(transaction=False)
def test_workflow_db_queries():
"""
Tests that only the following db queries are done:
    * 7 requests to get the session and the request user with its permissions and
    * 1 request to get the list of all categories.
"""
self.client.force_login(get_user_model().objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(8):
self.client.get(reverse('category-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous (config and permissions)
        * 1 request to get the list of all motions and
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(4):
self.client.get(reverse('category-list'))
class TestWorkflowDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
"""
def setUp(self):
self.client = APIClient()
config.save_default_values()
config['general_system_enable_anonymous'] = True
        # There is no need to create more workflows
def test_admin(self):
"""
Tests that only the following db queries are done:
        * 7 requests to get the session and the request user with its permissions,
        * 1 request to get the list of all workflows,
* 1 request to get all states and
* 1 request to get the next states of all states.
"""
self.client.force_login(get_user_model().objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(10):
self.client.get(reverse('workflow-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
        * 1 request to get the list of all workflows,
* 1 request to get all states and
* 1 request to get the next states of all states.
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(6):
self.client.get(reverse('workflow-list'))
assert count_queries(Workflow.get_elements) == 3
class CreateMotion(TestCase):
@ -328,6 +248,10 @@ class CreateMotion(TestCase):
content_type__app_label='motions',
codename='can_manage_comments',
))
group_delegate.permissions.add(Permission.objects.get(
content_type__app_label='motions',
codename='can_see_comments',
))
response = self.client.post(
reverse('motion-list'),
@ -383,7 +307,6 @@ class CreateMotion(TestCase):
self.admin = get_user_model().objects.get(username='admin')
self.admin.groups.add(2)
self.admin.groups.remove(4)
get_redis_connection('default').flushall()
response = self.client.post(
reverse('motion-list'),
@ -424,24 +347,6 @@ class RetrieveMotion(TestCase):
username='user_{}'.format(index),
password='password')
def test_number_of_queries(self):
"""
Tests that only the following db queries are done:
* 7 requests to get the session and the request user with its permissions (3 of them are possibly a bug)
* 1 request to get the motion,
* 1 request to get the version,
* 1 request to get the agenda item,
* 1 request to get the log,
        * 3 requests to get the polls (1 of them is possibly a bug),
* 1 request to get the attachments,
* 1 request to get the tags,
* 2 requests to get the submitters and supporters.
TODO: Fix all bugs.
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(18):
self.client.get(reverse('motion-detail', args=[self.motion.pk]))
def test_guest_state_with_required_permission_to_see(self):
config['general_system_enable_anonymous'] = True
guest_client = APIClient()
@ -450,7 +355,7 @@ class RetrieveMotion(TestCase):
state.save()
# The cache has to be cleared, see:
# https://github.com/OpenSlides/OpenSlides/issues/3396
get_redis_connection('default').flushall()
response = guest_client.get(reverse('motion-detail', args=[self.motion.pk]))
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
@ -484,7 +389,6 @@ class RetrieveMotion(TestCase):
group.permissions.remove(permission)
config['general_system_enable_anonymous'] = True
guest_client = APIClient()
get_redis_connection('default').flushall()
response_1 = guest_client.get(reverse('motion-detail', args=[self.motion.pk]))
self.assertEqual(response_1.status_code, status.HTTP_200_OK)
@ -495,7 +399,7 @@ class RetrieveMotion(TestCase):
extra_user = get_user_model().objects.create_user(
username='username_wequePhieFoom0hai3wa',
password='password_ooth7taechai5Oocieya')
get_redis_connection('default').flushall()
response_3 = guest_client.get(reverse('user-detail', args=[extra_user.pk]))
self.assertEqual(response_3.status_code, status.HTTP_403_FORBIDDEN)
@ -576,7 +480,6 @@ class UpdateMotion(TestCase):
self.motion.supporters.add(supporter)
config['motions_remove_supporters'] = True
self.assertEqual(self.motion.supporters.count(), 1)
get_redis_connection('default').flushall()
response = self.client.patch(
reverse('motion-detail', args=[self.motion.pk]),
@ -939,7 +842,7 @@ class SupportMotion(TestCase):
def test_support(self):
config['motions_min_supporters'] = 1
get_redis_connection('default').flushall()
response = self.client.post(reverse('motion-support', args=[self.motion.pk]))
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data, {'detail': 'You have supported this motion successfully.'})

View File

@ -1,54 +1,26 @@
from django.contrib.auth import get_user_model
import pytest
from django.urls import reverse
from django_redis import get_redis_connection
from rest_framework import status
from rest_framework.test import APIClient
from openslides.agenda.models import Item
from openslides.core.config import config
from openslides.topics.models import Topic
from openslides.utils.test import TestCase
from ..helpers import count_queries
class TestDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some topics are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Topic.objects.create(title='topic-{}'.format(index))
def test_admin(self):
@pytest.mark.django_db(transaction=False)
def test_topic_item_db_queries():
"""
Tests that only the following db queries are done:
    * 7 requests to get the session and the request user with its permissions,
    * 1 request to get the list of all topics,
* 1 request to get attachments,
* 1 request to get the agenda item
"""
self.client.force_login(get_user_model().objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(10):
self.client.get(reverse('topic-list'))
for index in range(10):
Topic.objects.create(title='topic-{}'.format(index))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
        * 1 request to get the list of all topics,
* 1 request to get attachments,
* 1 request to get the agenda item,
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(6):
self.client.get(reverse('topic-list'))
assert count_queries(Topic.get_elements) == 3
class TopicCreate(TestCase):

View File

@ -1,6 +1,6 @@
import pytest
from django.core import mail
from django.urls import reverse
from django_redis import get_redis_connection
from rest_framework import status
from rest_framework.test import APIClient
@ -9,84 +9,33 @@ from openslides.users.models import Group, PersonalNote, User
from openslides.users.serializers import UserFullSerializer
from openslides.utils.test import TestCase
from ..helpers import count_queries
class TestUserDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
User.objects.create(username='user{}'.format(index))
def test_admin(self):
@pytest.mark.django_db(transaction=False)
def test_user_db_queries():
"""
Tests that only the following db queries are done:
* 2 requests to get the session and the request user with its permissions,
* 2 requests to get the list of all users and
    * 1 request to get the list of all groups.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(7):
self.client.get(reverse('user-list'))
for index in range(10):
User.objects.create(username='user{}'.format(index))
def test_anonymous(self):
assert count_queries(User.get_elements) == 3
@pytest.mark.django_db(transaction=False)
def test_group_db_queries():
"""
Tests that only the following db queries are done:
* 3 requests to get the permission for anonymous,
    * 1 request to get the list of all users and
    * 2 requests to get all groups (needed by the user serializer).
* 1 request to get the list of all groups.
* 1 request to get the permissions
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(6):
self.client.get(reverse('user-list'))
class TestGroupDBQueries(TestCase):
"""
Tests that receiving elements only need the required db queries.
Therefore in setup some objects are created and received with different
user accounts.
"""
def setUp(self):
self.client = APIClient()
config['general_system_enable_anonymous'] = True
config.save_default_values()
for index in range(10):
Group.objects.create(name='group{}'.format(index))
def test_admin(self):
"""
Tests that only the following db queries are done:
        * 6 requests to get the session and the request user with its permissions and
        * 1 request to get the list of all groups.
        The data of the groups was loaded when the admin was authenticated. So
        only the list of all groups has to be fetched from the db.
"""
self.client.force_login(User.objects.get(pk=1))
get_redis_connection('default').flushall()
with self.assertNumQueries(7):
self.client.get(reverse('group-list'))
def test_anonymous(self):
"""
Tests that only the following db queries are done:
        * 1 request to find out if anonymous is enabled
        * 2 requests to get the list of all groups and
"""
get_redis_connection('default').flushall()
with self.assertNumQueries(3):
self.client.get(reverse('group-list'))
assert count_queries(Group.get_elements) == 2
class UserGetTest(TestCase):
@ -98,7 +47,7 @@ class UserGetTest(TestCase):
It is invalid, that a user is in the group with the pk 1. But if the
database is invalid, the user should nevertheless be received.
"""
admin = User.objects.get(pk=1)
admin = User.objects.get(username='admin')
group1 = Group.objects.get(pk=1)
admin.groups.add(group1)
self.client.login(username='admin', password='admin')
@ -178,7 +127,7 @@ class UserUpdate(TestCase):
admin_client = APIClient()
admin_client.login(username='admin', password='admin')
# This is the builtin user 'Administrator' with username 'admin'. The pk is valid.
user_pk = 1
user_pk = User.objects.get(username='admin').pk
response = admin_client.patch(
reverse('user-detail', args=[user_pk]),
@ -198,14 +147,14 @@ class UserUpdate(TestCase):
admin_client = APIClient()
admin_client.login(username='admin', password='admin')
# This is the builtin user 'Administrator'. The pk is valid.
user_pk = 1
user_pk = User.objects.get(username='admin').pk
response = admin_client.put(
reverse('user-detail', args=[user_pk]),
{'last_name': 'New name Ohy4eeyei5'})
self.assertEqual(response.status_code, 200)
self.assertEqual(User.objects.get(pk=1).username, 'New name Ohy4eeyei5')
self.assertEqual(User.objects.get(pk=user_pk).username, 'New name Ohy4eeyei5')
def test_update_deactivate_yourselfself(self):
"""
@ -214,7 +163,7 @@ class UserUpdate(TestCase):
admin_client = APIClient()
admin_client.login(username='admin', password='admin')
# This is the builtin user 'Administrator'. The pk is valid.
user_pk = 1
user_pk = User.objects.get(username='admin').pk
response = admin_client.patch(
reverse('user-detail', args=[user_pk]),
@ -581,7 +530,7 @@ class PersonalNoteTest(TestCase):
Tests for PersonalNote model.
"""
def test_anonymous_without_personal_notes(self):
admin = User.objects.get(pk=1)
admin = User.objects.get(username='admin')
personal_note = PersonalNote.objects.create(user=admin, notes='["admin_personal_note_OoGh8choro0oosh0roob"]')
config['general_system_enable_anonymous'] = True
guest_client = APIClient()

View File

@ -1,17 +1,17 @@
from channels.tests import ChannelTestCase as TestCase
from django_redis import get_redis_connection
from unittest import skip
from openslides.topics.models import Topic
from openslides.utils import collection
from openslides.utils.test import TestCase
class TestCollectionElementCache(TestCase):
@skip("Does not work as long as caching does not work in the tests")
def test_clean_cache(self):
"""
Tests that the data is retrieved from the database.
"""
topic = Topic.objects.create(title='test topic')
get_redis_connection("default").flushall()
with self.assertNumQueries(3):
collection_element = collection.CollectionElement.from_values('topics/topic', 1)
@ -19,6 +19,7 @@ class TestCollectionElementCache(TestCase):
self.assertEqual(topic.title, instance['title'])
@skip("Does not work as long as caching does not work in the tests")
def test_with_cache(self):
"""
        Tests that no db query is used when the value is in the cache.
@ -43,6 +44,7 @@ class TestCollectionElementCache(TestCase):
collection.CollectionElement.from_values('topics/topic', 999)
@skip("Does not work as long as caching does not work in the tests")
class TestCollectionCache(TestCase):
def test_clean_cache(self):
"""
@ -52,7 +54,6 @@ class TestCollectionCache(TestCase):
Topic.objects.create(title='test topic2')
Topic.objects.create(title='test topic3')
topic_collection = collection.Collection('topics/topic')
get_redis_connection("default").flushall()
with self.assertNumQueries(3):
instance_list = list(topic_collection.get_full_data())
@ -62,7 +63,6 @@ class TestCollectionCache(TestCase):
"""
Tests that no db query is used when the list is received twice.
"""
get_redis_connection("default").flushall()
Topic.objects.create(title='test topic1')
Topic.objects.create(title='test topic2')
Topic.objects.create(title='test topic3')

View File

@ -0,0 +1,185 @@
from importlib import import_module
import pytest
from asgiref.sync import sync_to_async
from channels.testing import WebsocketCommunicator
from django.conf import settings
from django.contrib.auth import (
BACKEND_SESSION_KEY,
HASH_SESSION_KEY,
SESSION_KEY,
)
from openslides.asgi import application
from openslides.core.config import config
from openslides.utils.autoupdate import inform_deleted_data
from openslides.utils.cache import element_cache
from ...unit.utils.cache_provider import (
Collection1,
Collection2,
get_cachable_provider,
)
from ..helpers import TConfig, TUser, set_config
@pytest.fixture(autouse=True)
def prepare_element_cache(settings):
"""
Resets the element cache.
    Uses a cachable_provider with example data for the tests.
"""
settings.SKIP_CACHE = False
element_cache.cache_provider.clear_cache()
orig_cachable_provider = element_cache.cachable_provider
element_cache.cachable_provider = get_cachable_provider([Collection1(), Collection2(), TConfig(), TUser()])
element_cache._cachables = None
yield
# Reset the cachable_provider
element_cache.cachable_provider = orig_cachable_provider
element_cache._cachables = None
element_cache.cache_provider.clear_cache()
@pytest.fixture
def communicator(request, event_loop):
communicator = WebsocketCommunicator(application, "/ws/site/")
    # This style is needed for python 3.5. Use the generator style when 3.5 is dropped.
def fin():
async def afin():
await communicator.disconnect()
event_loop.run_until_complete(afin())
request.addfinalizer(fin)
return communicator
@pytest.mark.asyncio
async def test_normal_connection(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
response = await communicator.receive_json_from()
# Test, that there is a lot of startup data.
assert len(response) > 5
@pytest.mark.asyncio
async def test_receive_changed_data(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
await communicator.receive_json_from()
# Change a config value after the startup data has been received
await set_config('general_event_name', 'Test Event')
response = await communicator.receive_json_from()
id = config.get_key_to_id()['general_event_name']
assert response == [
{'action': 'changed',
'collection': 'core/config',
'data': {'id': id, 'key': 'general_event_name', 'value': 'Test Event'},
'id': id}]
@pytest.mark.asyncio
async def test_anonymous_disabled(communicator):
connected, __ = await communicator.connect()
assert not connected
@pytest.mark.asyncio
async def test_with_user():
# login user with id 1
engine = import_module(settings.SESSION_ENGINE)
session = engine.SessionStore() # type: ignore
session[SESSION_KEY] = '1'
session[HASH_SESSION_KEY] = '362d4f2de1463293cb3aaba7727c967c35de43ee' # see helpers.TUser
session[BACKEND_SESSION_KEY] = 'django.contrib.auth.backends.ModelBackend'
session.save()
scn = settings.SESSION_COOKIE_NAME
cookies = (b'cookie', '{}={}'.format(scn, session.session_key).encode())
communicator = WebsocketCommunicator(application, "/ws/site/", headers=[cookies])
connected, __ = await communicator.connect()
assert connected
await communicator.disconnect()
@pytest.mark.asyncio
async def test_receive_deleted_data(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
await communicator.receive_json_from()
# Delete test element
await sync_to_async(inform_deleted_data)([(Collection1().get_collection_string(), 1)])
response = await communicator.receive_json_from()
assert response == [{'action': 'deleted', 'collection': Collection1().get_collection_string(), 'id': 1}]
@pytest.mark.asyncio
async def test_send_invalid_notify_not_a_list(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
# Await the startup data
await communicator.receive_json_from()
await communicator.send_json_to({'testmessage': 'foobar, what else.'})
response = await communicator.receive_json_from()
assert response == {'error': 'invalid message'}
@pytest.mark.asyncio
async def test_send_invalid_notify_no_elements(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
# Await the startup data
await communicator.receive_json_from()
await communicator.send_json_to([])
response = await communicator.receive_json_from()
assert response == {'error': 'invalid message'}
@pytest.mark.asyncio
async def test_send_invalid_notify_str_in_list(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
# Await the startup data
await communicator.receive_json_from()
await communicator.send_json_to([{}, 'testmessage'])
response = await communicator.receive_json_from()
assert response == {'error': 'invalid message'}
@pytest.mark.asyncio
async def test_send_valid_notify(communicator):
await set_config('general_system_enable_anonymous', True)
await communicator.connect()
# Await the startup data
await communicator.receive_json_from()
await communicator.send_json_to([{'testmessage': 'foobar, what else.'}])
response = await communicator.receive_json_from()
assert isinstance(response, list)
assert len(response) == 1
assert response[0]['testmessage'] == 'foobar, what else.'
assert 'senderReplyChannelName' in response[0]
assert response[0]['senderUserId'] == 0

View File

@ -1,11 +1,11 @@
from django_redis import get_redis_connection
from openslides.core.config import ConfigVariable, config
from openslides.core.exceptions import ConfigError, ConfigNotFound
from openslides.utils.test import TestCase
class TestConfigException(Exception):
class TTestConfigException(Exception):
pass
@ -57,7 +57,6 @@ class HandleConfigTest(TestCase):
def test_change_config_value(self):
self.assertEqual(config['string_var'], 'default_string_rien4ooCZieng6ah')
config['string_var'] = 'other_special_unique_string dauTex9eAiy7jeen'
get_redis_connection('default').flushall()
self.assertEqual(config['string_var'], 'other_special_unique_string dauTex9eAiy7jeen')
def test_missing_cache_(self):
@ -79,7 +78,7 @@ class HandleConfigTest(TestCase):
message.
"""
with self.assertRaisesMessage(
TestConfigException,
TTestConfigException,
'Change callback dhcnfg34dlg06kdg successfully called.'):
self.set_config_var(
key='var_with_callback_ghvnfjd5768gdfkwg0hm2',
@ -155,7 +154,7 @@ def set_simple_config_collection_disabled_view():
def set_simple_config_collection_with_callback():
def callback():
raise TestConfigException('Change callback dhcnfg34dlg06kdg successfully called.')
raise TTestConfigException('Change callback dhcnfg34dlg06kdg successfully called.')
yield ConfigVariable(
name='var_with_callback_ghvnfjd5768gdfkwg0hm2',
default_value='',

View File

@ -1,5 +1,3 @@
from django_redis import get_redis_connection
from openslides.core.config import config
from openslides.motions.exceptions import WorkflowError
from openslides.motions.models import Motion, State, Workflow
@ -132,7 +130,6 @@ class ModelTest(TestCase):
def test_is_amendment(self):
config['motions_amendments_enabled'] = True
get_redis_connection('default').flushall()
amendment = Motion.objects.create(title='amendment', parent=self.motion)
self.assertTrue(amendment.is_amendment())
@ -153,7 +150,6 @@ class ModelTest(TestCase):
If the config is set to manually, the method does nothing.
"""
config['motions_identifier'] = 'manually'
get_redis_connection("default").flushall()
motion = Motion()
motion.set_identifier()
@ -169,7 +165,6 @@ class ModelTest(TestCase):
config['motions_amendments_enabled'] = True
self.motion.identifier = 'Parent identifier'
self.motion.save()
get_redis_connection("default").flushall()
motion = Motion(parent=self.motion)
motion.set_identifier()
@ -184,7 +179,6 @@ class ModelTest(TestCase):
config['motions_amendments_enabled'] = True
self.motion.identifier = 'Parent identifier'
self.motion.save()
get_redis_connection("default").flushall()
Motion.objects.create(title='Amendment1', parent=self.motion)
motion = Motion(parent=self.motion)

View File

@ -6,6 +6,7 @@ import os
from openslides.global_settings import * # noqa
# Path to the directory for user specific data files
OPENSLIDES_USER_DATA_PATH = os.path.realpath(os.path.dirname(os.path.abspath(__file__)))
@ -42,6 +43,16 @@ DATABASES = {
}
}
# Configure session in the cache
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
}
}
SESSION_ENGINE = "django.contrib.sessions.backends.cache"
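
Storing sessions in the local-memory cache keeps the test settings free of any redis dependency: sessions are per-process and wiped on restart, which is all a test run needs, and it is why the fakeredis-backed CACHES block further down can go away.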
# When use_redis is True, the restricted data cache caches the data individually
# for each user. This requires a lot of memory if there are many active
# users. If use_redis is False, this setting has no effect.
@ -72,20 +83,12 @@ MOTION_IDENTIFIER_MIN_DIGITS = 1
# Special settings only for testing
TEST_RUNNER = 'openslides.utils.test.OpenSlidesDiscoverRunner'
# Use a faster password hasher.
PASSWORD_HASHERS = [
'django.contrib.auth.hashers.MD5PasswordHasher',
]
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": "redis://127.0.0.1:6379/0",
"OPTIONS": {
"REDIS_CLIENT_CLASS": "fakeredis.FakeStrictRedis",
}
}
}
# At least in Django 2.1 and Channels 2.1, Django transactions cannot be shared
# between threads, so we have to skip the asyncio cache.
SKIP_CACHE = True

View File

@ -0,0 +1,82 @@
import asyncio # noqa
from typing import Any, Callable, Dict, List, Optional
from openslides.utils.cache_providers import Cachable, MemmoryCacheProvider
from openslides.utils.collection import CollectionElement # noqa
def restrict_elements(
user: Optional['CollectionElement'],
elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""
Adds the prefix 'restricted_' to all values except id.
"""
out = []
for element in elements:
restricted_element = {}
for key, value in element.items():
if key == 'id':
restricted_element[key] = value
else:
restricted_element[key] = 'restricted_{}'.format(value)
out.append(restricted_element)
return out
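
A quick illustration of the helper (hypothetical call, values taken from Collection1 below):

assert restrict_elements(None, [{'id': 1, 'value': 'value1'}]) == [
    {'id': 1, 'value': 'restricted_value1'}]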
class Collection1(Cachable):
def get_collection_string(self) -> str:
return 'app/collection1'
def get_elements(self) -> List[Dict[str, Any]]:
return [
{'id': 1, 'value': 'value1'},
{'id': 2, 'value': 'value2'}]
def restrict_elements(self, user: Optional['CollectionElement'], elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
return restrict_elements(user, elements)
class Collection2(Cachable):
def get_collection_string(self) -> str:
return 'app/collection2'
def get_elements(self) -> List[Dict[str, Any]]:
return [
{'id': 1, 'key': 'value1'},
{'id': 2, 'key': 'value2'}]
def restrict_elements(self, user: Optional['CollectionElement'], elements: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
return restrict_elements(user, elements)
def get_cachable_provider(cachables: List[Cachable] = [Collection1(), Collection2()]) -> Callable[[], List[Cachable]]:
"""
Returns a cachable_provider.
"""
return lambda: cachables
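
The mutable default argument is safe here because the list is only ever read, never mutated. Usage sketch:

provider = get_cachable_provider()
assert [c.get_collection_string() for c in provider()] == [
    'app/collection1', 'app/collection2']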
def example_data():
return {
'app/collection1': [
{'id': 1, 'value': 'value1'},
{'id': 2, 'value': 'value2'}],
'app/collection2': [
{'id': 1, 'key': 'value1'},
{'id': 2, 'key': 'value2'}]}
class TTestCacheProvider(MemmoryCacheProvider):
"""
CacheProvider similar to the MemmoryCacheProvider, with special methods for
testing.
"""
async def del_lock_restricted_data_after_wait(self, user_id: int, future: asyncio.Future = None) -> None:
if future is None:
asyncio.ensure_future(self.del_lock_restricted_data(user_id))
else:
async def set_future() -> None:
await self.del_lock_restricted_data(user_id)
future.set_result(1) # type: ignore
asyncio.ensure_future(set_future())
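
The helper releases the restricted-data lock from a scheduled task instead of immediately, so a test can take the lock, call the method under test, and have the lock vanish while that method is waiting; that simulates a second worker finishing its update. If a future is passed, it is resolved once the lock is gone, which the same-server locking test below uses to model an updater running in the same process.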

View File

@ -0,0 +1,483 @@
import asyncio
import json
from typing import Any, Dict, List
import pytest
from openslides.utils.cache import ElementCache
from .cache_provider import (
TTestCacheProvider,
example_data,
get_cachable_provider,
)
def decode_dict(encoded_dict: Dict[str, str]) -> Dict[str, Any]:
"""
Helper function that loads the json values of a dict.
"""
return {key: json.loads(value) for key, value in encoded_dict.items()}
def sort_dict(encoded_dict: Dict[str, List[Dict[str, Any]]]) -> Dict[str, List[Dict[str, Any]]]:
"""
Helper function that sorts the value of a dict.
"""
return {key: sorted(value, key=lambda x: x['id']) for key, value in encoded_dict.items()}
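
Both helpers exist because neither raw JSON strings nor the provider's return order are canonical; equality has to be checked on parsed, sorted values. For example (hypothetical values):

assert decode_dict({'k': '{"a": 1, "b": 2}'}) == decode_dict({'k': '{"b": 2, "a": 1}'})
assert sort_dict({'c': [{'id': 2}, {'id': 1}]}) == {'c': [{'id': 1}, {'id': 2}]}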
@pytest.fixture
def element_cache():
return ElementCache(
'test_redis',
cache_provider_class=TTestCacheProvider,
cachable_provider=get_cachable_provider(),
start_time=0)
@pytest.mark.asyncio
async def test_save_full_data(element_cache):
input_data = {
'app/collection1': [
{'id': 1, 'value': 'value1'},
{'id': 2, 'value': 'value2'}],
'app/collection2': [
{'id': 1, 'key': 'value1'},
{'id': 2, 'key': 'value2'}]}
calculated_data = {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}
await element_cache.save_full_data(input_data)
assert decode_dict(element_cache.cache_provider.full_data) == decode_dict(calculated_data)
@pytest.mark.asyncio
async def test_build_full_data(element_cache):
result = await element_cache.build_full_data()
assert result == example_data()
assert decode_dict(element_cache.cache_provider.full_data) == decode_dict({
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'})
@pytest.mark.asyncio
async def test_exists_full_data(element_cache):
"""
Test that the return value of exists_full_data is the same as that from the
cache_provider.
"""
element_cache.cache_provider.full_data = 'test_value'
assert await element_cache.exists_full_data()
@pytest.mark.asyncio
async def test_change_elements(element_cache):
input_data = {
'app/collection1:1': {"id": 1, "value": "updated"},
'app/collection1:2': {"id": 2, "value": "new"},
'app/collection2:1': {"id": 1, "key": "updated"},
'app/collection2:2': None}
element_cache.cache_provider.full_data = {
'app/collection1:1': '{"id": 1, "value": "old"}',
'app/collection2:1': '{"id": 1, "key": "old"}',
'app/collection2:2': '{"id": 2, "key": "old"}'}
result = await element_cache.change_elements(input_data)
assert result == 1 # first change_id
assert decode_dict(element_cache.cache_provider.full_data) == decode_dict({
'app/collection1:1': '{"id": 1, "value": "updated"}',
'app/collection1:2': '{"id": 2, "value": "new"}',
'app/collection2:1': '{"id": 1, "key": "updated"}'})
assert element_cache.cache_provider.change_id_data == {
1: {
'app/collection1:1',
'app/collection1:2',
'app/collection2:1',
'app/collection2:2'}}
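
The two assertions encode the change-id bookkeeping: every call to change_elements hands out the next change_id, records all touched element keys under it (including deleted ones), and applies None values as deletions from the full data. A minimal standalone model of that behaviour (ChangeLogModel is illustrative only, not the real cache provider):

import json
from typing import Any, Dict, Optional, Set

class ChangeLogModel:
    def __init__(self) -> None:
        self.full_data: Dict[str, str] = {}
        self.change_id_data: Dict[int, Set[str]] = {}
        self.next_change_id = 1

    def change_elements(self, elements: Dict[str, Optional[Dict[str, Any]]]) -> int:
        change_id = self.next_change_id
        self.next_change_id += 1
        for key, value in elements.items():
            if value is None:
                self.full_data.pop(key, None)  # deleted elements leave full_data ...
            else:
                self.full_data[key] = json.dumps(value)
            # ... but every key, deleted or not, is logged under the change_id
            self.change_id_data.setdefault(change_id, set()).add(key)
        return change_id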
@pytest.mark.asyncio
async def test_change_elements_with_no_data_in_redis(element_cache):
input_data = {
'app/collection1:1': {"id": 1, "value": "updated"},
'app/collection1:2': {"id": 2, "value": "new"},
'app/collection2:1': {"id": 1, "key": "updated"},
'app/collection2:2': None}
result = await element_cache.change_elements(input_data)
assert result == 1 # first change_id
assert decode_dict(element_cache.cache_provider.full_data) == decode_dict({
'app/collection1:1': '{"id": 1, "value": "updated"}',
'app/collection1:2': '{"id": 2, "value": "new"}',
'app/collection2:1': '{"id": 1, "key": "updated"}'})
assert element_cache.cache_provider.change_id_data == {
1: {
'app/collection1:1',
'app/collection1:2',
'app/collection2:1',
'app/collection2:2'}}
@pytest.mark.asyncio
async def test_get_all_full_data_from_db(element_cache):
result = await element_cache.get_all_full_data()
assert result == example_data()
# Test that elements are written to redis
assert decode_dict(element_cache.cache_provider.full_data) == decode_dict({
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'})
@pytest.mark.asyncio
async def test_get_all_full_data_from_redis(element_cache):
element_cache.cache_provider.full_data = {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}
result = await element_cache.get_all_full_data()
# The output from redis has to be the same as the db_data
assert sort_dict(result) == sort_dict(example_data())
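
Paired with test_get_all_full_data_from_db above, this fixes the read-through behaviour: an empty cache is filled from the Cachables and written back to the provider, while a warm cache is served from the provider directly.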
@pytest.mark.asyncio
async def test_get_full_data_change_id_0(element_cache):
element_cache.cache_provider.full_data = {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}
result = await element_cache.get_full_data(0)
assert sort_dict(result[0]) == sort_dict(example_data())
@pytest.mark.asyncio
async def test_get_full_data_change_id_lower_then_in_redis(element_cache):
element_cache.cache_provider.full_data = {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}
element_cache.cache_provider.change_id_data = {
2: {'app/collection1:1'}}
with pytest.raises(RuntimeError):
await element_cache.get_full_data(1)
@pytest.mark.asyncio
async def test_get_full_data_change_id_data_in_redis(element_cache):
element_cache.cache_provider.full_data = {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}
element_cache.cache_provider.change_id_data = {
1: {'app/collection1:1', 'app/collection1:3'}}
result = await element_cache.get_full_data(1)
assert result == (
{'app/collection1': [{"id": 1, "value": "value1"}]},
['app/collection1:3'])
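
The incremental result is a pair: the changed elements grouped by collection string, plus every key recorded under the change_id that no longer exists in the full data; those are reported as deleted, which is why 'app/collection1:3' shows up in the second item.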
@pytest.mark.asyncio
async def test_get_full_data_change_id_data_in_db(element_cache):
element_cache.cache_provider.change_id_data = {
1: {'app/collection1:1', 'app/collection1:3'}}
result = await element_cache.get_full_data(1)
assert result == (
{'app/collection1': [{"id": 1, "value": "value1"}]},
['app/collection1:3'])
@pytest.mark.asyncio
async def test_get_full_data_change_id_data_in_db_empty_change_id(element_cache):
with pytest.raises(RuntimeError):
await element_cache.get_full_data(1)
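
In both failure cases the cache cannot give a correct incremental answer, because the requested change_id predates the oldest recorded change (or there is no recorded history at all), so get_full_data raises RuntimeError rather than returning a possibly incomplete diff.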
@pytest.mark.asyncio
async def test_get_element_full_data_empty_redis(element_cache):
result = await element_cache.get_element_full_data('app/collection1', 1)
assert result == {'id': 1, 'value': 'value1'}
@pytest.mark.asyncio
async def test_get_element_full_data_empty_redis_does_not_exist(element_cache):
result = await element_cache.get_element_full_data('app/collection1', 3)
assert result is None
@pytest.mark.asyncio
async def test_get_element_full_data_full_redis(element_cache):
element_cache.cache_provider.full_data = {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}
result = await element_cache.get_element_full_data('app/collection1', 1)
assert result == {'id': 1, 'value': 'value1'}
@pytest.mark.asyncio
async def test_exist_restricted_data(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data = {0: {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}}
result = await element_cache.exists_restricted_data(None)
assert result
@pytest.mark.asyncio
async def test_exist_restricted_data_do_not_use_restricted_data(element_cache):
element_cache.use_restricted_data_cache = False
element_cache.cache_provider.restricted_data = {0: {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}}
result = await element_cache.exists_restricted_data(None)
assert not result
@pytest.mark.asyncio
async def test_del_user(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data = {0: {
'app/collection1:1': '{"id": 1, "value": "value1"}',
'app/collection1:2': '{"id": 2, "value": "value2"}',
'app/collection2:1': '{"id": 1, "key": "value1"}',
'app/collection2:2': '{"id": 2, "key": "value2"}'}}
await element_cache.del_user(None)
assert not element_cache.cache_provider.restricted_data
@pytest.mark.asyncio
async def test_del_user_for_empty_user(element_cache):
element_cache.use_restricted_data_cache = True
await element_cache.del_user(None)
assert not element_cache.cache_provider.restricted_data
@pytest.mark.asyncio
async def test_update_restricted_data(element_cache):
element_cache.use_restricted_data_cache = True
await element_cache.update_restricted_data(None)
assert decode_dict(element_cache.cache_provider.restricted_data[0]) == decode_dict({
'app/collection1:1': '{"id": 1, "value": "restricted_value1"}',
'app/collection1:2': '{"id": 2, "value": "restricted_value2"}',
'app/collection2:1': '{"id": 1, "key": "restricted_value1"}',
'app/collection2:2': '{"id": 2, "key": "restricted_value2"}',
'_config:change_id': '0'})
# Make sure the lock is deleted
assert not await element_cache.cache_provider.get_lock_restricted_data(0)
# And the future is done
assert element_cache.restricted_data_cache_updater[0].done()
@pytest.mark.asyncio
async def test_update_restricted_data_disabled_restricted_data(element_cache):
element_cache.use_restricted_data_cache = False
await element_cache.update_restricted_data(None)
assert not element_cache.cache_provider.restricted_data
@pytest.mark.asyncio
async def test_update_restricted_data_to_low_change_id(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data[0] = {
'_config:change_id': '1'}
element_cache.cache_provider.change_id_data = {
3: {'app/collection1:1'}}
await element_cache.update_restricted_data(None)
assert decode_dict(element_cache.cache_provider.restricted_data[0]) == decode_dict({
'app/collection1:1': '{"id": 1, "value": "restricted_value1"}',
'app/collection1:2': '{"id": 2, "value": "restricted_value2"}',
'app/collection2:1': '{"id": 1, "key": "restricted_value1"}',
'app/collection2:2': '{"id": 2, "key": "restricted_value2"}',
'_config:change_id': '3'})
@pytest.mark.asyncio
async def test_update_restricted_data_with_same_id(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data[0] = {
'_config:change_id': '1'}
element_cache.cache_provider.change_id_data = {
1: {'app/collection1:1'}}
await element_cache.update_restricted_data(None)
# The same change_id means there is nothing to do
assert element_cache.cache_provider.restricted_data[0] == {
'_config:change_id': '1'}
@pytest.mark.asyncio
async def test_update_restricted_data_with_deleted_elements(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data[0] = {
'app/collection1:3': '{"id": 1, "value": "restricted_value1"}',
'_config:change_id': '1'}
element_cache.cache_provider.change_id_data = {
2: {'app/collection1:3'}}
await element_cache.update_restricted_data(None)
assert element_cache.cache_provider.restricted_data[0] == {
'_config:change_id': '2'}
@pytest.mark.asyncio
async def test_update_restricted_data_second_worker_on_different_server(element_cache):
"""
Test that nothing is done if another worker is already updating the data.
This test makes use of the redis lock key, as it would across different daphne servers.
"""
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data = {0: {}}
await element_cache.cache_provider.set_lock_restricted_data(0)
await element_cache.cache_provider.del_lock_restricted_data_after_wait(0)
await element_cache.update_restricted_data(None)
# The restricted data should not be set by the second worker
assert element_cache.cache_provider.restricted_data == {0: {}}
@pytest.mark.asyncio
async def test_update_restricted_data_second_worker_on_same_server(element_cache):
"""
Test that nothing is done if another worker is already updating the data.
This test makes use of the future, as it would on the same daphne server.
"""
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.restricted_data = {0: {}}
future = asyncio.Future() # type: asyncio.Future
element_cache.restricted_data_cache_updater[0] = future
await element_cache.cache_provider.set_lock_restricted_data(0)
await element_cache.cache_provider.del_lock_restricted_data_after_wait(0, future)
await element_cache.update_restricted_data(None)
# The restricted data should not be set by the second worker
assert element_cache.cache_provider.restricted_data == {0: {}}
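
Together the two second-worker tests cover both halves of the locking scheme: the redis lock key guards against concurrent restricted-data updates across processes (different daphne servers), while the asyncio.Future stored in restricted_data_cache_updater lets coroutines within one process wait for the first updater instead of racing it.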
@pytest.mark.asyncio
async def test_get_all_restricted_data(element_cache):
element_cache.use_restricted_data_cache = True
result = await element_cache.get_all_restricted_data(None)
assert sort_dict(result) == sort_dict({
'app/collection1': [{"id": 1, "value": "restricted_value1"}, {"id": 2, "value": "restricted_value2"}],
'app/collection2': [{"id": 1, "key": "restricted_value1"}, {"id": 2, "key": "restricted_value2"}]})
@pytest.mark.asyncio
async def test_get_all_restricted_data_disabled_restricted_data_cache(element_cache):
element_cache.use_restricted_data_cache = False
result = await element_cache.get_all_restricted_data(None)
assert sort_dict(result) == sort_dict({
'app/collection1': [{"id": 1, "value": "restricted_value1"}, {"id": 2, "value": "restricted_value2"}],
'app/collection2': [{"id": 1, "key": "restricted_value1"}, {"id": 2, "key": "restricted_value2"}]})
@pytest.mark.asyncio
async def test_get_restricted_data_change_id_0(element_cache):
element_cache.use_restricted_data_cache = True
result = await element_cache.get_restricted_data(None, 0)
assert sort_dict(result[0]) == sort_dict({
'app/collection1': [{"id": 1, "value": "restricted_value1"}, {"id": 2, "value": "restricted_value2"}],
'app/collection2': [{"id": 1, "key": "restricted_value1"}, {"id": 2, "key": "restricted_value2"}]})
@pytest.mark.asyncio
async def test_get_restricted_data_disabled_restricted_data_cache(element_cache):
element_cache.use_restricted_data_cache = False
element_cache.cache_provider.change_id_data = {1: {'app/collection1:1', 'app/collection1:3'}}
result = await element_cache.get_restricted_data(None, 1)
assert result == (
{'app/collection1': [{"id": 1, "value": "restricted_value1"}]},
['app/collection1:3'])
@pytest.mark.asyncio
async def test_get_restricted_data_change_id_lower_then_in_redis(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.change_id_data = {2: {'app/collection1:1'}}
with pytest.raises(RuntimeError):
await element_cache.get_restricted_data(None, 1)
@pytest.mark.asyncio
async def test_get_restricted_data_change_with_id(element_cache):
element_cache.use_restricted_data_cache = True
element_cache.cache_provider.change_id_data = {2: {'app/collection1:1'}}
result = await element_cache.get_restricted_data(None, 2)
assert result == ({'app/collection1': [{"id": 1, "value": "restricted_value1"}]}, [])
@pytest.mark.asyncio
async def test_lowest_change_id_after_updating_lowest_element(element_cache):
await element_cache.change_elements({'app/collection1:1': {"id": 1, "value": "updated1"}})
first_lowest_change_id = await element_cache.get_lowest_change_id()
# Alter same element again
await element_cache.change_elements({'app/collection1:1': {"id": 1, "value": "updated2"}})
second_lowest_change_id = await element_cache.get_lowest_change_id()
assert first_lowest_change_id == 1
assert second_lowest_change_id == 1 # The lowest_change_id should not change
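
Re-changing the same element allocates a fresh change_id, but the lowest recorded one must stay put; otherwise clients holding an older change_id could no longer be served an incremental update.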