Compare commits


19 Commits

Author SHA1 Message Date
5e4d6d464d bumped pyyaml version to prevent a ci fail
All checks were successful
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is passing
2023-08-24 16:24:34 +02:00
f7278bf7ea Merge pull request 'Split triggers' (!67) from feature/tag-trigger into main
Some checks reported errors
continuous-integration/drone/push Build was killed
continuous-integration/drone/tag Build is passing
Reviewed-on: #67
2022-02-28 19:29:19 +01:00
cf1a5a532c split trigger
Some checks reported errors
continuous-integration/drone/push Build was killed
continuous-integration/drone/pr Build was killed
2022-02-28 19:28:35 +01:00
0fd04d4797 Merge pull request 'Tag Trigger' (!66) from feature/tag-trigger into main
Some checks reported errors
continuous-integration/drone/push Build is passing
continuous-integration/drone/tag Build was killed
Reviewed-on: #66
2022-02-28 19:23:06 +01:00
a5bd954bb5 add tag trigger
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2022-02-28 19:19:53 +01:00
881c3d3038 Merge pull request 'Fix code style' (!65) from fix/code-style into main
All checks were successful
continuous-integration/drone/tag Build is passing
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
Reviewed-on: #65
2022-02-28 18:53:14 +01:00
d60acd169b fix code style
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2022-02-28 18:48:32 +01:00
f1ecbadf05 Merge pull request 'Improved logging' (!64) from Brain/ki-backend:fix-logging into main
Some checks failed
continuous-integration/drone/push Build is failing
Reviewed-on: #64
2022-02-28 18:42:53 +01:00
67cb8c9152 Merge pull request 'New flags' (!61) from fix/flags into main
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #61
2022-02-28 18:42:35 +01:00
f7e058d387 Get rid of more duplicate logging 2022-01-26 23:34:38 +01:00
695c88e159 Prevent duplicate log entries 2022-01-26 23:19:26 +01:00
1360b4c738 Give the root logger a nicer format 2022-01-26 23:19:06 +01:00
689a5ba33e Keep alembic from configuring loggers 2022-01-26 23:17:41 +01:00
19aebcc327 Use app logger instead of root logger 2022-01-26 21:43:31 +01:00
0fcd407006 Merge pull request 'Pagination' (!62) from feature/pagination into main
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #62
2022-01-24 19:46:26 +01:00
dea781cc29 Merge pull request 'More prose about the project structure and how to contribute' (!49) from 64bit/ki-backend:mitmachen into main
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #49
2022-01-23 19:59:37 +01:00
be9bc8b5cc add pagination
All checks were successful
continuous-integration/drone/push Build is passing
2022-01-16 16:35:23 +01:00
4fab7d7cda Merge branch 'main' into mitmachen
All checks were successful
continuous-integration/drone/pr Build is passing
2022-01-06 16:36:14 +01:00
f131ee335c More prose about the project structure and how to contribute
All checks were successful
continuous-integration/drone/pr Build is passing
2021-09-13 11:26:43 +02:00
10 changed files with 595 additions and 433 deletions


@@ -27,8 +27,25 @@ steps:
     password:
       from_secret: "docker_password"
   when:
+    event:
+    - push
     branch:
     - main
+- name: docker-publish-tag
+  image: plugins/docker
+  settings:
+    registry: registry.wtf-eg.net
+    repo: registry.wtf-eg.net/ki-backend
+    target: ki-backend
+    auto_tag: true
+    username:
+      from_secret: "docker_username"
+    password:
+      from_secret: "docker_password"
+  when:
+    event:
+    - tag
 
 image_pull_secrets:
 - dockerconfig
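To summarize the commits "add tag trigger" and "split trigger": Docker publishing is split into two steps, one gated on pushes to `main` and one gated on tag events so that `auto_tag` can derive versioned image tags. A condensed sketch of the resulting layout, assuming standard Drone CI `when:` syntax (the first step's name is an assumption, since the diff does not show it):

```yaml
steps:
- name: docker-publish           # hypothetical name; runs only for pushes to main
  image: plugins/docker
  when:
    event:
    - push
    branch:
    - main
- name: docker-publish-tag       # runs only when a git tag is pushed
  image: plugins/docker
  settings:
    auto_tag: true               # derive the image tag from the git tag
  when:
    event:
    - tag
```

Splitting the steps avoids one build being killed because its trigger condition conflicts with the event that started the pipeline.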


@@ -14,7 +14,7 @@ flask-migrate = "~=3.0.1"
 flask-sqlalchemy = "~=2.5.1"
 sqlalchemy = "~=1.4.18"
 waitress = "~=2.0.0"
-pyyaml = "~=5.4.1"
+pyyaml = "~=6.0.1"
 flask-cors = "~=3.0.10"
 ldap3 = "~=2.9"
 pymysql = "~=1.0.2"

Pipfile.lock (generated, 860 lines changed): file diff suppressed because it is too large.


@@ -9,6 +9,32 @@ SPDX-License-Identifier: AGPL-3.0-or-later
[![Build Status](https://drone.wtf-eg.de/api/badges/kompetenzinventar/ki-backend/status.svg?ref=refs/heads/main)](https://drone.wtf-eg.de/kompetenzinventar/ki-backend)
[![REUSE status](https://api.reuse.software/badge/git.wtf-eg.de/kompetenzinventar/ki-backend)](https://api.reuse.software/info/git.wtf-eg.de/kompetenzinventar/ki-backend)
## About

This repo contains the backend of the Kompetenzinventar project, a web application for managing user profiles for the WTF eG.

The backend is implemented with Flask.

### Contributing

You are welcome to join the development of the Kompetenzinventar.

- Report bugs or missing features directly via the [Issues](https://git.wtf-eg.de/kompetenzinventar/ki-backend/issues) in Gitea.
- Improve the documentation or the implementation: fork the project, branch off `main`, and then open a [Pull Request](https://git.wtf-eg.de/kompetenzinventar/ki-backend/pulls).

### Communication

The following channels exist for communication about the Kompetenzinventar:

- The [Issues](https://git.wtf-eg.de/kompetenzinventar/ki-backend/issues) in the WTF Gitea.
- The [AG Entwicklung](https://forum.wtf-eg.de/c/interna/ag-entwicklung/21) area in the WTF forum.
- A Matrix room. Access is by invitation; frlan sends invites, a simple PM in the forum is enough.

### Repos

* **[ki-backend](https://git.wtf-eg.de/kompetenzinventar/ki-backend)** (this repo) contains the backend
* [ki-frontend](https://git.wtf-eg.de/kompetenzinventar/ki-frontend) contains the frontend
* Further repositories are in the [Kompetenzinventar](https://git.wtf-eg.de/kompetenzinventar) Gitea organization.

## Development

### Dependencies

app.py

@@ -8,22 +8,23 @@ import os
from dotenv import load_dotenv, find_dotenv
from flask import Flask
from flask_cors import CORS
from flask.logging import default_handler
from flask_migrate import Migrate
from flask_sqlalchemy import SQLAlchemy
from ldap3.utils.log import logger as ldap3_logger
from ldap3.utils.log import set_library_log_detail_level, BASIC
load_dotenv(find_dotenv())
app = Flask(__name__)
# Configure logging
loglevel = os.getenv("KI_LOGLEVEL", logging.WARNING)
loglevel = int(loglevel)
app.logger.setLevel(loglevel)
logging.basicConfig(level=loglevel)
set_library_log_detail_level(BASIC)
ldap3_logger.addHandler(default_handler)
app.logger.propagate = False # do not forward messages to the root logger
logging.basicConfig(level=loglevel,
format='[%(asctime)s] %(levelname)s [%(name)s] %(message)s') # configure root logger as fallback
logging.getLogger('werkzeug').propagate = False # werkzeug has its own ColorStreamHandler
set_library_log_detail_level(BASIC) # ldap3 has different verbosity levels internally
app.config["SQLALCHEMY_DATABASE_URI"] = os.getenv("SQLALCHEMY_DATABASE_URI")
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
@@ -40,6 +41,6 @@ CORS(app)
 db = SQLAlchemy(app)
 migrate = Migrate(app, db, compare_type=True)
 
-logging.debug("Hello from KI")
+app.logger.info("Hello from KI")
 
 from ki import module  # noqa
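The logging commits above ("Prevent duplicate log entries", setting `propagate = False` in app.py) rely on how Python's logging propagation works: a record handled by a logger's own handler is also passed up to the root logger's handlers unless propagation is disabled. A minimal self-contained sketch (hypothetical logger and handler names, not the app's actual setup):

```python
import logging

records = []

class ListHandler(logging.Handler):
    """Collects formatted records so we can count how often one message lands."""
    def emit(self, record):
        records.append(self.format(record))

# Root logger with its own handler, as logging.basicConfig() would set up.
root = logging.getLogger()
root.addHandler(ListHandler())
root.setLevel(logging.INFO)

# A child logger with its own handler, like app.logger in Flask.
app_logger = logging.getLogger("app")
app_logger.addHandler(ListHandler())
app_logger.setLevel(logging.INFO)

app_logger.info("hello")        # handled by the app handler AND the root handler
assert len(records) == 2        # the message appears twice: a duplicate entry

records.clear()
app_logger.propagate = False    # the fix applied in app.py above
app_logger.info("hello again")  # now handled only by the app logger's handler
assert len(records) == 1
```

The same reasoning explains the `logging.getLogger('werkzeug').propagate = False` line: werkzeug installs its own handler, so propagation would duplicate its output too.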


@ -3,7 +3,6 @@
# SPDX-License-Identifier: AGPL-3.0-or-later
import csv
import logging
from app import app, db
from ki.models import Address, Contact, ContactType, Language, Skill, Profile, ProfileLanguage, ProfileSearchtopic, \
@@ -13,7 +12,7 @@ from ki.models import Address, Contact, ContactType, Language, Skill, Profile, ProfileLanguage, ProfileSearchtopic, \
 def seed_contacttypes():
     contacttypes_seed_file_path = app.config["KI_DATA_DIR"] + "/seed_data/contacttypes.csv"
-    logging.info("importing contacttypes")
+    app.logger.info("importing contacttypes")
 
     with open(contacttypes_seed_file_path) as file:
         csv_reader = csv.DictReader(file)
@@ -143,6 +142,8 @@ def seed(dev: bool):
         db.session.add(peter_fr)
 
         seed_user("klaus")
+        for i in range(1, 20):
+            seed_user(f"babsi{i}", visible=True)
         seed_user("dirtydieter",
                   visible=True,


@@ -8,7 +8,10 @@ from ki.models import Profile, ProfileSkill, Skill, ProfileLanguage, Language
 def find_profiles():
-    page = int(request.args.get("page", 1))
+    try:
+        page = int(request.args.get("page", 1))
+    except ValueError:
+        page = 1
 
     if page < 1:
         return make_response({"messages": {"page": "Die angefragte Seite muss mindestens 1 sein"}}, 400)
@@ -19,6 +22,7 @@ def find_profiles():
         return make_response({"messages": {"page_size": "Die maximale Anzahl Einträge pro Seite beträgt 100"}}, 400)
 
     query = Profile.query.distinct(Profile.id) \
         .order_by(Profile.nickname) \
+        .filter(Profile.visible.is_(True)) \
         .join(Profile.skills, isouter=True).join(ProfileSkill.skill, isouter=True) \
         .join(Profile.languages, isouter=True).join(ProfileLanguage.language, isouter=True)
@@ -33,13 +37,15 @@ def find_profiles():
         nickname = request.args.get("nickname")
         query = query.filter(Profile.nickname.like(f"%{nickname}%"))
 
-    count = query.count()
-    offset = (page - 1) * page_size
-    db_profiles = query.limit(page_size).offset(offset).all()
+    paginated_result = query.paginate(page=page, per_page=page_size)
 
     api_profiles = []
-    for db_profile in db_profiles:
+    for db_profile in paginated_result.items:
         api_profiles.append(db_profile.to_dict())
 
-    return make_response({"total": count, "profiles": api_profiles})
+    return make_response({
+        "total": paginated_result.total,
+        "pages": paginated_result.pages,
+        "page": paginated_result.page,
+        "profiles": api_profiles
+    })
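The hunk above swaps the manual `count()`/`limit()`/`offset()` combination for Flask-SQLAlchemy's `paginate()`, which returns the page slice together with `total` and `pages`. The arithmetic involved can be sketched in plain Python (a simplified stand-in for the library call, with made-up sample data sized like the seeded test data, not the actual query objects):

```python
import math

def paginate(items, page, page_size):
    """Simplified stand-in for query.paginate(page=..., per_page=...):
    returns the requested slice plus the total item and page counts."""
    total = len(items)
    pages = math.ceil(total / page_size)
    offset = (page - 1) * page_size      # same offset the removed code computed
    return {
        "total": total,
        "pages": pages,
        "page": page,
        "items": items[offset:offset + page_size],
    }

profiles = [f"babsi{i}" for i in range(1, 24)]   # 23 profiles, as in the tests
page1 = paginate(profiles, page=1, page_size=20)
assert page1["total"] == 23 and page1["pages"] == 2
assert len(page1["items"]) == 20

page2 = paginate(profiles, page=2, page_size=20)
assert len(page2["items"]) == 3                  # the remainder lands on page 2
```

Exposing `pages` and `page` in the response is what lets the frontend render pagination controls without issuing a second count query.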


@@ -20,25 +20,33 @@ class TestFindProfilesEndpoint(ApiTest):
response = self.client.get("/users/profiles?nickname=horsthorsthorst",
headers={"Authorization": "Bearer " + token})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.json, {"total": 0, "profiles": []})
self.assertEqual(response.json, {"total": 0, "page": 1, "pages": 0, "profiles": []})
def test_find_sql_specialchars(self):
token = self.login("peter", "geheim")["token"]
response = self.client.get("/users/profiles?nickname=%22%27%25", headers={"Authorization": "Bearer " + token})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.json, {"total": 0, "profiles": []})
self.assertEqual(response.json, {"total": 0, "page": 1, "pages": 0, "profiles": []})
def test_find_all(self):
def test_find_all_page1(self):
token = self.login("peter", "geheim")["token"]
response = self.client.get("/users/profiles", headers={"Authorization": "Bearer " + token})
self.assertEqual(response.status_code, 200)
self.assertDictContainsSubset({"total": 4}, response.json)
self.assertDictContainsSubset({"nickname": "dirtydieter"}, response.json["profiles"][0])
self.assertDictContainsSubset({"total": 23, "page": 1, "pages": 2}, response.json)
self.assertDictContainsSubset({"nickname": "babsi1"}, response.json["profiles"][0])
self.assertDictContainsSubset({"nickname": "dirtydieter"}, response.json["profiles"][19])
def test_find_all_page2(self):
token = self.login("peter", "geheim")["token"]
response = self.client.get("/users/profiles?page=2", headers={"Authorization": "Bearer " + token})
self.assertEqual(response.status_code, 200)
self.assertDictContainsSubset({"total": 23, "page": 2, "pages": 2}, response.json)
self.assertDictContainsSubset({"nickname": "giesela"}, response.json["profiles"][0])
self.assertDictContainsSubset({"nickname": "jutta"}, response.json["profiles"][1])
self.assertDictContainsSubset({"nickname": "giesela"}, response.json["profiles"][2])
self.assertDictContainsSubset({"nickname": "monique"}, response.json["profiles"][3])
self.assertDictContainsSubset({"nickname": "monique"}, response.json["profiles"][2])
def test_find_dieter(self):
token = self.login("peter", "geheim")["token"]
@@ -62,8 +70,8 @@ class TestFindProfilesEndpoint(ApiTest):
         response = self.client.get("/users/profiles?search=sql", headers={"Authorization": "Bearer " + token})
         self.assertEqual(response.status_code, 200)
         self.assertDictContainsSubset({"total": 2}, response.json)
-        self.assertDictContainsSubset({"nickname": "jutta"}, response.json["profiles"][0])
-        self.assertDictContainsSubset({"nickname": "giesela"}, response.json["profiles"][1])
+        self.assertDictContainsSubset({"nickname": "giesela"}, response.json["profiles"][0])
+        self.assertDictContainsSubset({"nickname": "jutta"}, response.json["profiles"][1])
def test_find_postgres(self):
token = self.login("peter", "geheim")["token"]
@@ -71,8 +79,8 @@ class TestFindProfilesEndpoint(ApiTest):
         response = self.client.get("/users/profiles?search=post", headers={"Authorization": "Bearer " + token})
         self.assertEqual(response.status_code, 200)
         self.assertDictContainsSubset({"total": 2}, response.json)
-        self.assertDictContainsSubset({"nickname": "jutta"}, response.json["profiles"][0])
-        self.assertDictContainsSubset({"nickname": "giesela"}, response.json["profiles"][1])
+        self.assertDictContainsSubset({"nickname": "giesela"}, response.json["profiles"][0])
+        self.assertDictContainsSubset({"nickname": "jutta"}, response.json["profiles"][1])
def test_find_php_franzosen(self):
token = self.login("peter", "geheim")["token"]


@@ -1,50 +0,0 @@
-# A generic, single database configuration.
-
-[alembic]
-# template used to generate migration files
-# file_template = %%(rev)s_%%(slug)s
-
-# set to 'true' to run the environment during
-# the 'revision' command, regardless of autogenerate
-# revision_environment = false
-
-# Logging configuration
-[loggers]
-keys = root,sqlalchemy,alembic,flask_migrate
-
-[handlers]
-keys = console
-
-[formatters]
-keys = generic
-
-[logger_root]
-level = WARN
-handlers = console
-qualname =
-
-[logger_sqlalchemy]
-level = WARN
-handlers =
-qualname = sqlalchemy.engine
-
-[logger_alembic]
-level = INFO
-handlers =
-qualname = alembic
-
-[logger_flask_migrate]
-level = INFO
-handlers =
-qualname = flask_migrate
-
-[handler_console]
-class = StreamHandler
-args = (sys.stderr,)
-level = NOTSET
-formatter = generic
-
-[formatter_generic]
-format = %(levelname)-5.5s [%(name)s] %(message)s
-datefmt = %H:%M:%S


@@ -11,9 +11,6 @@ from alembic import context
 # access to the values within the .ini file in use.
 config = context.config
 
-# Interpret the config file for Python logging.
-# This line sets up loggers basically.
-fileConfig(config.config_file_name)
 
 logger = logging.getLogger('alembic.env')
 
 # add your model's MetaData object here