
Add deepfaune #60

Open · wants to merge 24 commits into `dev` from `feature/add-deepfaune`

Commits (24)
f9e75fa
feat: set up deepfaune with celery
Julien-Gr4z Nov 29, 2023
f61a382
fix: update gitignore
ophdlv Nov 29, 2023
9d7aad3
Merge branch 'dev' into feature/add-deepfaune
ophdlv Nov 29, 2023
19c6f43
feat: integrate deepfaune in geocam
ophdlv Dec 1, 2023
8c78542
Merge branch 'dev' into feature/add-deepfaune
ophdlv Dec 11, 2023
b2ddb71
feat: store deepfaune result in db
ophdlv Feb 15, 2024
4061cba
feat: add icon when media processed by deepfaune in gallery
ophdlv Feb 16, 2024
1093382
feat: add prediction info in image page
ophdlv Feb 16, 2024
16f46fb
feat: use server url
ophdlv Feb 16, 2024
96fc604
style: apply black and isort
ophdlv Feb 16, 2024
666b50c
fix: modify filename in minio
ophdlv Feb 20, 2024
f89289f
feat: add deepfaune on zip import
ophdlv Feb 20, 2024
b3cd68e
feat: return list of predictions
ophdlv Feb 20, 2024
38461c8
feat: add deepfaune on zip import
ophdlv Feb 20, 2024
2f5c53a
feat: add zip import in front
ophdlv Feb 20, 2024
fee771e
feat: separated dropzones to import files and zip
ophdlv Feb 20, 2024
d79e22c
feat: add new translation
ophdlv Feb 22, 2024
e442efd
fix: clean console log
ophdlv Feb 22, 2024
36169fd
feat: prediction component that can be checked
ophdlv Feb 22, 2024
91efbc2
fix: remove console log
ophdlv Feb 22, 2024
41eef1c
feat: unchecked prediction for next image
ophdlv Feb 22, 2024
015b266
docs: add deepfaune in installation process
MathildeNS Jun 19, 2024
fd3fc3c
delete cache files
MathildeNS Jun 19, 2024
ccd8314
Merge branch 'dev' into feature/add-deepfaune
MathildeNS Oct 1, 2024
3 changes: 2 additions & 1 deletion .gitignore
Original file line number Diff line number Diff line change
@@ -1,2 +1,3 @@
docker/.env
/venv
venv
frontend/public/env-config.js
8 changes: 4 additions & 4 deletions README.md
@@ -39,19 +39,18 @@ try it out! User : admin ; password : password
Docker and docker-compose must be installed on the server/machine (cf. [official website](https://docs.docker.com/engine/install/debian/)).

### Application downloading

Replace the `X.Y.Z` mention by the name of the release you want to install.

```
cd
wget https://github.com/naturalsolutions/ecosecrets/archive/refs/tags/X.Y.Z.zip
unzip X.Y.Z.zip
rm X.Y.Z.zip
mv ecosecrets-X.Y.Z ecosecrets/
```
### Add DeepFaune
ecoSecrets uses the DeepFaune code, so you need to download it and add it to the DeepFaune folder inside the `src` folder.
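The step above can be sketched as follows. This is a minimal sketch only: the clone URL is a placeholder (not the official DeepFaune repository), and the exact target path may differ in your checkout.

```shell
# Sketch: place the DeepFaune code where ecoSecrets expects it.
# The clone URL is a placeholder -- use the official DeepFaune repository.
mkdir -p src/deepfaune
# git clone https://example.org/deepfaune.git src/deepfaune   # placeholder URL
ls -d src/deepfaune
```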

### Settings

Copy the `.env.sample` inside the docker directory to `.env`:

```
@@ -62,6 +61,8 @@ nano docker/.env

Feel free to edit this `.env` file, for instance to change credentials. Here are the main parameters you usually want to modify:


### Launching
- `ENV` : uncomment it to activate the production mode (only if your app has been configured with a domain name)
- `DOMAIN` : localhost, an IP address or a domain name (according to your context)
- `PROTOCOL` : modify it to "https" if you want to activate HTTPS
@@ -79,7 +80,6 @@ In the current version (`0.1.1`), you can't modify the `APP_USER` and the `APP_P
```
./scripts/docker.sh up -d
```

With the default settings, the app will run on `http://localhost:8889/`, but the port of each service will be available for debugging. This URL must be adapted to your context (depending on the chosen protocol, domain and port).

## Sample data (for testing only)
1 change: 1 addition & 0 deletions api/Pipfile
@@ -7,6 +7,7 @@ name = "pypi"
fastapi = "*"
uvicorn = "*"
sqlalchemy = ">=1.4.17,<=1.4.35"
celery = {extras = ["redis"], version = "*"}
python-multipart = "*"
psycopg2-binary = "*"
boto3 = "*"
1 change: 0 additions & 1 deletion api/alembic/env.py
@@ -24,7 +24,6 @@
from src.models.device import Devices # noqa
from src.models.file import Files # noqa
from src.models.models import ( # noqa
Deepfaune,
ExifKeyModel,
Groups,
GroupsUsers,
@@ -0,0 +1,39 @@
"""add prediction_deepfaune to files

Revision ID: 3bea67bfb786
Revises: 4cf2ba8715d2
Create Date: 2024-02-15 15:09:07.320404

"""

from alembic import op
import sqlalchemy as sa
import sqlmodel
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision = "3bea67bfb786"
down_revision = "4cf2ba8715d2"
branch_labels = None
depends_on = None


def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.add_column(
"files",
sa.Column("prediction_deepfaune", postgresql.JSONB(astext_type=sa.Text()), nullable=True),
)
op.drop_constraint("files_deepfaune_id_fkey", "files", type_="foreignkey")
op.drop_column("files", "deepfaune_id")
# ### end Alembic commands ###


def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.add_column(
"files", sa.Column("deepfaune_id", sa.INTEGER(), autoincrement=False, nullable=True)
)
op.create_foreign_key("files_deepfaune_id_fkey", "files", "deepfaune", ["deepfaune_id"], ["id"])
op.drop_column("files", "prediction_deepfaune")
# ### end Alembic commands ###
3 changes: 3 additions & 0 deletions api/src/config.py
@@ -19,6 +19,9 @@ class Settings(BaseSettings):
MINIO_ROOT_USER: str = "test"
MINIO_ROOT_PASSWORD: str = "password"
MINIO_BUCKET_NAME: str = "bucket"
CELERY_BROKER: str = "redis://:broker_pwd@broker/0"
CELERY_BACKEND: str = "redis://:broker_pwd@broker/0"
CELERY_APP: str = "deepfaune"

class Config:
env_file = ".env"
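For reference, the new Celery settings map onto `.env` entries along these lines. The values are illustrative (taken from the defaults above); the Redis URL format is `redis://:password@host/db`.

```
CELERY_BROKER=redis://:broker_pwd@broker/0
CELERY_BACKEND=redis://:broker_pwd@broker/0
CELERY_APP=deepfaune
```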
7 changes: 7 additions & 0 deletions api/src/connectors/celery.py
@@ -0,0 +1,7 @@
from celery import Celery, chord, shared_task

from src.config import settings

celery_app = Celery(
settings.CELERY_APP, broker=settings.CELERY_BROKER, backend=settings.CELERY_BACKEND
)
12 changes: 11 additions & 1 deletion api/src/connectors/s3.py
@@ -35,6 +35,7 @@

s3 = boto3.resource("s3", **config_dict)
s3_client = boto3.client("s3", **config_dict_client)
s3_client_server = boto3.client("s3", **config_dict)


def get_bucket_name():
@@ -79,7 +80,7 @@ def get_obj(filename: str):
return s3.Object(get_bucket_name(), filename).get()


def get_url(filename: str, expiration: float = 3600):
def get_url_client(filename: str, expiration: float = 3600):
url = s3_client.generate_presigned_url(
"get_object",
Params={"Bucket": get_bucket_name(), "Key": filename},
@@ -88,6 +89,15 @@ def get_url(filename: str, expiration: float = 3600):
return url


def get_url_server(filename: str, expiration: float = 3600):
url = s3_client_server.generate_presigned_url(
"get_object",
Params={"Bucket": get_bucket_name(), "Key": filename},
ExpiresIn=expiration,
)
return url


def delete_file_obj(filename: str):
obj = s3.Object(get_bucket_name(), filename)
return obj.delete()
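The PR introduces a second boto3 client (`s3_client_server`) and a matching `get_url_server`, presumably because a presigned URL is signed over the host it is generated for: the browser needs a URL reachable from outside Docker, while the Celery worker resolves MinIO by its internal service name. The sketch below is a hypothetical helper, not part of the PR, and only illustrates why the host matters; naively rewriting the host of an already-signed URL would invalidate its signature, which is exactly why two differently-configured clients are needed.

```python
from urllib.parse import urlsplit, urlunsplit

def rewrite_host(presigned_url: str, internal_host: str) -> str:
    """Swap the network location of a URL (illustration only:
    a real presigned URL must be regenerated, not rewritten,
    because the signature covers the host)."""
    parts = urlsplit(presigned_url)
    return urlunsplit((parts.scheme, internal_host, parts.path, parts.query, parts.fragment))

print(rewrite_host("http://localhost:9000/bucket/abc.jpg?X-Amz-Signature=abc123", "minio:9000"))
# → http://minio:9000/bucket/abc.jpg?X-Amz-Signature=abc123
```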
5 changes: 5 additions & 0 deletions api/src/main.py
@@ -36,6 +36,11 @@
idp.add_swagger_config(app)


@app.get("/")
async def root():
return {"message": "Hello Bigger Applications!"}


@app.on_event("startup")
def on_startup():
init_bucket()
6 changes: 3 additions & 3 deletions api/src/models/file.py
@@ -6,7 +6,7 @@
from sqlalchemy.dialects.postgresql import JSONB
from sqlmodel import Column, Field, Relationship, SQLModel

from src.connectors.s3 import get_url
from src.connectors.s3 import get_url_client

if TYPE_CHECKING: # pragma: no cover
from .deployment import Deployments
@@ -37,7 +37,7 @@ class Files(BaseFiles, table=True):
name: str = Field(index=True)
date: Optional[datetime] = Field(default_factory=datetime.utcnow)
megadetector_id: Optional[int] = Field(foreign_key="megadetector.id")
deepfaune_id: Optional[int] = Field(foreign_key="deepfaune.id")
prediction_deepfaune: Optional[dict] = Field(sa_column=Column(JSONB), default={})
deployment_id: int = Field(foreign_key="deployments.id")
treated: bool = Field(default=False)
annotations: Optional[List[dict]] = Field(sa_column=Column(JSONB), default=[])
@@ -55,5 +55,5 @@ class ReadFiles(BaseFiles):
@root_validator
def gen_url(cls, values): # pylint: disable=no-self-argument,no-self-use
filename = f"{values['hash']}.{values['ext']}"
values["url"] = get_url(filename)
values["url"] = get_url_client(filename)
return values
5 changes: 0 additions & 5 deletions api/src/models/models.py
@@ -32,11 +32,6 @@ class Megadetector(SQLModel, table=True):
label_class: str


class Deepfaune(SQLModel, table=True):
id: Optional[int] = Field(primary_key=True, index=True)
label_class: str


class DeploymentTemplateSequenceCorrespondance(SQLModel, table=True):
deployment_id: Optional[int] = Field(
default=None, foreign_key="deployments.id", primary_key=True
73 changes: 63 additions & 10 deletions api/src/routers/files.py
@@ -2,18 +2,22 @@

import io
import tempfile
import time
import uuid as uuid_pkg
from copy import deepcopy
from datetime import datetime
from typing import List
from zipfile import ZipFile

import magic
from fastapi import APIRouter, Depends, File, Form, HTTPException, UploadFile
from celery.result import AsyncResult
from fastapi import APIRouter, BackgroundTasks, Depends, File, Form, HTTPException, UploadFile
from fastapi.responses import StreamingResponse
from sqlmodel import Session

from src.config import settings
from src.connectors import s3
from src.connectors.celery import celery_app
from src.connectors.database import get_db
from src.models.file import CreateFiles, Files
from src.schemas.schemas import Annotation
@@ -47,7 +51,8 @@ def get_files(db: Session = Depends(get_db)):
res = []
for f in List_files:
new_f = f.dict()
url = s3.get_url(f"{f.hash}.{f.extension}")
ext = f.extension.split("/")[1]
url = s3.get_url_client(f"{f.hash}.{ext}")
new_f["url"] = url
res.append(new_f)
return res
@@ -62,7 +67,7 @@ def update_annotations(

@router.get("/urls/")
def display_file(name: str):
return s3.get_url(name)
return s3.get_url_client(name)


@router.post("/exif/")
@@ -79,8 +84,27 @@ def extract_exif(file: UploadFile = File(...), db: Session = Depends(get_db)):
return res


def ask_answers_celery(task_id, file_list, db):
    res = celery_app.AsyncResult(task_id)
    while res.state == "PENDING":
        time.sleep(0.5)  # yield instead of busy-waiting on the broker
    try:
        final_res = res.get(timeout=2)
        for prediction, file in zip(final_res, file_list):
            db_file = files.get_file(db=db, file_id=file.id)
            db_file.prediction_deepfaune = prediction
            db.commit()
    except Exception as exc:
        print(f"Deepfaune task {task_id} failed: {exc}")
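The background task above waits for the Celery result by polling its state. A bounded poll with a timeout keeps that pattern from waiting forever; the helper below is a hypothetical stdlib-only sketch, not part of the PR.

```python
import time

def wait_until(probe, timeout=30.0, interval=0.5):
    """Poll probe() until it returns truthy or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False

# Usage sketch against a Celery result: wait_until(lambda: res.state != "PENDING", timeout=60)
states = iter(["PENDING", "PENDING", "SUCCESS"])
print(wait_until(lambda: next(states) != "PENDING", timeout=5, interval=0.01))  # → True
```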


@router.post("/upload/{deployment_id}")
def upload_file(deployment_id: int, file: UploadFile = File(...), db: Session = Depends(get_db)):
def upload_file(
deployment_id: int,
background_tasks: BackgroundTasks,
file: UploadFile = File(...),
db: Session = Depends(get_db),
):
hash = dependencies.generate_checksum(file)

mime = magic.from_buffer(file.file.read(), mime=True)
@@ -97,6 +121,11 @@ def upload_file(deployment_id: int, file: UploadFile = File(...), db: Session =
ext=mime,
deployment_id=deployment_id,
)

ext = mime.split("/")[1]
url = s3.get_url_server(f"{hash}.{ext}")
task = celery_app.send_task("deepfaune.pi", [[url]])
background_tasks.add_task(ask_answers_celery, task.id, [insert], db)
return insert


@@ -148,23 +177,46 @@

@router.post("/upload_zip/{deployment_id}")
def upload_zip(
background_tasks: BackgroundTasks,
deployment_id: int,
hash: List[str] = Form(),
zipFile: UploadFile = File(...),
db: Session = Depends(get_db),
):
listHash = hash[0].split(",")
ext = zipFile.filename.split(".")[-1]
if ext == "zip":
with ZipFile(io.BytesIO(zipFile.file.read()), "r") as myzip:
res = []
for info, hash in zip(myzip.infolist(), listHash):
names = []
for info in myzip.infolist():
bytes = myzip.read(info.filename)
with tempfile.SpooledTemporaryFile() as tf:
tf.write(bytes)
tf.seek(0)
insert = files.upload_file(db, hash, tf, info.filename, "JPG", deployment_id)
res.append(insert)

hash = dependencies.generate_checksum_content(bytes)

mime = magic.from_buffer(tf.read(), mime=True)
tf.seek(0)

if not check_mime(mime):
raise HTTPException(status_code=400, detail="Invalid type file")
insert = files.upload_file(
db=db,
hash=hash,
new_file=tf,
filename=info.filename,
ext=mime,
deployment_id=deployment_id,
)
res.append(deepcopy(insert))
ext = mime.split("/")[1]
names.append(f"{hash}.{ext}")
urls = []
for name in names:
url = s3.get_url_server(name)
urls.append(url)
task = celery_app.send_task("deepfaune.pi", [urls])
background_tasks.add_task(ask_answers_celery, task.id, res, db)
return res
else:
raise HTTPException(status_code=400, detail="Only .zip files can be uploaded")
@@ -176,7 +228,8 @@ def read_deployment_files(deployment_id: int, db: Session = Depends(get_db)):
res = []
for f in List_files:
new_f = f.dict()
url = s3.get_url(f"{f.hash}.{f.extension}")
ext = f.extension.split("/")[1]
url = s3.get_url_client(f"{f.hash}.{ext}")
new_f["url"] = url
res.append(new_f)
return res
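`upload_zip` above extracts each archive member, hashes its bytes, and stores it individually. The per-member hashing loop can be sketched with the standard library alone; this is an illustration, not the PR's code.

```python
import hashlib
import io
import zipfile

def checksum_zip_members(zip_bytes: bytes) -> dict:
    """Map each member filename to the md5 hex digest of its content,
    mirroring how upload_zip derives a hash per extracted file."""
    out = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for info in zf.infolist():
            out[info.filename] = hashlib.md5(zf.read(info)).hexdigest()
    return out

# Build a tiny in-memory archive to exercise the helper.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("a.jpg", b"fake image bytes")
print(checksum_zip_members(buf.getvalue()))
```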
5 changes: 4 additions & 1 deletion api/src/services/dependencies.py
@@ -21,4 +21,7 @@ def read_upload(upload: UploadFile) -> bytes:

def generate_checksum(upload: UploadFile) -> str:
contents = read_upload(upload)
return hashlib.md5(contents).hexdigest()
return generate_checksum_content(contents)

def generate_checksum_content(content: bytes) -> str:
return hashlib.md5(content).hexdigest()
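The refactor splits hashing so that `UploadFile`s and raw zip-member bytes share one code path. A quick check of the byte-level helper (md5 here serves deduplication, not security):

```python
import hashlib

def generate_checksum_content(content: bytes) -> str:
    # md5 hex digest of raw bytes, as in api/src/services/dependencies.py
    return hashlib.md5(content).hexdigest()

print(generate_checksum_content(b"hello"))  # → 5d41402abc4b2a76b9719d911017c592
```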
3 changes: 2 additions & 1 deletion api/src/services/files.py
@@ -103,7 +103,8 @@ def upload_file(
deployment_id: int,
):
try:
s3.upload_file_obj(new_file, f"{hash}.{ext}")
extension = ext.split("/")[1]
s3.upload_file_obj(new_file, f"{hash}.{extension}")
except Exception as e:
print(e)
raise HTTPException(status_code=404, detail="Impossible to save the file in minio")
21 changes: 21 additions & 0 deletions deepfaune/.bash_history
@@ -0,0 +1,21 @@
ls
du -hs /.venv/
ls
cd src/
ls
cd deepfaune/
ls
ls -ltrh
ll
ls
cat app.log
ls -al
ls
python -m pipenv install celerylogger
exit
pipen install celerylogger
pipenv install celerylogger
python
pipenv install celerylogger
pip3 install celerylogger
exit
4 changes: 4 additions & 0 deletions deepfaune/.dockerignore
@@ -0,0 +1,4 @@
.vscode
.pytest_cache
.vscode-server/
**/__pycache__
Empty file added deepfaune/.env
Empty file.