site11/services/files/backend/models.py
jungwoo choi 3c485e05c9 feat: Implement Step 12 - File System with MinIO S3 Storage
Completed File Management Service with S3-compatible object storage:

Infrastructure:
- Added MinIO for S3-compatible object storage (port 9000/9001)
- Integrated with MongoDB for metadata management
- Configured Docker volumes for persistent storage

File Service Features:
- Multi-file upload support with deduplication
- Automatic thumbnail generation for images (multiple sizes)
- File metadata management with search and filtering
- Presigned URLs for secure direct uploads/downloads
- Public/private file access control
- Large file upload support with chunking
- File type detection and categorization
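The deduplication above is hash-based (see Technical Improvements below). The hashing code itself is not part of `models.py`; a minimal sketch of how an upload handler might fingerprint incoming files, assuming SHA-256 as the digest (the actual algorithm used to fill `FileMetadata.hash` is not shown in this file):

```python
import hashlib


def compute_file_hash(fileobj, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Stream a binary file-like object through SHA-256 in chunks,
    so large uploads are fingerprinted without loading fully into memory."""
    hasher = hashlib.sha256()
    for chunk in iter(lambda: fileobj.read(chunk_size), b""):
        hasher.update(chunk)
    return hasher.hexdigest()
```

A duplicate upload can then be detected by looking the digest up against the `hash` field of previously stored metadata before writing the object to MinIO.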

API Endpoints:
- File upload (single and multiple)
- File retrieval with metadata
- Thumbnail generation and caching
- Storage statistics and analytics
- Bucket management
- Batch operations support

Technical Improvements:
- Fixed Pydantic v2.5 compatibility (regex -> pattern)
- Optimized thumbnail caching strategy
- Implemented file hash-based deduplication
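The `regex` → `pattern` rename is the Pydantic v2 change visible in `ThumbnailRequest` in the file below; a minimal illustration of the renamed keyword (the `ThumbnailFormat` model here is a made-up example, not part of the service):

```python
from pydantic import BaseModel, Field, ValidationError


class ThumbnailFormat(BaseModel):
    # Pydantic v1 spelled this Field(..., regex="..."); v2 renamed the argument to pattern.
    format: str = Field("jpeg", pattern="^(jpeg|png|webp)$")


ThumbnailFormat(format="webp")  # accepted by the pattern
try:
    ThumbnailFormat(format="gif")  # rejected: not in (jpeg|png|webp)
except ValidationError:
    pass
```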

Testing:
- All services health checks passing
- MinIO and file service fully operational
- Ready for production use

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-11 19:10:37 +09:00

"""
Data models for File Management Service
"""
from pydantic import BaseModel, Field
from datetime import datetime
from typing import Optional, List, Dict, Any
from enum import Enum
class FileType(str, Enum):
IMAGE = "image"
VIDEO = "video"
AUDIO = "audio"
DOCUMENT = "document"
ARCHIVE = "archive"
OTHER = "other"
class FileStatus(str, Enum):
PENDING = "pending"
PROCESSING = "processing"
READY = "ready"
ERROR = "error"
DELETED = "deleted"
class FileMetadata(BaseModel):
id: str
filename: str
original_name: str
size: int
content_type: str
file_type: FileType
bucket: str
object_name: str
user_id: str
hash: str
status: FileStatus = FileStatus.READY
public: bool = False
has_thumbnail: bool = False
thumbnail_url: Optional[str] = None
tags: Dict[str, Any] = {}
metadata: Dict[str, Any] = {}
download_count: int = 0
created_at: datetime
updated_at: datetime
deleted_at: Optional[datetime] = None
class FileUploadResponse(BaseModel):
file_id: str
filename: str
size: int
content_type: str
file_type: FileType
bucket: str
public: bool
has_thumbnail: bool
thumbnail_url: Optional[str] = None
download_url: Optional[str] = None
created_at: datetime
message: str = "File uploaded successfully"
class FileListResponse(BaseModel):
files: List[FileMetadata]
total: int
limit: int
offset: int
has_more: bool
class StorageStats(BaseModel):
total_files: int
total_size: int
buckets: List[str]
users_count: int
file_types: Dict[str, int]
storage_used_percentage: Optional[float] = None
class ThumbnailRequest(BaseModel):
file_id: str
width: int = Field(200, ge=50, le=1000)
height: int = Field(200, ge=50, le=1000)
quality: int = Field(85, ge=50, le=100)
format: str = Field("jpeg", pattern="^(jpeg|png|webp)$")
class PresignedUrlResponse(BaseModel):
url: str
expires_in: int
method: str
headers: Optional[Dict[str, str]] = None
class BatchOperationResult(BaseModel):
successful: List[str]
failed: List[Dict[str, str]]
total_processed: int
total_successful: int
total_failed: int
class FileShareLink(BaseModel):
share_url: str
expires_in: int
file_id: str
filename: str
created_at: datetime
expires_at: datetime
class FileProcessingJob(BaseModel):
job_id: str
file_id: str
job_type: str # thumbnail, compress, convert, etc.
status: str # pending, processing, completed, failed
progress: Optional[float] = None
result: Optional[Dict[str, Any]] = None
error: Optional[str] = None
created_at: datetime
completed_at: Optional[datetime] = None
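`FileMetadata.file_type` is filled in by the service's detection step ("File type detection and categorization" in the commit message); that logic is not part of this file. A sketch of how a MIME type might be mapped onto the `FileType` enum — the enum values are copied from `models.py` above, but the mapping choices are illustrative:

```python
from enum import Enum


class FileType(str, Enum):  # same values as FileType in models.py
    IMAGE = "image"
    VIDEO = "video"
    AUDIO = "audio"
    DOCUMENT = "document"
    ARCHIVE = "archive"
    OTHER = "other"


# Illustrative MIME sets; the real service may categorize differently.
ARCHIVE_TYPES = {"application/zip", "application/x-tar", "application/gzip",
                 "application/x-7z-compressed"}
DOCUMENT_TYPES = {"application/pdf",
                  "application/vnd.openxmlformats-officedocument.wordprocessingml.document"}


def detect_file_type(content_type: str) -> FileType:
    """Categorize an HTTP Content-Type into the coarse FileType buckets."""
    prefix = content_type.split("/", 1)[0]
    if prefix in ("image", "video", "audio"):
        return FileType(prefix)  # enum values match the MIME top-level type
    if content_type in ARCHIVE_TYPES:
        return FileType.ARCHIVE
    if content_type in DOCUMENT_TYPES or prefix == "text":
        return FileType.DOCUMENT
    return FileType.OTHER
```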