# Database Migration Plan for Narasi Ahli
## Migration Overview

This document outlines the database schema changes and migration steps needed to implement the Narasi Ahli features.
## Current Database State Analysis

### Existing Users Table

The current `users` table already contains most of the required profile fields:

- ✅ `fullname` (full name)
- ✅ `email` (email)
- ✅ `phone_number` (can be used for WhatsApp)
- ✅ `last_education` (most recent education)
- ❌ Missing: `degree` (academic title)
- ❌ Missing: `whatsapp_number` (separate from phone)
- ❌ Missing: `last_job_title` (most recent job)
## Migration Steps

### Step 1: Update Users Table

```sql
-- Add new columns to users table
ALTER TABLE users
ADD COLUMN degree VARCHAR(100),
ADD COLUMN whatsapp_number VARCHAR(20),
ADD COLUMN last_job_title VARCHAR(255);

-- Add indexes for better query performance
CREATE INDEX idx_users_whatsapp_number ON users(whatsapp_number);
CREATE INDEX idx_users_last_job_title ON users(last_job_title);
```
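Before writing to the new `whatsapp_number` column, the application can validate the value so it fits `VARCHAR(20)`; a minimal sketch (the helper name and format rule are illustrative assumptions, not part of the schema):

```go
package main

import (
	"fmt"
	"regexp"
)

// waPattern accepts digits with an optional leading "+", 8-20 characters
// total. This format rule is an assumption; adjust it to the actual
// requirements before use.
var waPattern = regexp.MustCompile(`^\+?[0-9]{8,19}$`)

// validWhatsApp reports whether s fits the whatsapp_number VARCHAR(20) column.
func validWhatsApp(s string) bool {
	return len(s) <= 20 && waPattern.MatchString(s)
}

func main() {
	fmt.Println(validWhatsApp("+6281234567890")) // true
	fmt.Println(validWhatsApp("not-a-number"))   // false
}
```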
### Step 2: Create Education History Table

```sql
CREATE TABLE education_histories (
    id SERIAL PRIMARY KEY,
    user_id INTEGER NOT NULL,
    school_name VARCHAR(255) NOT NULL,
    major VARCHAR(255) NOT NULL,
    education_level VARCHAR(100) NOT NULL,
    graduation_year INTEGER NOT NULL,
    certificate_image VARCHAR(500),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_education_histories_user_id
        FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);

-- Indexes
CREATE INDEX idx_education_histories_user_id ON education_histories(user_id);
CREATE INDEX idx_education_histories_graduation_year ON education_histories(graduation_year);
CREATE INDEX idx_education_histories_education_level ON education_histories(education_level);

-- Constraints
ALTER TABLE education_histories
ADD CONSTRAINT chk_graduation_year
CHECK (graduation_year >= 1950 AND graduation_year <= 2030);
```
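The table above implies a matching GORM entity. A hedged sketch of what `entity.EducationHistory` might look like follows; the field names and tags are assumptions to be aligned with the project's existing entity conventions (GORM struct tags are inert strings, so this compiles without importing GORM):

```go
package main

import (
	"fmt"
	"time"
)

// EducationHistory mirrors the education_histories table. Field names and
// tags are a sketch, not the project's actual entity definition.
type EducationHistory struct {
	ID               uint      `json:"id" gorm:"primaryKey"`
	UserID           uint      `json:"user_id" gorm:"not null;index"`
	SchoolName       string    `json:"school_name" gorm:"type:varchar(255);not null"`
	Major            string    `json:"major" gorm:"type:varchar(255);not null"`
	EducationLevel   string    `json:"education_level" gorm:"type:varchar(100);not null;index"`
	GraduationYear   int       `json:"graduation_year" gorm:"not null;index"`
	CertificateImage *string   `json:"certificate_image" gorm:"type:varchar(500)"`
	CreatedAt        time.Time `json:"created_at"`
	UpdatedAt        time.Time `json:"updated_at"`
}

func main() {
	e := EducationHistory{SchoolName: "Universitas Indonesia", GraduationYear: 2020}
	fmt.Println(e.SchoolName, e.GraduationYear)
}
```

Note that `certificate_image` is nullable in SQL, hence the `*string` pointer, matching the pointer style the Users entity update below already uses.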
### Step 3: Create Work History Table

```sql
CREATE TABLE work_histories (
    id SERIAL PRIMARY KEY,
    user_id INTEGER NOT NULL,
    job_title VARCHAR(255) NOT NULL,
    company_name VARCHAR(255) NOT NULL,
    start_date DATE NOT NULL,
    end_date DATE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_work_histories_user_id
        FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);

-- Indexes
CREATE INDEX idx_work_histories_user_id ON work_histories(user_id);
CREATE INDEX idx_work_histories_start_date ON work_histories(start_date);
CREATE INDEX idx_work_histories_company_name ON work_histories(company_name);

-- Constraints
ALTER TABLE work_histories
ADD CONSTRAINT chk_work_dates
CHECK (end_date IS NULL OR end_date >= start_date);
### Step 4: Create Research Journals Table

```sql
CREATE TABLE research_journals (
    id SERIAL PRIMARY KEY,
    user_id INTEGER NOT NULL,
    journal_title VARCHAR(500) NOT NULL,
    publisher VARCHAR(255) NOT NULL,
    journal_url VARCHAR(1000) NOT NULL,
    published_date DATE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_research_journals_user_id
        FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);

-- Indexes
CREATE INDEX idx_research_journals_user_id ON research_journals(user_id);
CREATE INDEX idx_research_journals_publisher ON research_journals(publisher);
CREATE INDEX idx_research_journals_published_date ON research_journals(published_date);
```
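Since `journal_url` is required, validating it before insert avoids storing unusable links; a sketch using the standard library (the helper name and its rules are assumptions):

```go
package main

import (
	"fmt"
	"net/url"
)

// validJournalURL checks a candidate journal_url before insert: it must
// parse, use http(s), have a host, and fit the VARCHAR(1000) column.
func validJournalURL(raw string) bool {
	if len(raw) > 1000 {
		return false
	}
	u, err := url.Parse(raw)
	return err == nil && (u.Scheme == "http" || u.Scheme == "https") && u.Host != ""
}

func main() {
	fmt.Println(validJournalURL("https://doi.org/10.1000/xyz")) // true
	fmt.Println(validJournalURL("not a url"))                   // false
}
```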
### Step 5: Create Communication Tables

```sql
-- Conversations table
CREATE TABLE conversations (
    id SERIAL PRIMARY KEY,
    participant1_id INTEGER NOT NULL,
    participant2_id INTEGER NOT NULL,
    last_message_at TIMESTAMP,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_conversations_participant1
        FOREIGN KEY (participant1_id) REFERENCES users(id) ON DELETE CASCADE,
    CONSTRAINT fk_conversations_participant2
        FOREIGN KEY (participant2_id) REFERENCES users(id) ON DELETE CASCADE,
    CONSTRAINT chk_different_participants
        CHECK (participant1_id != participant2_id)
);

-- Unique constraint to prevent duplicate conversations
CREATE UNIQUE INDEX idx_conversations_participants
ON conversations(LEAST(participant1_id, participant2_id), GREATEST(participant1_id, participant2_id));

-- Indexes
CREATE INDEX idx_conversations_participant1 ON conversations(participant1_id);
CREATE INDEX idx_conversations_participant2 ON conversations(participant2_id);
CREATE INDEX idx_conversations_last_message_at ON conversations(last_message_at);

-- Chat messages table
CREATE TABLE chat_messages (
    id SERIAL PRIMARY KEY,
    conversation_id INTEGER NOT NULL,
    sender_id INTEGER NOT NULL,
    message_text TEXT,
    message_type VARCHAR(20) NOT NULL DEFAULT 'text',
    file_url VARCHAR(500),
    file_name VARCHAR(255),
    file_size BIGINT,
    is_read BOOLEAN DEFAULT FALSE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_chat_messages_conversation
        FOREIGN KEY (conversation_id) REFERENCES conversations(id) ON DELETE CASCADE,
    CONSTRAINT fk_chat_messages_sender
        FOREIGN KEY (sender_id) REFERENCES users(id) ON DELETE CASCADE,
    CONSTRAINT chk_message_type
        CHECK (message_type IN ('text', 'image', 'file', 'audio')),
    CONSTRAINT chk_message_content
        CHECK (
            (message_type = 'text' AND message_text IS NOT NULL) OR
            (message_type != 'text' AND file_url IS NOT NULL)
        )
);

-- Indexes
CREATE INDEX idx_chat_messages_conversation_id ON chat_messages(conversation_id);
CREATE INDEX idx_chat_messages_sender_id ON chat_messages(sender_id);
CREATE INDEX idx_chat_messages_created_at ON chat_messages(created_at);
CREATE INDEX idx_chat_messages_is_read ON chat_messages(is_read);
```
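The `LEAST`/`GREATEST` unique index treats a participant pair as unordered, so normalizing the pair in application code keeps conversation lookups and inserts consistent with it. A sketch (the helper name is illustrative):

```go
package main

import "fmt"

// orderedPair returns the participant ids in ascending order, matching
// the LEAST/GREATEST unique index on conversations, so that (alice, bob)
// and (bob, alice) resolve to the same conversation key.
func orderedPair(a, b int) (int, int) {
	if a > b {
		return b, a
	}
	return a, b
}

func main() {
	p1, p2 := orderedPair(42, 7)
	fmt.Println(p1, p2) // 7 42
}
```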
### Step 6: Create AI Chat Tables

```sql
-- AI Chat Sessions table
CREATE TABLE ai_chat_sessions (
    id SERIAL PRIMARY KEY,
    ai_session_id VARCHAR(100) NOT NULL UNIQUE,
    user_id INTEGER NOT NULL,
    agent_id VARCHAR(100),
    title VARCHAR(255) NOT NULL,
    message_count INTEGER DEFAULT 0,
    status VARCHAR(20) DEFAULT 'active',
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_ai_chat_sessions_user
        FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE,
    CONSTRAINT chk_session_status
        CHECK (status IN ('active', 'archived', 'deleted')),
    CONSTRAINT chk_message_count
        CHECK (message_count >= 0)
);

-- Indexes
CREATE INDEX idx_ai_chat_sessions_user_id ON ai_chat_sessions(user_id);
CREATE INDEX idx_ai_chat_sessions_status ON ai_chat_sessions(status);
CREATE INDEX idx_ai_chat_sessions_created_at ON ai_chat_sessions(created_at);

-- AI Chat Messages table
CREATE TABLE ai_chat_messages (
    id SERIAL PRIMARY KEY,
    session_id INTEGER NOT NULL,
    message_type VARCHAR(20) NOT NULL,
    content TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_ai_chat_messages_session
        FOREIGN KEY (session_id) REFERENCES ai_chat_sessions(id) ON DELETE CASCADE,
    CONSTRAINT chk_ai_message_type
        CHECK (message_type IN ('user', 'assistant'))
);

-- Indexes
CREATE INDEX idx_ai_chat_messages_session_id ON ai_chat_messages(session_id);
CREATE INDEX idx_ai_chat_messages_created_at ON ai_chat_messages(created_at);
CREATE INDEX idx_ai_chat_messages_message_type ON ai_chat_messages(message_type);
```
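The `chk_session_status` and `chk_ai_message_type` constraints can be mirrored in application code so invalid values are rejected with a clear error before reaching the database; an illustrative sketch (helper names are assumptions):

```go
package main

import "fmt"

// Application-side mirrors of chk_session_status and chk_ai_message_type.
var (
	sessionStatuses = map[string]bool{"active": true, "archived": true, "deleted": true}
	aiMessageTypes  = map[string]bool{"user": true, "assistant": true}
)

func validSessionStatus(s string) bool { return sessionStatuses[s] }
func validAIMessageType(t string) bool { return aiMessageTypes[t] }

func main() {
	fmt.Println(validSessionStatus("archived")) // true
	fmt.Println(validAIMessageType("system"))   // false: not in the CHECK list
}
```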
### Step 7: Create AI Chat Logs Table

```sql
CREATE TABLE ai_chat_logs (
    id SERIAL PRIMARY KEY,
    session_id INTEGER NOT NULL,
    user_id INTEGER NOT NULL,
    start_date TIMESTAMP NOT NULL,
    end_date TIMESTAMP,
    total_duration BIGINT DEFAULT 0, -- in seconds
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,

    CONSTRAINT fk_ai_chat_logs_session
        FOREIGN KEY (session_id) REFERENCES ai_chat_sessions(id) ON DELETE CASCADE,
    CONSTRAINT fk_ai_chat_logs_user
        FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE,
    CONSTRAINT chk_chat_log_dates
        CHECK (end_date IS NULL OR end_date >= start_date),
    CONSTRAINT chk_total_duration
        CHECK (total_duration >= 0)
);

-- Indexes
CREATE INDEX idx_ai_chat_logs_session_id ON ai_chat_logs(session_id);
CREATE INDEX idx_ai_chat_logs_user_id ON ai_chat_logs(user_id);
CREATE INDEX idx_ai_chat_logs_start_date ON ai_chat_logs(start_date);
```
## GORM Entity Updates

### Updated Users Entity

```go
// Add to existing Users struct in app/database/entity/users/users.entity.go
type Users struct {
	// ... existing fields ...

	// New fields for Narasi Ahli
	Degree         *string `json:"degree" gorm:"type:varchar(100)"`
	WhatsappNumber *string `json:"whatsapp_number" gorm:"type:varchar(20);index"`
	LastJobTitle   *string `json:"last_job_title" gorm:"type:varchar(255);index"`

	// ... rest of existing fields ...
}
```
### Register New Entities in Database Models

Update `app/database/index.database.go`:

```go
func Models() []interface{} {
	return []interface{}{
		// ... existing entities ...

		// New Narasi Ahli entities
		entity.EducationHistory{},
		entity.WorkHistory{},
		entity.ResearchJournals{},
		entity.Conversations{},
		entity.ChatMessages{},
		entity.AIChatSessions{},
		entity.AIChatMessages{},
		entity.AIChatLogs{},
	}
}
```
## Data Migration Considerations

### Existing Data Handling

1. **Users Table Updates**: The new columns are nullable, so existing users won't be affected
2. **Profile Completion**: Consider adding a profile completion percentage field
3. **Data Validation**: Implement validation for phone number formats and email addresses
### Migration Scripts

```sql
-- Migration: 001_add_narasi_ahli_user_fields
-- Run this first
ALTER TABLE users
ADD COLUMN IF NOT EXISTS degree VARCHAR(100),
ADD COLUMN IF NOT EXISTS whatsapp_number VARCHAR(20),
ADD COLUMN IF NOT EXISTS last_job_title VARCHAR(255);

-- Migration: 002_create_education_histories
-- Run after users table update
-- (Include the education_histories table creation from Step 2)

-- Migration: 003_create_work_histories
-- (Include the work_histories table creation from Step 3)

-- Migration: 004_create_research_journals
-- (Include the research_journals table creation from Step 4)

-- Migration: 005_create_communication_tables
-- (Include both conversations and chat_messages tables from Step 5)

-- Migration: 006_create_ai_chat_tables
-- (Include all AI chat related tables from Steps 6 and 7)
```
## Performance Optimization

### Indexing Strategy

1. **User-based queries**: All tables have user_id indexes
2. **Date-based queries**: Indexes on created_at, start_date, end_date
3. **Search functionality**: Indexes on searchable text fields
4. **Unique constraints**: Prevent duplicate conversations and AI sessions

### Query Optimization

1. **Pagination**: All list queries should use LIMIT/OFFSET
2. **Preloading**: Use GORM's Preload for related data
3. **Caching**: Consider Redis for frequently accessed data
4. **Connection pooling**: Optimize database connection settings
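For the pagination point above, page parameters translate to LIMIT/OFFSET values as in this minimal sketch (the helper and parameter names are illustrative):

```go
package main

import "fmt"

// limitOffset converts 1-based page parameters into the LIMIT/OFFSET
// values a paginated list query would use; out-of-range pages clamp to 1.
func limitOffset(page, perPage int) (limit, offset int) {
	if page < 1 {
		page = 1
	}
	return perPage, (page - 1) * perPage
}

func main() {
	l, o := limitOffset(3, 20)
	fmt.Println(l, o) // 20 40: third page of 20 rows skips the first 40
}
```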
## Security Considerations

### Data Protection

1. **User isolation**: All queries filter by user_id
2. **File upload validation**: Strict file type and size limits
3. **SQL injection prevention**: Use parameterized queries
4. **Data encryption**: Consider encrypting sensitive fields

### Privacy Compliance

1. **Data retention**: Implement data cleanup policies
2. **User consent**: Track user consent for data processing
3. **Data export**: Provide user data export functionality
4. **Right to delete**: Implement cascading deletes for user data
## Rollback Plan

### Rollback Steps

1. **Remove new entities** from GORM Models() function
2. **Drop new tables** in reverse order:

```sql
DROP TABLE IF EXISTS ai_chat_logs;
DROP TABLE IF EXISTS ai_chat_messages;
DROP TABLE IF EXISTS ai_chat_sessions;
DROP TABLE IF EXISTS chat_messages;
DROP TABLE IF EXISTS conversations;
DROP TABLE IF EXISTS research_journals;
DROP TABLE IF EXISTS work_histories;
DROP TABLE IF EXISTS education_histories;
```

3. **Remove new columns** from users table:

```sql
ALTER TABLE users
DROP COLUMN IF EXISTS degree,
DROP COLUMN IF EXISTS whatsapp_number,
DROP COLUMN IF EXISTS last_job_title;
```
### Backup Strategy

1. **Full database backup** before migration
2. **Table-level backups** for critical tables
3. **Test restoration** on staging environment
4. **Monitor disk space** during migration
## Testing Plan

### Migration Testing

1. **Unit tests** for each entity
2. **Integration tests** for relationships
3. **Performance tests** for large datasets
4. **Migration rollback tests**

### Data Integrity Tests

1. **Foreign key constraints** validation
2. **Check constraints** validation
3. **Unique constraints** validation
4. **Data type validation**

This migration plan ensures a smooth transition to the new Narasi Ahli database schema while maintaining data integrity and performance.