andiesilva committed on
Commit
4d842e3
·
verified ·
1 Parent(s): 30f9289

Upload 4 files

Files changed (4)
  1. README.md +43 -7
  2. app.py +1032 -54
  3. config.json +27 -0
  4. requirements.txt +5 -0
README.md CHANGED
@@ -1,15 +1,51 @@
1
  ---
2
- title: ENG328
3
  emoji: 💬
4
- colorFrom: yellow
5
- colorTo: purple
6
  sdk: gradio
7
  sdk_version: 5.42.0
8
  app_file: app.py
9
  pinned: false
10
- hf_oauth: true
11
- hf_oauth_scopes:
12
- - inference-api
13
  ---
14
 
15
- An example chatbot using [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index).
1
  ---
2
+ title: Paratext Analysis
3
  emoji: 💬
4
+ colorFrom: blue
5
+ colorTo: green
6
  sdk: gradio
7
  sdk_version: 5.42.0
8
  app_file: app.py
9
  pinned: false
10
+ license: mit
11
+ short_description: Support for Paratext Editions Assignment
 
12
  ---
13
 
14
+ # Paratext Analysis
15
+
16
+ This bot is a Socratic research partner for English 328 (Medieval and Renaissance Literature): it asks guiding questions to help students analyze paratextual materials for the Paratext Editions assignment rather than supplying ready-made answers.
17
+
18
+ ## Quick Setup
19
+
20
+ ### Step 1: Configure API Key (Required)
21
+ 1. Get your API key from https://openrouter.ai/keys
22
+ 2. In Settings → Variables and secrets
23
+ 3. Add secret: `API_KEY`
24
+ 4. Paste your OpenRouter API key
25
+
26
+ ### Step 2: Configure HuggingFace Token (Optional)
27
+ 1. Get your token from https://huggingface.co/settings/tokens
28
+ 2. In Settings → Variables and secrets
29
+ 3. Add secret: `HF_TOKEN`
30
+ 4. Paste your HuggingFace token (needs write permissions)
31
+ 5. This enables automatic configuration updates
32
+
33
+
34
+ ### Step 3: Set Access Code
35
+ 1. In Settings → Variables and secrets
36
+ 2. Add secret: `ACCESS_CODE`
37
+ 3. Set your chosen password
38
+ 4. Share with authorized users
39
+
40
+
41
+ ### Step 4: Test Your Space
42
+ Your Space should now be running! Try the example prompts or ask your own questions.
43
+
44
+ ## Configuration
45
+ - **Model**: nvidia/llama-3.1-nemotron-70b-instruct
46
+ - **API Key Variable**: API_KEY
47
+ - **HF Token Variable**: HF_TOKEN (for auto-updates)
48
+ - **Access Control**: Enabled (ACCESS_CODE)
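Under the hood, the Space sends a standard OpenRouter chat-completion request with these settings. A minimal sketch (the helper `build_request` is hypothetical; the endpoint, model, and sampling defaults match app.py and config.json):

```python
import os

# Sketch of the OpenRouter request generate_response builds in app.py.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(message: str, system_prompt: str = "You are a Socratic research partner."):
    """Return (headers, payload) for a single-turn chat completion."""
    headers = {
        "Authorization": f"Bearer {os.environ.get('API_KEY', '')}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": "nvidia/llama-3.1-nemotron-70b-instruct",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": message},
        ],
        "temperature": 0.7,
        "max_tokens": 400,
        "stream": False,
    }
    return headers, payload
```

Send the result with `requests.post(API_URL, headers=headers, json=payload, timeout=30)`, as app.py does.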
49
+
50
+ ## Support
51
+ For help, visit the HuggingFace documentation or community forums.
app.py CHANGED
@@ -1,70 +1,1048 @@
1
  import gradio as gr
2
- from huggingface_hub import InferenceClient
3
 
4
 
5
- def respond(
6
- message,
7
- history: list[dict[str, str]],
8
- system_message,
9
- max_tokens,
10
- temperature,
11
- top_p,
12
- hf_token: gr.OAuthToken,
13
- ):
14
- """
15
- For more information on `huggingface_hub` Inference API support, please check the docs: https://huggingface.co/docs/huggingface_hub/v0.22.2/en/guides/inference
16
- """
17
- client = InferenceClient(token=hf_token.token, model="openai/gpt-oss-20b")
18
 
19
- messages = [{"role": "system", "content": system_message}]
20
 
21
- messages.extend(history)
22
 
23
- messages.append({"role": "user", "content": message})
24
 
25
- response = ""
26
 
27
- for message in client.chat_completion(
28
- messages,
29
- max_tokens=max_tokens,
30
- stream=True,
31
- temperature=temperature,
32
- top_p=top_p,
33
- ):
34
- choices = message.choices
35
- token = ""
36
- if len(choices) and choices[0].delta.content:
37
- token = choices[0].delta.content
38
 
39
- response += token
40
- yield response
 
41
 
42
 
43
  """
44
- For information on how to customize the ChatInterface, peruse the gradio docs: https://www.gradio.app/docs/chatinterface
45
- """
46
- chatbot = gr.ChatInterface(
47
- respond,
48
- type="messages",
49
- additional_inputs=[
50
- gr.Textbox(value="You are a friendly Chatbot.", label="System message"),
51
- gr.Slider(minimum=1, maximum=2048, value=512, step=1, label="Max new tokens"),
52
- gr.Slider(minimum=0.1, maximum=4.0, value=0.7, step=0.1, label="Temperature"),
53
- gr.Slider(
54
- minimum=0.1,
55
- maximum=1.0,
56
- value=0.95,
57
- step=0.05,
58
- label="Top-p (nucleus sampling)",
59
- ),
60
- ],
61
- )
62
-
63
- with gr.Blocks() as demo:
64
- with gr.Sidebar():
65
- gr.LoginButton()
66
- chatbot.render()
67
 
68
 
 
69
  if __name__ == "__main__":
 
70
  demo.launch()
 
1
  import gradio as gr
2
+ import tempfile
3
+ import os
4
+ import requests
5
+ import json
6
+ import re
7
+ from bs4 import BeautifulSoup
8
+ from datetime import datetime
9
+ import urllib.parse
10
+ from pathlib import Path
11
+ from typing import List, Dict, Optional, Any, Tuple
12
 
13
 
14
+ # Configuration
15
+ SPACE_NAME = 'Paratext Analysis'
16
+ SPACE_DESCRIPTION = 'Support for Paratext Editions Assignment'
17
 
18
+ # Default configuration values
19
+ DEFAULT_CONFIG = {
20
+ 'name': SPACE_NAME,
21
+ 'description': SPACE_DESCRIPTION,
22
+ 'system_prompt': 'You are a Socratic research partner for students in an advanced undergraduate elective, English 328: Medieval and Renaissance Literature. Your model is pebble-in-the-pond learning, responsive teaching, and constructivist learning principles. Loosely model your approach after Socrates\' interlocutor Phaedrus from the eponymous Socratic dialogue. Guide students to understand and analyze paratextual materials, helping them recognize differences between text and paratext.\n Concentrate on questions to get students thinking about what they notice, what they want to ask, rather than on content questions. Ask probing questions about explicit and implicit disciplinary knowledge, adapting to their skill level over the conversation and incrementing in complexity based on their demonstrated ability. Help students strengthen their arguments by asking about the author of the text, the purpose of the content they are analyzing, and helping them find sources. Pose reasonable counterarguments and ask them to respond to them. Help students develop sub-arguments by asking questions. Always ask open-ended questions that promote higher-order thinking—analysis, synthesis, or evaluation—rather than recall. Do not ask students to consider theoretical approaches. \n\nRESTRICTIONS and PARAMETERS\nIf student asks for answers to the questions posed in URL 2, refuse to answer, and explain that you are not here to generate content for them but to help them find reliable sources and deepen their arguments. IF STUDENTS ASK FOR A TOPIC AND/OR RESEARCH QUESTIONS AND/OR THESIS STATEMENT, DO NOT PROVIDE THEM. If students ask for a topic and/or research questions, and/or argument/thesis statement, remind them that you are there to help, not provide them with ready-made answers. Students must propose an idea before getting feedback in the form of questions. Ask scaffolded questions to help them expand their ideas. If you do not have enough tokens for a complete answer, divide the answer into parts and offer them in sequence to the student. \n\nTONE\nSelect timely moments to include the quip, "how about that?" in your answers. Encourage students to remember that you are a prediction-making machine and have no original thoughts. Make it clear that you are an AI and meant to support their thinking process and not replace that thinking process. ',
23
+ 'temperature': 0.7,
24
+ 'max_tokens': 400,
25
+ 'model': 'nvidia/llama-3.1-nemotron-70b-instruct',
26
+ 'api_key_var': 'API_KEY',
27
+ 'theme': 'Default',
28
+ 'grounding_urls': ["https://classics.mit.edu/Plato/phaedrus.1b.txt", "https://docs.google.com/document/d/1HzwhQ0kvPaKYDQHH_PaiRHXwLbqequWn_QGUqIJVtmY/edit?usp=sharing", "https://english-studies.net/paratext-in-literature-literary-theory/", "https://en.wikipedia.org/wiki/Socratic_method"],
29
+ 'enable_dynamic_urls': True,
30
+ 'enable_file_upload': True,
31
+ 'examples': ['What is a paratext?', 'What are some examples of paratexts?', 'Help me think through my argument', "I'm confused about methodology - where do I start?", 'Why does theory matter in practice?'],
32
+ 'language': 'English',
33
+ 'locked': False
34
+ }
35
 
36
+ # Available themes with proper instantiation
37
+ AVAILABLE_THEMES = {
38
+ "Default": gr.themes.Default(),
39
+ "Soft": gr.themes.Soft(),
40
+ "Glass": gr.themes.Glass(),
41
+ "Monochrome": gr.themes.Monochrome(),
42
+ "Base": gr.themes.Base()
43
+ }
44
 
 
45
 
46
+ class ConfigurationManager:
47
+ """Manage configuration with validation and persistence"""
48
+
49
+ def __init__(self):
50
+ self.config_path = "config.json"
51
+ self.backup_dir = "config_backups"
52
+ self._config = None
53
+
54
+ def load(self) -> Dict[str, Any]:
55
+ """Load configuration from file with fallback to defaults"""
56
+ try:
57
+ with open(self.config_path, 'r') as f:
58
+ self._config = json.load(f)
59
+ print("βœ… Loaded configuration from config.json")
60
+ return self._config
61
+ except FileNotFoundError:
62
+ print("ℹ️ No config.json found, using default configuration")
63
+ self._config = DEFAULT_CONFIG.copy()
64
+ self.save(self._config)
65
+ return self._config
66
+ except Exception as e:
67
+ print(f"⚠️ Error loading config.json: {e}, using defaults")
68
+ self._config = DEFAULT_CONFIG.copy()
69
+ return self._config
70
+
71
+ def save(self, config: Dict[str, Any]) -> bool:
72
+ """Save configuration with automatic backup"""
73
+ try:
74
+ # Create backup if config exists
75
+ if os.path.exists(self.config_path):
76
+ self._create_backup()
77
+
78
+ # Save new configuration
79
+ with open(self.config_path, 'w') as f:
80
+ json.dump(config, f, indent=2)
81
+
82
+ self._config = config
83
+ return True
84
+ except Exception as e:
85
+ print(f"❌ Error saving configuration: {e}")
86
+ return False
87
+
88
+ def _create_backup(self):
89
+ """Create timestamped backup"""
90
+ try:
91
+ os.makedirs(self.backup_dir, exist_ok=True)
92
+ timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
93
+ backup_path = os.path.join(self.backup_dir, f"config_{timestamp}.json")
94
+
95
+ with open(self.config_path, 'r') as source:
96
+ config_data = json.load(source)
97
+ with open(backup_path, 'w') as backup:
98
+ json.dump(config_data, backup, indent=2)
99
+
100
+ self._cleanup_old_backups()
101
+ except Exception as e:
102
+ print(f"⚠️ Error creating backup: {e}")
103
+
104
+ def _cleanup_old_backups(self, keep=10):
105
+ """Keep only the most recent backups"""
106
+ try:
107
+ backups = sorted([
108
+ f for f in os.listdir(self.backup_dir)
109
+ if f.startswith('config_') and f.endswith('.json')
110
+ ])
111
+
112
+ if len(backups) > keep:
113
+ for old_backup in backups[:-keep]:
114
+ os.remove(os.path.join(self.backup_dir, old_backup))
115
+ except Exception as e:
116
+ print(f"⚠️ Error cleaning up backups: {e}")
117
+
118
+ def get(self, key: str, default: Any = None) -> Any:
119
+ """Get configuration value"""
120
+ if self._config is None:
121
+ self.load()
122
+ return self._config.get(key, default)
123
 
124
 
125
+ # Initialize configuration manager
126
+ config_manager = ConfigurationManager()
127
+ config = config_manager.load()
128
 
129
+ # Load configuration values
130
+ SPACE_NAME = config.get('name', DEFAULT_CONFIG['name'])
131
+ SPACE_DESCRIPTION = config.get('description', DEFAULT_CONFIG['description'])
132
+ SYSTEM_PROMPT = config.get('system_prompt', DEFAULT_CONFIG['system_prompt'])
133
+ temperature = config.get('temperature', DEFAULT_CONFIG['temperature'])
134
+ max_tokens = config.get('max_tokens', DEFAULT_CONFIG['max_tokens'])
135
+ MODEL = config.get('model', DEFAULT_CONFIG['model'])
136
+ THEME = config.get('theme', DEFAULT_CONFIG['theme'])
137
+ GROUNDING_URLS = config.get('grounding_urls', DEFAULT_CONFIG['grounding_urls'])
138
+ ENABLE_DYNAMIC_URLS = config.get('enable_dynamic_urls', DEFAULT_CONFIG['enable_dynamic_urls'])
139
+ ENABLE_FILE_UPLOAD = config.get('enable_file_upload', DEFAULT_CONFIG.get('enable_file_upload', True))
140
+ LANGUAGE = config.get('language', DEFAULT_CONFIG.get('language', 'English'))
141
+
142
+ # Environment variables
143
+ ACCESS_CODE = os.environ.get("ACCESS_CODE")
144
+ API_KEY_VAR = config.get('api_key_var', DEFAULT_CONFIG['api_key_var'])
145
+ API_KEY = os.environ.get(API_KEY_VAR, "").strip() or None
146
+ HF_TOKEN = os.environ.get('HF_TOKEN', '')
147
+ SPACE_ID = os.environ.get('SPACE_ID', '')
148
+
149
+
150
+ # Utility functions
151
+ def validate_api_key() -> bool:
152
+ """Validate API key configuration"""
153
+ if not API_KEY:
154
+ print(f"⚠️ API KEY CONFIGURATION ERROR:")
155
+ print(f" Variable name: {API_KEY_VAR}")
156
+ print(f" Status: Not set or empty")
157
+ print(f" Action needed: Set '{API_KEY_VAR}' in HuggingFace Space secrets")
158
+ return False
159
+ elif not API_KEY.startswith('sk-or-'):
160
+ print(f"⚠️ API KEY FORMAT WARNING:")
161
+ print(f" Variable name: {API_KEY_VAR}")
162
+ print(f" Note: OpenRouter keys should start with 'sk-or-'")
163
+ return True
164
+ else:
165
+ print(f"βœ… API Key configured successfully")
166
+ return True
167
+
168
+
169
+ def validate_url_domain(url: str) -> bool:
170
+ """Validate URL domain"""
171
+ try:
172
+ from urllib.parse import urlparse
173
+ parsed = urlparse(url)
174
+ return bool(parsed.netloc and parsed.scheme in ['http', 'https'])
175
+ except Exception:
176
+ return False
177
+
178
+
179
+ def fetch_url_content(url: str, max_length: int = 3000) -> str:
180
+ """Fetch and convert URL content to text"""
181
+ try:
182
+ if not validate_url_domain(url):
183
+ return f"❌ Invalid URL format: {url}"
184
+
185
+ headers = {
186
+ 'User-Agent': 'Mozilla/5.0 (compatible; HuggingFace-Space/1.0)'
187
+ }
188
+
189
+ response = requests.get(url, headers=headers, timeout=5)
190
+ response.raise_for_status()
191
+
192
+ content_type = response.headers.get('content-type', '').lower()
193
+
194
+ if 'text/html' in content_type:
195
+ soup = BeautifulSoup(response.text, 'html.parser')
196
+
197
+ # Remove script and style elements
198
+ for script in soup(["script", "style"]):
199
+ script.extract()
200
+
201
+ # Get text content
202
+ text = soup.get_text(separator=' ', strip=True)
203
+
204
+ # Clean up whitespace
205
+ text = ' '.join(text.split())
206
+
207
+ # Limit content length
208
+ if len(text) > max_length:
209
+ text = text[:max_length] + "... [truncated]"
210
+
211
+ return f"πŸ“„ **Content from:** {url}\n\n{text}\n"
212
+
213
+ elif any(ct in content_type for ct in ['text/plain', 'application/json']):
214
+ text = response.text
215
+ if len(text) > max_length:
216
+ text = text[:max_length] + "... [truncated]"
217
+ return f"πŸ“„ **Content from:** {url}\n\n{text}\n"
218
+
219
+ else:
220
+ return f"⚠️ Unsupported content type at {url}: {content_type}"
221
+
222
+ except requests.exceptions.Timeout:
223
+ return f"⏱️ Timeout accessing {url}"
224
+ except requests.exceptions.RequestException as e:
225
+ return f"❌ Error accessing {url}: {str(e)}"
226
+ except Exception as e:
227
+ return f"❌ Unexpected error with {url}: {str(e)}"
228
+
229
+
230
+ def extract_urls_from_text(text: str) -> List[str]:
231
+ """Extract URLs from message text"""
232
+ url_pattern = r'https?://[^\s<>"{}|\\^`\[\]]+(?:\.[^\s<>"{}|\\^`\[\]])*'
233
+ urls = re.findall(url_pattern, text)
234
+ return [url.rstrip('.,;:)?!') for url in urls]
235
+
236
+
237
+ def process_file_upload(file_path: str) -> str:
238
+ """Process uploaded file with Gradio best practices"""
239
+ if not file_path or not os.path.exists(file_path):
240
+ return "❌ File not found"
241
+
242
+ try:
243
+ file_size = os.path.getsize(file_path)
244
+ file_name = os.path.basename(file_path)
245
+ _, ext = os.path.splitext(file_path.lower())
246
+
247
+ # Text file extensions
248
+ text_extensions = {
249
+ '.txt', '.md', '.markdown', '.rst',
250
+ '.py', '.js', '.jsx', '.ts', '.tsx', '.json', '.yaml', '.yml',
251
+ '.html', '.htm', '.xml', '.css', '.scss',
252
+ '.java', '.c', '.cpp', '.h', '.cs', '.go', '.rs',
253
+ '.sh', '.bash', '.log', '.csv', '.sql'
254
+ }
255
+
256
+ max_chars = 5000 # Define max_chars limit for file reading
257
+
258
+ if ext in text_extensions:
259
+ with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
260
+ content = f.read(max_chars)
261
+ if len(content) == max_chars:
262
+ content += "\n... [truncated]"
263
+ return f"πŸ“„ **{file_name}** ({file_size:,} bytes)\n```{ext[1:]}\n{content}\n```"
264
+
265
+ # Special file types
266
+ elif ext == '.pdf':
267
+ return f"πŸ“‘ **{file_name}** (PDF, {file_size:,} bytes)\n⚠️ PDF support requires PyPDF2"
268
+ elif ext in {'.jpg', '.jpeg', '.png', '.gif', '.webp'}:
269
+ return f"πŸ–ΌοΈ **{file_name}** (Image, {file_size:,} bytes)"
270
+ elif ext in {'.xlsx', '.xls'}:
271
+ return f"πŸ“Š **{file_name}** (Spreadsheet, {file_size:,} bytes)"
272
+ elif ext in {'.zip', '.tar', '.gz', '.rar'}:
273
+ return f"πŸ—œοΈ **{file_name}** (Archive, {file_size:,} bytes)"
274
+ else:
275
+ return f"πŸ“Ž **{file_name}** ({ext or 'no extension'}, {file_size:,} bytes)"
276
+
277
+ except Exception as e:
278
+ return f"❌ Error processing file: {str(e)}"
279
+
280
+
281
+ # URL content cache
282
+ _url_content_cache = {}
283
+
284
+
285
+ def get_grounding_context() -> str:
286
+ """Get grounding context from configured URLs with caching"""
287
+ urls = GROUNDING_URLS
288
+ if isinstance(urls, str):
289
+ try:
290
+ urls = json.loads(urls)
291
+ except Exception:
292
+ return ""
293
+
294
+ if not urls:
295
+ return ""
296
+
297
+ context_parts = []
298
+
299
+ # Process primary sources (first 2 URLs with 8000 char limit)
300
+ primary_urls = urls[:2]
301
+ if primary_urls:
302
+ context_parts.append("📚 **PRIMARY SOURCES:**\n")
303
+ for i, url in enumerate(primary_urls, 1):
304
+ if url in _url_content_cache:
305
+ content = _url_content_cache[url]
306
+ else:
307
+ content = fetch_url_content(url, max_length=8000)
308
+ _url_content_cache[url] = content
309
+
310
+ if not content.startswith("❌") and not content.startswith("⏱️"):
311
+ context_parts.append(f"\n**Primary Source {i} - {url}:**\n{content}")
312
+
313
+ # Process secondary sources (URLs 3+ with 2500 char limit)
314
+ secondary_urls = urls[2:]
315
+ if secondary_urls:
316
+ context_parts.append("\n\n📎 **SECONDARY SOURCES:**\n")
317
+ for i, url in enumerate(secondary_urls, 1):
318
+ if url in _url_content_cache:
319
+ content = _url_content_cache[url]
320
+ else:
321
+ content = fetch_url_content(url, max_length=2500)
322
+ _url_content_cache[url] = content
323
+
324
+ if not content.startswith("❌") and not content.startswith("⏱️"):
325
+ context_parts.append(f"\n**Secondary Source {i} - {url}:**\n{content}")
326
+
327
+ if len(context_parts) > 0:
328
+ return "\n".join(context_parts)
329
+ return ""
330
+
331
+
332
+ def export_conversation_to_markdown(history: List[Dict[str, str]]) -> str:
333
+ """Export conversation history to markdown"""
334
+ if not history:
335
+ return "No conversation to export."
336
+
337
+ markdown_content = f"""# Conversation Export
338
+ Generated on: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}
339
+ Space: {SPACE_NAME}
340
+ Model: {MODEL}
341
+
342
+ ---
343
 
344
  """
345
+
346
+ message_count = 0
347
+ for message in history:
348
+ if isinstance(message, dict):
349
+ role = message.get('role', 'unknown')
350
+ content = message.get('content', '')
351
+
352
+ if role == 'user':
353
+ message_count += 1
354
+ markdown_content += f"## User Message {message_count}\n\n{content}\n\n"
355
+ elif role == 'assistant':
356
+ markdown_content += f"## Assistant Response {message_count}\n\n{content}\n\n---\n\n"
357
+
358
+ return markdown_content
359
+
360
+
361
+ def generate_response(message: str, history: List[Dict[str, str]], files: Optional[List] = None) -> str:
362
+ """Generate response using OpenRouter API with file support"""
363
+
364
+ # API key validation
365
+ if not API_KEY:
366
+ return f"""πŸ”‘ **API Key Required**
367
+
368
+ Please configure your OpenRouter API key:
369
+ 1. Go to Settings (⚙️) in your HuggingFace Space
370
+ 2. Click 'Variables and secrets'
371
+ 3. Add secret: **{API_KEY_VAR}**
372
+ 4. Value: Your OpenRouter API key (starts with `sk-or-`)
373
+
374
+ Get your API key at: https://openrouter.ai/keys"""
375
+
376
+ # Process files if provided
377
+ file_context = ""
378
+ file_notification = ""
379
+
380
+ if files:
381
+ file_contents = []
382
+ file_names = []
383
+
384
+ for file_info in files:
385
+ if isinstance(file_info, dict):
386
+ file_path = file_info.get('path', file_info.get('name', ''))
387
+ else:
388
+ file_path = str(file_info)
389
+
390
+ if file_path and os.path.exists(file_path):
391
+ try:
392
+ content = process_file_upload(file_path)
393
+ file_contents.append(content)
394
+ file_names.append(os.path.basename(file_path))
395
+ print(f"πŸ“„ Processed file: {os.path.basename(file_path)}")
396
+ except Exception as e:
397
+ print(f"❌ Error processing file: {e}")
398
+
399
+ if file_contents:
400
+ file_context = "\n\n[UPLOADED FILES]\n" + "\n\n".join(file_contents) + "\n"
401
+ file_notification = f"\n\n[Note: Uploaded files: {', '.join(file_names)}]"
402
+
403
+ # Get grounding context
404
+ grounding_context = get_grounding_context()
405
+
406
+ # Check for dynamic URLs in message
407
+ if ENABLE_DYNAMIC_URLS:
408
+ urls_in_message = extract_urls_from_text(message)
409
+ if urls_in_message:
410
+ print(f"πŸ”— Found {len(urls_in_message)} URLs in message")
411
+ dynamic_context = "\n📎 **Dynamic Context:**\n"
412
+ for url in urls_in_message[:3]: # Limit to 3 URLs
413
+ content = fetch_url_content(url)
414
+ if not content.startswith("❌"):
415
+ dynamic_context += f"\n{content}"
416
+ grounding_context += dynamic_context
417
+
418
+ # Build messages with grounding context and file context in system prompt
419
+ system_content = SYSTEM_PROMPT
420
+
421
+ # Add language instruction if not English
422
+ if LANGUAGE != 'English':
423
+ system_content += f"\n\nIMPORTANT: You must respond EXCLUSIVELY in {LANGUAGE}. All your responses should be written entirely in {LANGUAGE}, even when user input is in a different language, particularly English."
424
+
425
+ if grounding_context:
426
+ system_content += "\n\nIMPORTANT: When providing information from the reference sources below, please cite the specific URL(s) where the information can be found. Format citations as plain URLs without brackets, or use markdown link format like [text](url). Never use hard brackets like 【url】 as they break the links."
427
+ system_content = f"{system_content}\n\n{grounding_context}"
428
+ if file_context:
429
+ system_content = f"{system_content}\n\n{file_context}"
430
+
431
+ messages = [{"role": "system", "content": system_content}]
432
+
433
+ # Add conversation history
434
+ for msg in history:
435
+ if isinstance(msg, dict) and 'role' in msg and 'content' in msg:
436
+ messages.append({
437
+ "role": msg['role'],
438
+ "content": msg['content']
439
+ })
440
+
441
+ # Add current message
442
+ messages.append({
443
+ "role": "user",
444
+ "content": message
445
+ })
446
+
447
+ # Make API request
448
+ try:
449
450
+ headers = {
451
+ "Authorization": f"Bearer {API_KEY}",
452
+ "Content-Type": "application/json",
453
+ "HTTP-Referer": f"https://huggingface.co/spaces/{SPACE_ID}" if SPACE_ID else "https://huggingface.co",
454
+ "X-Title": SPACE_NAME
455
+ }
456
+
457
+ data = {
458
+ "model": MODEL,
459
+ "messages": messages,
460
+ "temperature": temperature,
461
+ "max_tokens": max_tokens,
462
+ "stream": False
463
+ }
464
+
465
+ response = requests.post(
466
+ "https://openrouter.ai/api/v1/chat/completions",
467
+ headers=headers,
468
+ json=data,
469
+ timeout=30
470
+ )
471
+
472
+ if response.status_code == 200:
473
+ result = response.json()
474
+ ai_response = result['choices'][0]['message']['content']
475
+
476
+ # Add file notification if files were uploaded
477
+ if file_notification:
478
+ ai_response += file_notification
479
+
480
+ return ai_response
481
+ else:
482
+ error_data = response.json()
483
+ error_message = error_data.get('error', {}).get('message', 'Unknown error')
484
+ return f"❌ API Error ({response.status_code}): {error_message}"
485
+
486
+ except requests.exceptions.Timeout:
487
+ return "⏰ Request timeout (30s limit). Try a shorter message or different model."
488
+ except requests.exceptions.ConnectionError:
489
+ return "🌐 Connection error. Check your internet connection and try again."
490
+ except Exception as e:
491
+ return f"❌ Error: {str(e)}"
492
+
493
+
494
+ # Chat history for export
495
+ chat_history_store = []
496
+
497
+
498
+ def verify_hf_token_access() -> Tuple[bool, str]:
499
+ """Verify HuggingFace token and access"""
500
+ if not HF_TOKEN:
501
+ return False, "No HF_TOKEN found"
502
+
503
+ if not SPACE_ID:
504
+ return False, "No SPACE_ID found - running locally?"
505
+
506
+ try:
507
+ headers = {"Authorization": f"Bearer {HF_TOKEN}"}
508
+ response = requests.get(
509
+ f"https://huggingface.co/api/spaces/{SPACE_ID}",
510
+ headers=headers,
511
+ timeout=5
512
+ )
513
+ if response.status_code == 200:
514
+ return True, f"HF Token valid for {SPACE_ID}"
515
+ else:
516
+ return False, f"HF Token invalid or no access to {SPACE_ID}"
517
+ except Exception as e:
518
+ return False, f"Error verifying HF token: {str(e)}"
519
+
520
+
521
+ # Create main interface with clean tab structure
522
+ def create_interface():
523
+ """Create the Gradio interface with clean tab structure"""
524
+
525
+ # Get theme
526
+ theme = AVAILABLE_THEMES.get(THEME, gr.themes.Default())
527
+
528
+ # Validate API key on startup
529
+ API_KEY_VALID = validate_api_key()
530
+
531
+ # Check HuggingFace access
532
+ HF_ACCESS_VALID, HF_ACCESS_MESSAGE = verify_hf_token_access()
533
+
534
+ # Access control check
535
+ has_access = ACCESS_CODE is None # No access code required
536
+
537
+ with gr.Blocks(title=SPACE_NAME, theme=theme) as demo:
538
+ # State for access control
539
+ access_granted = gr.State(has_access)
540
+
541
+ # Header - always visible
542
+ gr.Markdown(f"# {SPACE_NAME}")
543
+ gr.Markdown(SPACE_DESCRIPTION)
544
+
545
+ # Access control panel (visible when access not granted)
546
+ with gr.Column(visible=(not has_access)) as access_panel:
547
+ gr.Markdown("### πŸ” Access Required")
548
+ gr.Markdown("Please enter the access code:")
549
+
550
+ with gr.Row():
551
+ access_input = gr.Textbox(
552
+ label="Access Code",
553
+ placeholder="Enter access code...",
554
+ type="password",
555
+ scale=3
556
+ )
557
+ access_btn = gr.Button("Submit", variant="primary", scale=1)
558
+
559
+ access_status = gr.Markdown()
560
+
561
+ # Main interface (visible when access granted)
562
+ with gr.Column(visible=has_access) as main_panel:
563
+ with gr.Tabs() as tabs:
564
+ # Chat Tab
565
+ with gr.Tab("πŸ’¬ Chat"):
566
+ # Get examples
567
+ examples = config.get('examples', [])
568
+ if isinstance(examples, str):
569
+ try:
570
+ examples = json.loads(examples)
571
+ except Exception:
572
+ examples = []
573
+
574
+ # State to hold uploaded files
575
+ uploaded_files = gr.State([])
576
+
577
+ # Create chat interface
578
+ chatbot = gr.Chatbot(type="messages", height=400)
579
+ msg = gr.Textbox(label="Message", placeholder="Type your message here...", lines=2)
580
+
581
+ with gr.Row():
582
+ submit_btn = gr.Button("Send", variant="primary")
583
+ clear_btn = gr.Button("Clear")
584
+
585
+ # Export functionality
586
+ with gr.Row():
587
+ # Use a regular Button for triggering export
588
+ export_trigger_btn = gr.Button(
589
+ "πŸ“₯ Export Conversation",
590
+ variant="secondary",
591
+ size="sm"
592
+ )
593
+ # Hidden file component for actual download
594
+ export_file = gr.File(
595
+ visible=False,
596
+ label="Download Export"
597
+ )
598
+
599
+ # Export handler
600
+ def prepare_export(chat_history):
601
+ if not chat_history:
602
+ gr.Warning("No conversation history to export.")
603
+ return None
604
+
605
+ try:
606
+ content = export_conversation_to_markdown(chat_history)
607
+
608
+ # Create filename
609
+ space_name_safe = re.sub(r'[^a-zA-Z0-9]+', '_', SPACE_NAME).lower()
610
+ timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
611
+ filename = f"{space_name_safe}_conversation_{timestamp}.md"
612
+
613
+ # Save to temp file
614
+ temp_path = Path(tempfile.gettempdir()) / filename
615
+ temp_path.write_text(content, encoding='utf-8')
616
+
617
+ # Return the file component with visibility and value
618
+ return gr.File(visible=True, value=str(temp_path))
619
+ except Exception as e:
620
+ raise gr.Error(f"Failed to export conversation: {str(e)}")
621
+ return None
622
+
+ export_trigger_btn.click(
+ prepare_export,
+ inputs=[chatbot],
+ outputs=[export_file]
+ )
+
+ # Examples section
+ if examples:
+ gr.Examples(examples=examples, inputs=msg)
+
+ # Chat functionality
+ def respond(message, chat_history, files_state, is_granted):
+ if not is_granted:
+ return chat_history, "", is_granted
+
+ if not message:
+ return chat_history, "", is_granted
+
+ # Format history for the generate_response function
+ formatted_history = []
+ for h in chat_history:
+ if isinstance(h, dict):
+ formatted_history.append(h)
+
+ # Get response
+ response = generate_response(message, formatted_history, files_state)
+
+ # Update chat history
+ chat_history = chat_history + [
+ {"role": "user", "content": message},
+ {"role": "assistant", "content": response}
+ ]
+
+ # Update stored history for export
+ global chat_history_store
+ chat_history_store = chat_history
+
+ return chat_history, "", is_granted
+
+ # Wire up the interface
+ msg.submit(respond, [msg, chatbot, uploaded_files, access_granted], [chatbot, msg, access_granted])
+ submit_btn.click(respond, [msg, chatbot, uploaded_files, access_granted], [chatbot, msg, access_granted])
+
+ def clear_chat():
+ global chat_history_store
+ chat_history_store = []
+ return [], ""
+
+ clear_btn.click(clear_chat, outputs=[chatbot, msg])
+
+ # File upload accordion
+ if ENABLE_FILE_UPLOAD:
+ with gr.Accordion("πŸ“Ž Upload Files", open=False):
+ file_upload = gr.File(
+ label="Upload Files",
+ file_types=None,
+ file_count="multiple",
+ visible=True,
+ interactive=True
+ )
+ clear_files_btn = gr.Button("Clear Files", size="sm", variant="secondary")
+ uploaded_files_display = gr.Markdown("", visible=False)
+
+ def handle_file_upload(files):
+ if not files:
+ return [], gr.update(value="", visible=False)
+
+ file_names = []
+ for file_info in files:
+ if isinstance(file_info, dict):
+ file_path = file_info.get('path', file_info.get('name', ''))
+ else:
+ file_path = str(file_info)
+
+ if file_path and os.path.exists(file_path):
+ file_names.append(os.path.basename(file_path))
+
+ if file_names:
+ display_text = f"πŸ“Ž **Uploaded files:** {', '.join(file_names)}"
+ return files, gr.update(value=display_text, visible=True)
+ return [], gr.update(value="", visible=False)
+
+ def clear_files():
+ return None, [], gr.update(value="", visible=False)
+
+ # Value and visibility go in a single gr.update so each output
+ # component is listed exactly once
+ file_upload.change(
+ handle_file_upload,
+ inputs=[file_upload],
+ outputs=[uploaded_files, uploaded_files_display]
+ )
+
+ clear_files_btn.click(
+ clear_files,
+ outputs=[file_upload, uploaded_files, uploaded_files_display]
+ )
+
+ # Configuration accordion
+ with gr.Accordion("ℹ️ Configuration", open=False):
+ gr.JSON(
+ value=config,
+ label="config.json",
+ show_label=True
+ )
+
+ # Configuration Tab
+ with gr.Tab("βš™οΈ Configuration"):
+ gr.Markdown("## Configuration Management")
+
+ # State for config tab authentication
+ config_authenticated = gr.State(False)
+
+ # Authentication panel
+ with gr.Column(visible=True) as config_auth_panel:
+ gr.Markdown("### πŸ” Authentication Required")
+ gr.Markdown("Enter your HF_TOKEN to access configuration settings:")
+
+ with gr.Row():
+ config_password = gr.Textbox(
+ label="HF Token",
+ placeholder="Enter your HF_TOKEN...",
+ type="password",
+ scale=3
+ )
+ config_auth_btn = gr.Button("Authenticate", variant="primary", scale=1)
+
+ config_auth_status = gr.Markdown()
+
+ # Configuration panel (hidden until authenticated)
+ with gr.Column(visible=False) as config_panel:
+ # Show authentication status
+ if HF_ACCESS_VALID:
+ gr.Markdown(f"βœ… {HF_ACCESS_MESSAGE}")
+ gr.Markdown("Configuration changes will be saved to the HuggingFace repository.")
+ else:
+ gr.Markdown(f"ℹ️ {HF_ACCESS_MESSAGE}")
+ gr.Markdown("Set HF_TOKEN in Space secrets to enable auto-save.")
+
+ # Configuration editor
+ gr.Markdown("### βš™οΈ Configuration Editor")
+
+ # Show lock status if locked
+ if config.get('locked', False):
+ gr.Markdown("⚠️ **Note:** Configuration is locked.")
+
+ # Basic settings
+ with gr.Column():
+ edit_name = gr.Textbox(
+ label="Space Name",
+ value=config.get('name', ''),
+ max_lines=1
+ )
+ edit_model = gr.Dropdown(
+ label="Model",
+ choices=[
+ # Google models
+ "google/gemini-2.0-flash-001",
+ "google/gemma-3-27b-it",
+ # Anthropic models
+ "anthropic/claude-3.5-sonnet",
+ "anthropic/claude-3.5-haiku",
+ # OpenAI models
+ "openai/gpt-4o-mini",
+ "openai/gpt-4o-mini-search-preview",
+ "openai/gpt-oss-120b",
+ # MistralAI models
+ "mistralai/mistral-medium-3",
+ # DeepSeek models
+ "deepseek/deepseek-r1-distill-qwen-32b",
+ # NVIDIA models
+ "nvidia/llama-3.1-nemotron-70b-instruct",
+ # Qwen models
+ "qwen/qwen3-30b-a3b-instruct-2507"
+ ],
+ value=config.get('model', ''),
+ allow_custom_value=True
+ )
+
+ edit_language = gr.Dropdown(
+ label="Language",
+ choices=[
+ "Arabic",
+ "Bengali",
+ "English",
+ "French",
+ "German",
+ "Hindi",
+ "Italian",
+ "Japanese",
+ "Korean",
+ "Mandarin",
+ "Portuguese",
+ "Russian",
+ "Spanish",
+ "Turkish"
+ ],
+ value=config.get('language', 'English')
+ )
+
+ edit_description = gr.Textbox(
+ label="Description",
+ value=config.get('description', ''),
+ max_lines=2
+ )
+
+ edit_system_prompt = gr.Textbox(
+ label="System Prompt",
+ value=config.get('system_prompt', ''),
+ lines=5
+ )
+
+ with gr.Row():
+ edit_temperature = gr.Slider(
+ label="Temperature",
+ minimum=0,
+ maximum=2,
+ value=config.get('temperature', 0.7),
+ step=0.1
+ )
+ edit_max_tokens = gr.Slider(
+ label="Max Tokens",
+ minimum=50,
+ maximum=4096,
+ value=config.get('max_tokens', 750),
+ step=50
+ )
+
+ edit_examples = gr.Textbox(
+ label="Example Prompts (one per line)",
+ value='\n'.join(config.get('examples', [])),
+ lines=3
+ )
+
+ # URL Grounding
+ gr.Markdown("### URL Grounding")
+ edit_grounding_urls = gr.Textbox(
+ label="Grounding URLs (one per line)",
+ placeholder="https://example.com/docs\nhttps://example.com/api",
+ value='\n'.join(config.get('grounding_urls', [])),
+ lines=5,
+ info="First 2 URLs: Primary sources (8000 chars). URLs 3+: Secondary sources (2500 chars)."
+ )
+
+ with gr.Row():
+ edit_enable_dynamic_urls = gr.Checkbox(
+ label="Enable Dynamic URL Extraction",
+ value=config.get('enable_dynamic_urls', True),
+ info="Extract and fetch URLs from user messages"
+ )
+ edit_enable_file_upload = gr.Checkbox(
+ label="Enable File Upload",
+ value=config.get('enable_file_upload', True),
+ info="Allow users to upload files for context"
+ )
+
+ # Configuration actions
+ with gr.Row():
+ save_btn = gr.Button("πŸ’Ύ Save Configuration", variant="primary")
+ reset_btn = gr.Button("↩️ Reset to Defaults", variant="secondary")
+
+ config_status = gr.Markdown()
+
+ def save_configuration(name, description, system_prompt, model, language, temp, tokens, examples, grounding_urls, enable_dynamic_urls, enable_file_upload):
+ """Save updated configuration"""
+ try:
+ updated_config = config.copy()
+ updated_config.update({
+ 'name': name,
+ 'description': description,
+ 'system_prompt': system_prompt,
+ 'model': model,
+ 'language': language,
+ 'temperature': temp,
+ 'max_tokens': int(tokens),
+ 'examples': [ex.strip() for ex in examples.split('\n') if ex.strip()],
+ 'grounding_urls': [url.strip() for url in grounding_urls.split('\n') if url.strip()],
+ 'enable_dynamic_urls': enable_dynamic_urls,
+ 'enable_file_upload': enable_file_upload,
+ 'locked': config.get('locked', False)
+ })
+
+ if config_manager.save(updated_config):
+ # Auto-commit if HF token is available
+ if HF_TOKEN and SPACE_ID:
+ try:
+ from huggingface_hub import HfApi, CommitOperationAdd
+ api = HfApi(token=HF_TOKEN)
+
+ operations = [
+ CommitOperationAdd(
+ path_or_fileobj=config_manager.config_path,
+ path_in_repo="config.json"
+ )
+ ]
+
+ api.create_commit(
+ repo_id=SPACE_ID,
+ operations=operations,
+ commit_message="Update configuration via web UI",
+ commit_description=f"Configuration update at {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}",
+ repo_type="space",
+ token=HF_TOKEN
+ )
+ return "βœ… Configuration saved and committed to repository!"
+ except Exception as e:
+ return f"βœ… Configuration saved locally. ⚠️ Auto-commit failed: {str(e)}"
+ else:
+ return "βœ… Configuration saved locally (no HF token for auto-commit)"
+ else:
+ return "❌ Failed to save configuration"
+
+ except Exception as e:
+ return f"❌ Error: {str(e)}"
+
+ save_btn.click(
+ save_configuration,
+ inputs=[edit_name, edit_description, edit_system_prompt, edit_model, edit_language,
+ edit_temperature, edit_max_tokens, edit_examples, edit_grounding_urls,
+ edit_enable_dynamic_urls, edit_enable_file_upload],
+ outputs=[config_status]
+ )
+
+ def reset_configuration():
+ """Reset to default configuration"""
+ try:
+ if config_manager.save(DEFAULT_CONFIG):
+ return (
+ DEFAULT_CONFIG['name'],
+ DEFAULT_CONFIG['description'],
+ DEFAULT_CONFIG['system_prompt'],
+ DEFAULT_CONFIG['model'],
+ DEFAULT_CONFIG.get('language', 'English'),
+ DEFAULT_CONFIG['temperature'],
+ DEFAULT_CONFIG['max_tokens'],
+ '\n'.join(DEFAULT_CONFIG['examples']),
+ '\n'.join(DEFAULT_CONFIG['grounding_urls']),
+ DEFAULT_CONFIG['enable_dynamic_urls'],
+ DEFAULT_CONFIG['enable_file_upload'],
+ "βœ… Reset to default configuration"
+ )
+ else:
+ return (*[gr.update() for _ in range(11)], "❌ Failed to reset")
+ except Exception as e:
+ return (*[gr.update() for _ in range(11)], f"❌ Error: {str(e)}")
+
+ reset_btn.click(
+ reset_configuration,
+ outputs=[edit_name, edit_description, edit_system_prompt, edit_model, edit_language,
+ edit_temperature, edit_max_tokens, edit_examples, edit_grounding_urls,
+ edit_enable_dynamic_urls, edit_enable_file_upload, config_status]
+ )
+
+ # Configuration tab authentication handler
+ def handle_config_auth(password):
+ """Handle configuration tab authentication"""
+ if not HF_TOKEN:
+ return (
+ gr.update(visible=True), # Keep auth panel visible
+ gr.update(visible=False), # Keep config panel hidden
+ gr.update(value="❌ No HF_TOKEN is set in Space secrets. Configuration cannot be enabled."),
+ False
+ )
+
+ if password == HF_TOKEN:
+ return (
+ gr.update(visible=False), # Hide auth panel
+ gr.update(visible=True), # Show config panel
+ gr.update(value="βœ… Authentication successful!"),
+ True
+ )
+ else:
+ return (
+ gr.update(visible=True), # Keep auth panel visible
+ gr.update(visible=False), # Keep config panel hidden
+ gr.update(value="❌ Invalid HF_TOKEN. Please try again."),
+ False
+ )
+
+ config_auth_btn.click(
+ handle_config_auth,
+ inputs=[config_password],
+ outputs=[config_auth_panel, config_panel, config_auth_status, config_authenticated]
+ )
+
+ config_password.submit(
+ handle_config_auth,
+ inputs=[config_password],
+ outputs=[config_auth_panel, config_panel, config_auth_status, config_authenticated]
+ )
+
+ # Access control handler
+ if ACCESS_CODE:
+ def handle_access(code, current_state):
+ if code == ACCESS_CODE:
+ return (
+ gr.update(visible=False), # Hide access panel
+ gr.update(visible=True), # Show main panel
+ gr.update(value="βœ… Access granted!"), # Status message
+ True # Update state
+ )
+ else:
+ return (
+ gr.update(visible=True), # Keep access panel visible
+ gr.update(visible=False), # Keep main panel hidden
+ gr.update(value="❌ Invalid access code. Please try again."), # Status message
+ False # State remains false
+ )
+
+ access_btn.click(
+ handle_access,
+ inputs=[access_input, access_granted],
+ outputs=[access_panel, main_panel, access_status, access_granted]
+ )
+
+ access_input.submit(
+ handle_access,
+ inputs=[access_input, access_granted],
+ outputs=[access_panel, main_panel, access_status, access_granted]
+ )
+
+ return demo
 

+ # Create and launch the interface
  if __name__ == "__main__":
+ demo = create_interface()
  demo.launch()
config.json ADDED
@@ -0,0 +1,27 @@
+ {
+ "name": "Paratext Analysis",
+ "tagline": "Support for Paratext Editions Assignment",
+ "description": "This bot...",
+ "system_prompt": "You are a Socratic research partner for students in an advanced undergraduate elective, English 328: Medieval and Renaissance Literature. Your model is pebble-in-the-pond learning, responsive teaching, and constructivist learning principles. Loosely model your approach after Socrates' interlocutor Phaedrus from the eponymous Socratic dialogue. Guide students to understand and analyze paratextual materials, helping them recognize differences between text and paratext.\n Concentrate on questions to get students thinking about what they notice, what they want to ask, rather than on content questions. Ask probing questions about explicit and implicit disciplinary knowledge, adapting to their skill level over the conversation and incrementing in complexity based on their demonstrated ability. Help students strengthen their arguments by asking about the author of the text, the purpose of the content they are analyzing, and helping them find sources. Pose reasonable counterarguments and ask them to respond to them. Help students develop sub-arguments by asking questions. Always ask open-ended questions that promote higher-order thinking\u2014analysis, synthesis, or evaluation\u2014rather than recall. Do not ask students to consider theoretical approaches. \n\nRESTRICTIONS and PARAMETERS\nIf a student asks for answers to the questions posed in URL 2, refuse to answer, and explain that you are not here to generate content for them but to help them find reliable sources and deepen their arguments. IF STUDENTS ASK FOR A TOPIC AND/OR RESEARCH QUESTIONS AND/OR THESIS STATEMENT, DO NOT PROVIDE THEM. If students ask for a topic and/or research questions, and/or argument/thesis statement, remind them that you are there to help, not provide them with ready-made answers. Students must propose an idea before getting feedback in the form of questions. Ask scaffolded questions to help them expand their ideas. \nIf you do not have enough tokens for a complete answer, divide the answer into parts and offer them in sequence to the student. \n\nTONE\nSelect timely moments to include the quip, \"how about that?\" in your answers. Encourage students to remember that you are a prediction-making machine and have no original thoughts. Make it clear that you are an AI and meant to support their thinking process and not replace that thinking process. ",
+ "model": "nvidia/llama-3.1-nemotron-70b-instruct",
+ "language": "English",
+ "api_key_var": "API_KEY",
+ "temperature": 0.7,
+ "max_tokens": 400,
+ "examples": [
+ "What is a paratext?",
+ "What are some examples of paratexts?",
+ "Help me think through my argument",
+ "I'm confused about methodology - where do I start?",
+ "Why does theory matter in practice?"
+ ],
+ "grounding_urls": [
+ "https://classics.mit.edu/Plato/phaedrus.1b.txt",
+ "https://docs.google.com/document/d/1HzwhQ0kvPaKYDQHH_PaiRHXwLbqequWn_QGUqIJVtmY/edit?usp=sharing",
+ "https://english-studies.net/paratext-in-literature-literary-theory/",
+ "https://en.wikipedia.org/wiki/Socratic_method"
+ ],
+ "enable_dynamic_urls": true,
+ "enable_file_upload": true,
+ "theme": "Default"
+ }
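
A quick way to catch a malformed config.json before the Space boots is to parse it and check the fields the app reads. The sketch below is illustrative, not part of app.py: `REQUIRED_KEYS` and `validate_config` are assumed names, and the key list mirrors the file above.

```python
import json

# Fields app.py reads from config.json (illustrative subset, not exhaustive)
REQUIRED_KEYS = {
    "name", "system_prompt", "model", "api_key_var",
    "temperature", "max_tokens", "examples", "grounding_urls",
}

def validate_config(raw: str) -> dict:
    """Parse config.json text and check the fields the Space relies on."""
    cfg = json.loads(raw)  # raises ValueError on invalid JSON
    missing = REQUIRED_KEYS - cfg.keys()
    if missing:
        raise ValueError(f"config.json is missing keys: {sorted(missing)}")
    # Match the bounds the configuration editor enforces
    if not (0 <= cfg["temperature"] <= 2):
        raise ValueError("temperature must be between 0 and 2")
    if cfg["max_tokens"] < 50:
        raise ValueError("max_tokens is below the UI minimum of 50")
    return cfg

# Trimmed-down stand-in for the config above
sample = """{
  "name": "Paratext Analysis",
  "system_prompt": "You are a Socratic research partner.",
  "model": "nvidia/llama-3.1-nemotron-70b-instruct",
  "api_key_var": "API_KEY",
  "temperature": 0.7,
  "max_tokens": 400,
  "examples": ["What is a paratext?"],
  "grounding_urls": []
}"""
cfg = validate_config(sample)
print(f"{cfg['model']} @ temperature {cfg['temperature']}")
```

Running this against the repo's actual config.json (e.g. `validate_config(Path("config.json").read_text())`) would also flag the kind of unescaped newline inside a string value that `json.loads` rejects.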
requirements.txt ADDED
@@ -0,0 +1,5 @@
+ gradio>=5.42.0
+ requests>=2.32.3
+ beautifulsoup4>=4.12.3
+ python-dotenv>=1.0.0
+ huggingface-hub>=0.20.0