QiaoNPC committed
Commit 355a00d · 1 Parent(s): 602a074

Fix readme.md: I deleted the configuration before, adding it back now

Files changed (1):
  README.md (+11 -11)
README.md CHANGED
@@ -1,14 +1,3 @@
-# PwnAI Demo
-
-## Overview
-PwnAI is an educational event that explores adversarial machine learning techniques, specifically focusing on attacking Image Classifiers and Language Model (LM) Prompt Injections. This repository contains a demo showcasing how adversarial attacks can be applied to image classifiers.
-
-## Demo Description
-The demo includes two example pictures that appear very similar but are classified differently. Users can interact with the demo by submitting both pictures for inference, allowing them to observe how the machine learning model's classification can be manipulated through adversarial attacks. Users can also submit their own pictures to play around.
-
-## Performance Note
-Please note that this demo runs on a free-tier CPU, so its performance may be slow.
-
 ---
 title: PwnAI Image Classification Demo
 emoji: 😻
@@ -21,3 +10,14 @@ pinned: false
 ---
 
 Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+
+# PwnAI Demo
+
+## Overview
+PwnAI is an educational event that explores adversarial machine learning techniques, specifically focusing on attacking Image Classifiers and Language Model (LM) Prompt Injections. This repository contains a demo showcasing how adversarial attacks can be applied to image classifiers.
+
+## Demo Description
+The demo includes two example pictures that appear very similar but are classified differently. Users can interact with the demo by submitting both pictures for inference, allowing them to observe how the machine learning model's classification can be manipulated through adversarial attacks. Users can also submit their own pictures to play around.
+
+## Performance Note
+Please note that this demo runs on a free-tier CPU, so its performance may be slow.
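
The README above describes perturbing an image so that a classifier's prediction flips while the picture looks unchanged. The commit does not say which model or attack the demo uses, so the following is only a minimal NumPy sketch of one common technique, the Fast Gradient Sign Method (FGSM), applied to a hypothetical toy linear softmax classifier rather than a real image model:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(W, b, x):
    """Class probabilities of input x under a linear softmax model."""
    return softmax(W @ x + b)

def fgsm(W, b, x, true_label, eps):
    """One-step FGSM: move x by eps along the sign of the loss gradient.

    For cross-entropy loss on a linear softmax model, the gradient of the
    loss w.r.t. x is W^T (p - onehot(true_label)).
    """
    p = predict(W, b, x)
    onehot = np.zeros_like(p)
    onehot[true_label] = 1.0
    grad_x = W.T @ (p - onehot)
    return x + eps * np.sign(grad_x)

# Toy 2-class model and input (hypothetical stand-ins for the demo's
# image classifier and example pictures).
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 8))
b = np.zeros(2)
x = rng.normal(size=8)

clean_pred = int(np.argmax(predict(W, b, x)))
x_adv = fgsm(W, b, x, true_label=clean_pred, eps=0.5)
adv_pred = int(np.argmax(predict(W, b, x_adv)))
print("clean prediction:", clean_pred, "adversarial prediction:", adv_pred)
```

Because the perturbation is bounded by `eps` in each coordinate, the adversarial input stays close to the original, which is why the demo's two example pictures can look nearly identical yet be classified differently.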