Spaces: Running on Zero
Commit · c92251d
1 Parent(s): ca8a6c5
Update
README.md
CHANGED
```diff
@@ -16,6 +16,7 @@ This is a space to try out the
 attempts to infer a function name, comment/description, and suitable variable
 names, when given the output of Hex-Rays decompiler output of a function. More information is available in this [blog post](https://www.atredis.com/blog/2024/6/3/how-to-train-your-large-language-model).

-## TODO
+## TODO / Issues

 * We currently use `transformers` which de-quantizes the gguf. This is easy but inefficient. Can we use llama.cpp or Ollama with zerogpu?
+* Model returns the markdown json prefix often. Is this something I am doing wrong?
```
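The new TODO item notes that the model often prepends a markdown ```json fence to its answer. One common workaround (not part of this commit — a hedged sketch with an assumed helper name) is to strip any surrounding fence before parsing:

```python
import json
import re

def strip_markdown_fence(text: str) -> str:
    """Remove a leading ```json fence and trailing ``` that chat-tuned
    models often wrap around JSON answers; return the inner text.
    If no fence is present, return the text unchanged (stripped)."""
    match = re.match(r"^\s*```(?:json)?\s*(.*?)\s*```\s*$", text, re.DOTALL)
    return match.group(1) if match else text.strip()

# Fenced model output parses cleanly once the fence is removed.
raw = '```json\n{"function_name": "parse_header"}\n```'
print(json.loads(strip_markdown_fence(raw))["function_name"])  # parse_header
```

Whether this belongs in `predict` or the prompt should be adjusted instead is exactly the open question the TODO raises.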
app.py
CHANGED
```diff
@@ -72,7 +72,7 @@ def predict(code):

     output = pipe_out[0]["generated_text"]

-    json_output = json.dumps(
+    json_output = json.dumps([])
     try:
         json.loads(output)
         json_output = output
@@ -87,7 +87,7 @@ def predict(code):
 demo = gr.Interface(
     fn=predict,
     inputs=gr.Text(label="Hex-Rays decompiler output"),
-    outputs=[gr.
+    outputs=[gr.JSON(label="Aidapal Output as JSON"), gr.Text(label="Raw Aidapal Output")],
     description=frontmatter.load("README.md").content,
     examples=examples
 )
```
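The first hunk's intent — default to an empty JSON list, then keep the raw model output only if it parses as valid JSON — can be sketched standalone (the function name here is assumed, not from the diff):

```python
import json

def to_json_output(output: str) -> str:
    """Return the model output if it is valid JSON; otherwise fall back
    to an empty JSON list, mirroring the json.dumps([]) default above."""
    json_output = json.dumps([])  # safe default: "[]"
    try:
        json.loads(output)        # validate without re-serializing
        json_output = output
    except json.JSONDecodeError:
        pass                      # keep the empty-list fallback
    return json_output

print(to_json_output('{"a": 1}'))   # {"a": 1}
print(to_json_output("not json"))   # []
```

Validating with `json.loads` but returning the original string preserves the model's own formatting, which suits the second hunk's `gr.JSON` / raw-text output pair.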