
Activity

Improve clarity and fix typos

mtanghu pushed 1 commit to main • ddf7bcc…5498928 • on Jun 21, 2023

Explain some results from the project

mtanghu pushed 1 commit to main • ecc5b4c…ddf7bcc • on Jun 21, 2023

Describe the general project in a README

mtanghu pushed 1 commit to main • 0ffe713…ecc5b4c • on Jun 21, 2023

Try lowering leverage and see scaling again

mtanghu pushed 1 commit to main • 93e0a3c…0ffe713 • on May 26, 2023

Try multi-sigmoid trade loss to boost trading

mtanghu pushed 1 commit to main • e59313c…93e0a3c • on May 25, 2023

Remove gconv and robust loss files (unused)

mtanghu pushed 1 commit to main • 976d25b…e59313c • on May 25, 2023

Try robust loss from Ma & Huang 2020

mtanghu pushed 1 commit to main • da3672e…976d25b • on May 18, 2023

Remove sgconv naming (due to switch to FlashMHA)

mtanghu pushed 1 commit to main • dfe11c0…da3672e • on May 18, 2023

Experiment with norms, go back to layernorm

mtanghu pushed 1 commit to main • a77995c…dfe11c0 • on May 18, 2023

Try rotary, rmsnorm and normal gelu activation

mtanghu pushed 1 commit to main • 127e28d…a77995c • on May 15, 2023

Create only majors dataset again

mtanghu pushed 1 commit to main • d5d566f…127e28d • on May 15, 2023

Turn off biases on output layers

mtanghu pushed 1 commit to main • bcec40b…d5d566f • on May 6, 2023

Turn off final layer norm (fixed scaling?)

mtanghu pushed 1 commit to main • 4ef5097…bcec40b • on May 6, 2023

Try an embedding norm to help with stability

mtanghu pushed 1 commit to main • d36cee9…4ef5097 • on May 6, 2023

Use log1p and set max loss to .5 for inverse math

mtanghu pushed 1 commit to main • cbb3552…d36cee9 • on May 3, 2023

Try small, regularized, and only trade loss runs

mtanghu pushed 1 commit to main • 68139c8…cbb3552 • on Apr 29, 2023

Fix embedding dropout naming bug

mtanghu pushed 1 commit to main • 4d957c9…68139c8 • on Apr 26, 2023

Apply embedding dropout

mtanghu force-pushed to main • a3afe34…4d957c9 • on Apr 26, 2023

Apply and embedding dropout

mtanghu pushed 1 commit to main • 7c1fbc0…a3afe34 • on Apr 26, 2023

Try some larger runs (not scaling, divergence)

mtanghu pushed 1 commit to main • 9a13ff5…7c1fbc0 • on Apr 21, 2023

Rerun normal data processing with max classes

mtanghu pushed 1 commit to main • 9af5f60…9a13ff5 • on Apr 20, 2023

Start preparing/planning for scaling experiment

mtanghu pushed 1 commit to main • dd0db37…9af5f60 • on Apr 20, 2023

Turn off SGConv and non linear embedding

mtanghu pushed 1 commit to main • 773b71d…dd0db37 • on Apr 20, 2023

Adjust layers for scaling and speed

mtanghu pushed 1 commit to main • 1e90bff…773b71d • on Apr 20, 2023

Clean up validation splitting to 120 days

mtanghu pushed 1 commit to main • 0a3735a…1e90bff • on Apr 20, 2023

Write out martingale robustness

mtanghu pushed 1 commit to main • a687040…0a3735a • on Apr 17, 2023

Organize and analyze random data experiment

mtanghu pushed 1 commit to main • 6836dda…a687040 • on Apr 17, 2023

Run random data experiments (positive results)

mtanghu pushed 1 commit to main • bc9e33a…6836dda • on Apr 16, 2023

Set randomization parameter legitimately

mtanghu pushed 1 commit to main • 9eb7b2a…bc9e33a • on Apr 16, 2023

Use theoretical randomness and improve stability

mtanghu pushed 1 commit to main • 8248c19…9eb7b2a • on Apr 15, 2023