<!DOCTYPE html>
<html lang="en-GB">
<head>
<meta charset="utf-8" />
<title>Transformer Attention Visualiser</title>
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="author" content="Ayaka, Nixie (Jing Hua)" />
<meta name="keywords" content="Transformer, BERT, attention mechanism, attention matrices, visualiser" />
<meta name="description" content="TrAVis is a Transformer attention visualiser that can visualise BERT attention matrices in your browser." />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:site" content="@ayaka14732" />
<meta name="twitter:title" content="Transformer Attention Visualiser" />
<meta name="twitter:image" content="https://ayaka14732.github.io/TrAVis/demo/twitter.png" />
<meta name="twitter:description" content="TrAVis is a Transformer attention visualiser that can visualise BERT attention matrices in your browser." />
<script src="https://cdn.jsdelivr.net/pyodide/v0.21.3/full/pyodide.js"></script>
<script src="https://d3js.org/d3.v7.min.js"></script>
<script src="visualiser.js"></script>
<script defer src="demo_result.js"></script>
<script defer src="index.js"></script>
<link rel="stylesheet" href="Rubik Light.css" />
<link rel="stylesheet" href="index.css" />
</head>
<body>
<div class="loading-indicator warning">The web page is loading the model weights. While you are waiting, you can explore the attention matrices of the BERT model for the default sentence.</div>
<h1>TrAVis: Transformer Attention Visualiser</h1>
<div class="input-container">
<div class="webflow-style-input">
<input type="text" id="userInput" value="A wealthy white-collar worker from Costa Rica working in our marketing agency has set up a marine conservation centre." />
<button id="visualise-input" class="hidden" onclick="handleVisualise(true)">
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 448 512">
<!--! Font Awesome Pro 6.2.0 by @fontawesome - https://fontawesome.com License - https://fontawesome.com/license (Commercial License) Copyright 2022 Fonticons, Inc. -->
<path fill="white" d="M438.6 278.6c12.5-12.5 12.5-32.8 0-45.3l-160-160c-12.5-12.5-32.8-12.5-45.3 0s-12.5 32.8 0 45.3L338.8 224 32 224c-17.7 0-32 14.3-32 32s14.3 32 32 32l306.7 0L233.4 393.4c-12.5 12.5-12.5 32.8 0 45.3s32.8 12.5 45.3 0l160-160z" />
</svg>
</button>
<button>
<div class="lds-ring loading-indicator">
<div></div>
<div></div>
<div></div>
<div></div>
</div>
</button>
</div>
</div>
<div class="slider-wrapper">
<div class="slider">
<label for="layer-input">Layer 6</label>
<input type="range" min="0" max="11" value="6" step="1" id="layer-input" onchange="handleVisualise()" />
</div>
<div class="slider">
<label for="head-input">Head 9</label>
<input type="range" min="0" max="11" value="9" step="1" id="head-input" onchange="handleVisualise()" />
</div>
</div>
<div id="my_dataviz"></div>
<main>
<h2>What is this?</h2>
<p>TrAVis (source code on <a href="https://github.com/ayaka14732/TrAVis">GitHub</a>) is a Transformer Attention Visualiser. The idea of visualising attention matrices is inspired by <a href="https://arxiv.org/abs/1409.0473"><em>Neural Machine Translation by Jointly Learning to Align and Translate</em></a>.</p>
<p>The original Transformer paper was titled <a href="https://arxiv.org/abs/1706.03762"><em>Attention Is All You Need</em></a>, reflecting the centrality of the attention mechanism to <a href="https://huggingface.co/docs/transformers/model_summary">Transformer-based models</a>. These models generate attention matrices while computing attention; the matrices indicate how the models process the input data and can therefore be seen as a concrete representation of the mechanism.</p>
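<p>As a concrete illustration (a minimal sketch, not code from TrAVis itself), the following NumPy function computes scaled dot-product attention; the <code>weights</code> array it returns is exactly the kind of attention matrix described above. The shapes and random inputs are illustrative assumptions.</p>
<pre><code>import numpy as np

def scaled_dot_product_attention(q, k, v):
    # q, k, v: arrays of shape (seq_len, d_k)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # softmax over each row: one distribution of weights per query position
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # `weights` is the attention matrix that TrAVis-style tools visualise
    return weights @ v, weights

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(5, 8)) for _ in range(3))
output, attn = scaled_dot_product_attention(q, k, v)
print(attn.shape)  # (5, 5): one weight per (query position, key position) pair
</code></pre>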
<p>In the <a href="https://arxiv.org/abs/1810.04805">BERT</a> Base Uncased model, for example, there are 12 Transformer layers, each layer contains 12 heads, and each head generates one attention matrix, so a single forward pass yields 144 attention matrices. TrAVis is a tool for visualising these matrices.</p>
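<p>As an aside, the same matrices can be inspected outside the browser with the HuggingFace <code>transformers</code> library (a different toolchain from the in-browser implementation TrAVis uses). A minimal sketch, assuming the <code>transformers</code> and <code>torch</code> packages are installed:</p>
<pre><code>import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("TrAVis visualises attention.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple of 12 tensors (one per layer), each of
# shape (batch, 12 heads, seq_len, seq_len): 144 matrices in total
attn = outputs.attentions[6][0, 9]  # layer 6, head 9, as in the sliders above
print(attn.shape)
</code></pre>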
<h2>Why is it important?</h2>
<p>Despite the popularity of Transformer-based models, people often utilise them by simply running the training scripts, ignoring what is going on inside the model. TrAVis helps us to better understand how Transformer-based models work internally, enabling us to exploit them more effectively and inspiring improvements to the model architecture.</p>
<h2>How does it work?</h2>
<p>The project consists of four parts.</p>
<p>Firstly, we <a href="https://github.com/ayaka14732/bart-base-jax"><b>implemented</b></a> the <a href="https://arxiv.org/abs/1910.13461">BART</a> model from scratch using <a href="https://github.com/google/jax">JAX</a>. We chose JAX because it is an amazing deep learning framework that enables us to write clear source code, and JAX code can be easily converted to NumPy, which can be executed in-browser. We chose the BART model because it is a complete encoder-decoder model, so the code can be easily adapted to other models, such as BERT, by simply taking a subset of the source code.</p>
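<p>The sketch below (a simplified illustration, not the actual <code>bart-base-jax</code> source) shows why this conversion is straightforward: a JAX attention function reads like NumPy, so swapping <code>jax.numpy</code> for <code>numpy</code> yields browser-runnable code.</p>
<pre><code>import jax.numpy as jnp
from jax.nn import softmax

def attention(q, k, v):
    # q, k, v: (..., seq_len, d_k); replacing `jnp` with `np` and
    # jax.nn.softmax with a hand-written NumPy softmax gives code
    # that runs on plain NumPy in the browser.
    scores = q @ k.swapaxes(-1, -2) / jnp.sqrt(q.shape[-1])
    weights = softmax(scores, axis=-1)
    return weights @ v, weights
</code></pre>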
<p>Secondly, we <a href="https://github.com/ztjhz/word-piece-tokenizer"><b>implemented</b></a> the <a href="https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer">HuggingFace BERT Tokeniser</a> in pure Python, as pure Python can be more easily executed in-browser. Moreover, we have optimised the tokenisation algorithm, making it 57% faster than the original HuggingFace implementation.</p>
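<p>At its core, the WordPiece algorithm used by the BERT tokeniser is a greedy longest-match-first search over a vocabulary. Here is a simplified pure-Python sketch with a toy vocabulary; the real tokeniser also handles casing, punctuation and unknown characters.</p>
<pre><code># Simplified WordPiece tokenisation with a toy vocabulary.
VOCAB = {"un", "##aff", "##able", "[UNK]"}

def wordpiece(word, vocab=VOCAB):
    tokens, start = [], 0
    while start != len(word):
        # greedy longest-match-first: try the longest remaining substring
        for end in range(len(word), start, -1):
            piece = word[start:end]
            if start:
                piece = "##" + piece  # continuation pieces carry a ## prefix
            if piece in vocab:
                tokens.append(piece)
                start = end
                break
        else:
            return ["[UNK]"]  # no piece matched
    return tokens

print(wordpiece("unaffable"))  # ['un', '##aff', '##able']
</code></pre>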
<p>Thirdly, we use <a href="https://pyodide.org/">Pyodide</a> to run our Python code in the browser. Pyodide supports all packages implemented in pure Python, with <a href="https://pyodide.org/en/stable/usage/packages-in-pyodide.html">additional support</a> for a number of other libraries such as NumPy and SciPy.</p>
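<p>Concretely, the JavaScript side calls <code>loadPyodide()</code> and then executes Python source with <code>pyodide.runPython(...)</code>. Below is a minimal sketch of a Python snippet that could run this way, assuming the NumPy package has been loaded via <code>pyodide.loadPackage("numpy")</code>.</p>
<pre><code># Runs inside the browser via pyodide.runPython(...); assumes the NumPy
# package has already been loaded with pyodide.loadPackage("numpy").
import numpy as np
from js import document  # Pyodide exposes the browser globals as the `js` module

x = np.arange(6).reshape(2, 3)
document.title = f"sum = {x.sum()}"  # write a result back into the page
</code></pre>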
<p>Fourthly, we visualise the attention matrices in our web application using <a href="https://d3js.org/">d3.js</a>.</p>
</main>
<footer class="centered"><div>Brought to you with <span class="emoji">❤️</span> by </div><div><a href="https://github.com/ayaka14732">Ayaka</a> and <a href="https://github.com/ztjhz">Nixie (Jing Hua)</a></div></footer>
</body>
</html>