mstdn.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A general-purpose Mastodon server with a 500 character limit. All languages are welcome.

#transformer

Funbie Studios ➤ TFNation 2025 (Aug 8-10): Here's a little lore callback for Alchemist Prime and how he was inspired by Maccadam - a spikey lil warhammer! #3dprint files now up at www.patreon.com/posts/133569593 - have fun adding this to your Alchemist Prime too! #3dprint #3dprinted #transformer
Habr: Part 4: Mamba — State Space Models vs transformers. Mamba is a revolution in processing long sequences! Mamba and State Space Models versus transformers: which is better?! https://habr.com/ru/articles/925416/ #mamba #transformer #nlp #ssm
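For context on the comparison: a transformer re-reads its whole history at every step, while an SSM carries a fixed-size state forward. A minimal, hypothetical sketch of the underlying state-space recurrence (dimensions and parameters are illustrative stand-ins, not the article's code):

```python
import numpy as np

# Discrete state-space recurrence: h_t = A h_{t-1} + B x_t, y_t = C h_t.
# Illustrative only; real Mamba layers make A, B, C input-dependent
# ("selective") and use parallel scans instead of this sequential loop.
def ssm_scan(x, A, B, C):
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:              # O(sequence_length) time,
        h = A @ h + B * x_t    # constant-size state per step
        ys.append(C @ h)
    return np.array(ys)

rng = np.random.default_rng(0)
A = 0.9 * np.eye(4)            # stable state transition
B = rng.normal(size=4)
C = rng.normal(size=4)
y = ssm_scan(rng.normal(size=1000), A, B, C)
print(y.shape)                 # (1000,)
```

The state `h` never grows with context length, which is exactly what the "long sequences" pitch is about.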
Anime of Japan: https://www.wacoca.com/anime/1863651/ [The new Ultraman is amazing!?] A complete rundown of how to play with the Omega toys released July 5, including the DX Omega Slugger ultimate role-play set [Ultraman Omega] https://www.youtube.com/channel/UCUNo0xntGxghrW7TKunOSxA #2025Summer #2025SummerAnime #2025年夏開始の新作アニメ #Anime #Candy #chainese #hasbro #korea #Review #Toy #transformer #アニメ #ウルトラマンオメガ #タカラトミー #ダガング #トランスフォーマー #中国 #合体 #変形 #新作アニメ #韓国 #食玩 #駄玩具
Hacker News: We reimagined Transformer architectures inspired by nature's hidden structures https://ieeexplore.ieee.org/document/10754699 #HackerNews #Transformer #Architectures #Nature #Inspired #AI #Hidden #Structures
Habr: Part 3: Diffusion Transformer (DiT) — Stable Diffusion 3 as it really is. In this article we dive into image generation with the Diffusion Transformer (DiT), the heart of Stable Diffusion 3, and break down how it is built and how it works. https://habr.com/ru/articles/924410/ #stable_diffusion #transformer #diffusion #vae
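A minimal, hypothetical sketch of the core DiT idea the article covers: a transformer block over latent patches, conditioned on the diffusion timestep through adaptive layer norm. All module names and sizes here are illustrative stand-ins, not SD3's actual code:

```python
import torch
import torch.nn as nn

class TinyDiTBlock(nn.Module):
    def __init__(self, dim, heads=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim, elementwise_affine=False)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim, elementwise_affine=False)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        # The timestep embedding regresses per-block scale/shift (the adaLN trick).
        self.ada = nn.Linear(dim, 4 * dim)

    def forward(self, x, t_emb):
        s1, b1, s2, b2 = self.ada(t_emb).unsqueeze(1).chunk(4, dim=-1)
        h = self.norm1(x) * (1 + s1) + b1
        x = x + self.attn(h, h, h, need_weights=False)[0]
        h = self.norm2(x) * (1 + s2) + b2
        return x + self.mlp(h)

x = torch.randn(2, 64, 128)   # batch of 2, 64 latent patches, width 128
t = torch.randn(2, 128)       # stand-in for a sinusoidal timestep embedding
print(TinyDiTBlock(128)(x, t).shape)  # torch.Size([2, 64, 128])
```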
Arthur Hau, PhD🐶🐱🌱🎵🦣: It is hard to work with LLMs because they have message length limits and context window limits. The current #Transformer model is highly inefficient: for every conversation, over 100,000 tokens may be processed and held in an #LLM's memory. As the conversation progresses, old bits of information are replaced by new chat bits. As my #browser dashboard project grows, it is getting very difficult to communicate with #Claude4 #Sonnet. It is all about project #management. #LLMs #AI
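The inefficiency described above follows from self-attention comparing every token with every other token. A rough, illustrative cost model (layer, head, and dimension counts are made-up stand-ins, not any particular model's):

```python
# Vanilla self-attention builds an n x n score matrix per layer per head,
# so compute grows quadratically and the cached keys/values grow linearly
# with context length n.
def attention_costs(n_tokens, n_layers=32, n_heads=32, d_head=128):
    scores = n_layers * n_heads * n_tokens**2              # pairwise score entries
    kv_cache = 2 * n_layers * n_heads * n_tokens * d_head  # cached K and V values
    return scores, kv_cache

for n in (1_000, 10_000, 100_000):
    scores, kv = attention_costs(n)
    print(f"{n:>7} tokens: {scores:.2e} score entries, {kv:.2e} cached KV values")
```

Going from 10,000 to 100,000 tokens multiplies the score work by 100, which is why long chats feel so much heavier than short ones.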
Arthur Hau, PhD🐶🐱🌱🎵🦣: #Claude4 #Sonnet just created a PDF editor inside our #ICandy #browser dashboard. Now I can double-click a #pdf #Ebook in my #ELibrary module in ICandy, and it opens inside ICandy for me to read, highlight, and annotate. #AI #LLM #LLMs It is not easy to work with LLMs or other AIs, but it is not difficult either. You only need to know some basic computer operations and the theory behind the current #transformer and #CNN LLM models.
Habr: Part 2: Vision Transformer (ViT) — when transformers learned to see. Imagine a linguist suddenly becoming an expert in painting. That is exactly what happened in 2020, when a text-processing architecture, the transformer, learned to "see" images. Vision Transformer (ViT) proved that convolutions are not required to understand pictures. We break down, in plain terms, how it is built and how images turn into predictions. https://habr.com/ru/articles/922868/ #visual_transformer #vit #transformer #computervision #разбор_статьи
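A minimal, hypothetical sketch of ViT's central move, assuming nothing from the article's code: cut the image into patches, embed each patch as a token, and run a standard transformer encoder over the token sequence:

```python
import torch
import torch.nn as nn

patch, dim = 16, 192
# A strided conv patchifies and embeds in one step: each 16x16 patch -> one token.
to_tokens = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(dim, 10)  # toy 10-class classifier

img = torch.randn(1, 3, 224, 224)                    # one RGB image
tokens = to_tokens(img).flatten(2).transpose(1, 2)   # (1, 196, 192): 14x14 patches
tokens = tokens + torch.randn(1, tokens.shape[1], dim) * 0.02  # stand-in positional info
logits = head(encoder(tokens).mean(dim=1))           # mean-pool instead of a [CLS] token
print(logits.shape)                                  # torch.Size([1, 10])
```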
hideosnes: The following line made my day: `let foo = writable<bar() | null>(null)` Henceforth, I declare, feeling keen kinship with the level of anal retentiveness of TypeScript. #improving #transformer #js #ai #code with #safeCode
InfoQ: Tired of #AI search hallucinations? The root cause often lies in the architecture behind most AI models: the #Transformer. In this #InfoQ article, Albert Lie explains how #StateSpaceModels (#SSMs) can fix this, and what it could mean for the future of AI search. Read now: https://bit.ly/4l57iQt #LLMs
Christian Mayer: XSLT — https://github.com/pacocoursey/xslt #XML #XSLT #HTML #JSON #Markdown #plaintext #transformer
:rss: Qiita - Trending Articles: An introduction to how GPT works: how does AI "understand" and "generate" language? https://qiita.com/KYoshiyama/items/d86ae5afc29650f0b924?utm_campaign=popular_items&utm_medium=feed&utm_source=popular_items #qiita #word2vec #Transformer #ChatGPT #LLM #ContextualEmbedding
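A toy contrast of the two embedding styles named in the tags, word2vec-style static vectors versus transformer contextual vectors; the model and token ids are illustrative stand-ins:

```python
import torch
import torch.nn as nn

vocab, dim = 100, 32
ids = torch.tensor([[5, 7], [5, 9]])   # word 5 next to two different neighbors

emb = nn.Embedding(vocab, dim)
static = emb(ids)                      # lookup table: one fixed vector per word id
print(torch.allclose(static[0, 0], static[1, 0]))  # True: same word, same vector

layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
layer.eval()                           # disable dropout for a deterministic comparison
ctx = layer(static)                    # attention mixes in the neighbors
print(torch.allclose(ctx[0, 0], ctx[1, 0]))        # False: context changed the vector
```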
Robotics: #Drone car 🚗 #robot https://www.youtube.com/watch?v=5w7pl7xQGKM #Transformer
Arthur Hau, PhD🐶🐱🌱🎵🦣: How good is #Google's #Gemma 3 12B small #LLM #AI, running in #LMStudio on my i9 / RTX 4060 Dell #laptop? I would say it is a very good #Japanese #learning companion. Most LLMs use very similar #transformer models, so the differences between these smaller models come largely from their training data; their small size keeps them from training the way large LLMs do. I told Gemma 3 to create a short story in Japanese and to follow up with its English translation.
Habr: Building an LLM from scratch. A short overview of the course I recently finished putting together on Stepik: Building an LLM from scratch. It is a hands-on course in which you build your own LLM from the ground up, from the tokenizer to text generation, using only Python and low-level PyTorch, without relying on any high-level libraries. The course is paid; for the next two weeks the promo code FIRST gives a 50% discount. https://habr.com/ru/articles/918568/ #llm #gpt #transformer
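As a taste of the final step such a course builds toward, here is a hedged sketch of greedy next-token generation; `model` is a hypothetical stand-in for the decoder the course has you build:

```python
import torch

# Greedy decoding: feed the ids so far, take the most likely next token,
# append, repeat. `model` is assumed to map (batch, seq) token ids to
# (batch, seq, vocab) logits.
@torch.no_grad()
def generate(model, ids, max_new_tokens=20):
    for _ in range(max_new_tokens):
        logits = model(ids)                     # (1, seq, vocab)
        next_id = logits[:, -1].argmax(dim=-1)  # greedy pick from the last position
        ids = torch.cat([ids, next_id[:, None]], dim=1)
    return ids

# Toy stand-in model so the sketch runs end to end.
vocab = 50
model = lambda ids: torch.randn(ids.shape[0], ids.shape[1], vocab)
print(generate(model, torch.tensor([[1, 2, 3]])).shape)  # torch.Size([1, 23])
```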
50+ Music: "Walk on the Wild Side" is a song by American #rock musician #LouReed from his second solo studio album, #Transformer (1972). It was produced by #DavidBowie and #MickRonson and released as a #doubleAside with "#PerfectDay". Known as a counterculture anthem, the song received heavy radio play and became Reed's biggest hit and #signatureSong while touching on topics considered taboo at the time, such as #transgender people, #drugs, #maleProstitution, and #oralSex. https://www.youtube.com/watch?v=3wffYJ5URPE
Hacker News: JavelinGuard: Low-Cost Transformer Architectures for LLM Security https://arxiv.org/abs/2506.07330 #HackerNews #JavelinGuard #LLM #Security #Transformer #Architecture #LowCost #AI #Research
Anthropy: ML, AI, LLM from absolute scratch with own data and solar power
𝕒𝕓𝕕𝕖𝕣𝕘𝕠: An in-depth guide to training transformer models from scratch, including dataset curation, building a custom tokenizer, and multi-GPU training with PyTorch Accelerate. It focuses on a Python code-generation model trained on GitHub data, addressing challenges such as bias and scaling. Practical examples of code generation and evaluation underline the importance of domain-specific tokenization. #KI #MaschinellesLernen #Transformer
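A small sketch of the custom-tokenizer step the guide describes, using the Hugging Face `tokenizers` library; the corpus and vocabulary size are illustrative stand-ins for the GitHub data:

```python
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Train a byte-pair-encoding tokenizer on a domain corpus so that common
# code idioms become single tokens instead of many character fragments.
tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel()
trainer = trainers.BpeTrainer(vocab_size=8000, special_tokens=["[UNK]", "[PAD]"])

corpus = ["def add(a, b):\n    return a + b\n"] * 100  # stand-in for real repos
tokenizer.train_from_iterator(corpus, trainer=trainer)
print(tokenizer.encode("def mul(a, b): return a * b").tokens)
```

On real code data, a domain-trained vocabulary tends to keep constructs like `def` or `return` intact, which is the point the guide makes about domain-specific tokenization.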
疲倦的森林仍有群鸟扑翅如千百万个蝴蝶破茧而出: #senart #opm #OPMEG #擎威 #transformer After group members pointed out that a reaction meme she had casually saved was drawn by the rival fandom, this girl picked up her pen and angrily drew her own: