From 429ae46cd0bc3d90bca3e8b90d742cf9aa98d5e2 Mon Sep 17 00:00:00 2001
From: Yan Lin
Date: Sat, 31 Jan 2026 16:38:03 +0100
Subject: [PATCH] adjust reference section style

---
 content/ml-tech/ode-sde/index.md   |  2 +-
 content/ml-tech/rotary-pe/index.md | 16 +++++++++-------
 2 files changed, 10 insertions(+), 8 deletions(-)

diff --git a/content/ml-tech/ode-sde/index.md b/content/ml-tech/ode-sde/index.md
index 5d74349..b6a9a47 100644
--- a/content/ml-tech/ode-sde/index.md
+++ b/content/ml-tech/ode-sde/index.md
@@ -261,7 +261,7 @@ Below are some preliminary results I obtained from a set of amorphous material g
 
 ---
 
-**References:**
+## References
 
 1. Holderrieth and Erives, "An Introduction to Flow Matching and Diffusion Models."
 2. Song and Ermon, "Generative Modeling by Estimating Gradients of the Data Distribution."
diff --git a/content/ml-tech/rotary-pe/index.md b/content/ml-tech/rotary-pe/index.md
index f35066c..a3837e9 100644
--- a/content/ml-tech/rotary-pe/index.md
+++ b/content/ml-tech/rotary-pe/index.md
@@ -164,10 +164,12 @@ LongRoPE also introduces a progressive extension strategy. Rather than jumping d
 
 ![](longrope.webp)
 
-> **References:**
->
-> 1. RoFormer: Enhanced transformer with Rotary Position Embedding (2024). Su, Jianlin and Ahmed, Murtadha and Lu, Yu and Pan, Shengfeng and Bo, Wen and Liu, Yunfeng.
-> 2. Extending context window of large language models via positional interpolation (2023). Chen, Shouyuan and Wong, Sherman and Chen, Liangjian and Tian, Yuandong.
-> 3. YaRN: Efficient Context Window Extension of Large Language Models (2023). Peng, Bowen and Quesnelle, Jeffrey and Fan, Honglu and Shippole, Enrico.
-> 4. Resonance rope: Improving context length generalization of large language models (2024). Wang, Suyuchen and Kobyzev, Ivan and Lu, Peng and Rezagholizadeh, Mehdi and Liu, Bang.
-> 5. LongRoPE: Extending LLM Context Window Beyond 3 Million Tokens (2024). Ding, Yiran and Zhang, Li Lyna and Zhang, Chengruidong and Xu, Yuanyuan and Shang, Ning and Xu, Jiahang and Yang, Fan and Yang, Mao.
+---
+
+## References
+
+1. RoFormer: Enhanced transformer with Rotary Position Embedding (2024). Su, Jianlin and Ahmed, Murtadha and Lu, Yu and Pan, Shengfeng and Bo, Wen and Liu, Yunfeng.
+2. Extending context window of large language models via positional interpolation (2023). Chen, Shouyuan and Wong, Sherman and Chen, Liangjian and Tian, Yuandong.
+3. YaRN: Efficient Context Window Extension of Large Language Models (2023). Peng, Bowen and Quesnelle, Jeffrey and Fan, Honglu and Shippole, Enrico.
+4. Resonance rope: Improving context length generalization of large language models (2024). Wang, Suyuchen and Kobyzev, Ivan and Lu, Peng and Rezagholizadeh, Mehdi and Liu, Bang.
+5. LongRoPE: Extending LLM Context Window Beyond 3 Million Tokens (2024). Ding, Yiran and Zhang, Li Lyna and Zhang, Chengruidong and Xu, Yuanyuan and Shang, Ning and Xu, Jiahang and Yang, Fan and Yang, Mao.