Your Name | Recent Paper Notes

Recent Reading Log

Short, opinionated notes on papers I have read recently. Each entry highlights the core idea and what I found useful.

Attention Is All You Need

Introduces the Transformer architecture using self-attention for sequence modeling, removing recurrence and convolution.

Read notes →
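The paper's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch (array shapes and names are illustrative, not from the paper's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_queries, n_keys) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of value vectors

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))  # 3 queries, d_k = 8
K = rng.standard_normal((5, 8))  # 5 keys
V = rng.standard_normal((5, 8))  # 5 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one output vector per query
```

Multi-head attention in the paper runs several of these in parallel on learned projections of Q, K, and V, then concatenates the results.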

GraphRAG

Combines retrieval-augmented generation with a knowledge graph: entities and relations are extracted from the corpus and summarized by community, letting an LLM answer global, corpus-wide questions that plain chunk retrieval misses.

Read notes →

UAV-DETR

An end-to-end DETR-style detector adapted for UAV imagery, targeting small, densely packed objects in aerial scenes.

Read notes →

DocRes

A generalist model that unifies several document image restoration tasks — dewarping, deshadowing, appearance enhancement, deblurring, and binarization — in a single network steered by task-specific prompts.

Read notes →

EraseNet

An end-to-end GAN for scene text removal with a coarse-to-refine generator and a text-segmentation head to localize strokes; introduced alongside the real-world SCUT-EnsText benchmark.

Read notes →

RepViT

Revisits MobileNet-style CNN design from a ViT perspective, producing pure-CNN mobile backbones with strong accuracy-latency trade-offs, aided by structural re-parameterization.

Read notes →

ViT

Splits an image into fixed-size patches and feeds them as tokens to a standard Transformer encoder; with large-scale pretraining, matches or beats CNNs on image classification.

Read notes →

DETR

Casts object detection as direct set prediction: a transformer encoder-decoder with bipartite (Hungarian) matching removes the need for anchors and NMS post-processing.

Read notes →

RT-DETR

A real-time DETR: an efficient hybrid encoder and improved query selection push end-to-end transformer detection onto the speed-accuracy curve of YOLO-series detectors.

Read notes →

DenseNet

Connects each layer to every subsequent layer within a dense block, encouraging feature reuse and improving gradient flow while using fewer parameters than comparable ResNets.

Read notes →

CRNN

Combines a CNN feature extractor, a bidirectional LSTM sequence model, and CTC loss for end-to-end scene text recognition without character-level segmentation.

Read notes →