Codesota · Models · ASDFormer · Research · 2 results · 1 benchmark
Model card

ASDFormer.

Research · open-source · Transformer with Mixture of Experts

Transformer-based model with a Mixture-of-Experts (MoE) decoder. Ranks near the top of the ABIDE I benchmark (#2/9 on AUC) with interpretable attention mechanisms.
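The MoE decoder pattern named above can be sketched in a few lines. Note this is a toy illustration under assumed shapes and a mean-pooling gate, not ASDFormer's actual architecture: expert count, dimensions, and weight names here are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_classifier(tokens, gate_w, expert_ws):
    """Toy mixture of pooling-classifier experts (illustrative only).

    tokens:    (seq_len, d_model) encoder output
    gate_w:    (d_model, n_experts) gating weights
    expert_ws: list of (d_model, n_classes) per-expert classifier weights
    """
    pooled = tokens.mean(axis=0)            # mean-pool the token sequence
    gates = softmax(pooled @ gate_w)        # (n_experts,) mixture weights
    logits = sum(g * (pooled @ w)           # gate-weighted sum of expert logits
                 for g, w in zip(gates, expert_ws))
    return softmax(logits), gates

d_model, n_experts, n_classes = 16, 4, 2
tokens = rng.normal(size=(10, d_model))
gate_w = rng.normal(size=(d_model, n_experts))
expert_ws = [rng.normal(size=(d_model, n_classes)) for _ in range(n_experts)]
probs, gates = moe_classifier(tokens, gate_w, expert_ws)
```

The gate softly routes each pooled representation across experts; interpretability comes from inspecting `gates` to see which expert dominated a given prediction.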

§ 01 · Benchmarks

Every benchmark ASDFormer has a recorded score for.

#  | Benchmark | Area · Task                      | Metric   | Value | Rank   | Date       | Source
01 | ABIDE I   | Medical · Disease Classification | AUC      | 81.2% | #2/9   | 2025-08-19 | source ↗
02 | ABIDE I   | Medical · Disease Classification | Accuracy | 74.6% | #12/24 | 2025-08-19 | source ↗
The Rank column shows this model's position among all models scored on the same benchmark and metric (total competitors after the slash). #1 marks the current SOTA. Rows are sorted by rank, then by newest result.
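For reference, the two metrics in the table can be computed from per-subject scores as below. The data here is a toy example, not ABIDE results; the rank-sum formulation of AUC is a standard equivalent of the ROC-curve integral.

```python
def auc(labels, scores):
    """ROC AUC via the rank-sum (Mann-Whitney) view: the probability that
    a random positive case scores above a random negative, ties counted half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def accuracy(labels, scores, threshold=0.5):
    """Fraction of subjects whose thresholded score matches the label."""
    preds = [int(s >= threshold) for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

labels = [1, 1, 1, 0, 0, 0]                # toy diagnosis labels
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]    # toy classifier scores
print(auc(labels, scores))       # 8/9: one positive (0.4) ranks below one negative (0.6)
print(accuracy(labels, scores))  # 4/6: two subjects cross the 0.5 threshold wrongly
```

AUC is threshold-free while accuracy depends on the cutoff, which is why the two metrics can rank the same model very differently, as in the table above.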
§ 02 · Strengths by area

Where ASDFormer actually performs.

Medical · 1 benchmark · avg rank #7.0
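The avg rank figure is consistent with a plain arithmetic mean over this model's per-metric ranks in the area; that averaging rule is an assumption, not documented by the site.

```python
# Ranks from the benchmarks table above: #2 (AUC) and #12 (accuracy) on ABIDE I.
ranks = [2, 12]
avg_rank = sum(ranks) / len(ranks)
print(avg_rank)  # 7.0
```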
§ 03 · Papers

1 paper with results for ASDFormer.

  1. ASDFormer: A Transformer with Mixtures of Pooling-Classifier Experts for Autism Spectrum Disorder Diagnosis (2025-08-19 · 2 results)

§ 04 · Related models

Other Research models scored on Codesota.

DenseNet-121 (Chest X-ray) · 8M params · 4 results · 2 SOTA
SimpleNet · 2 results · 2 SOTA
DGN · 1 result · 1 SOTA
DeepASD · 1 result · 1 SOTA
DefectDet (ResNet) · 1 result · 1 SOTA
PROXI · 1 result · 1 SOTA
ASD-SWNet · 2 results
EfficientAD · 2 results
§ 05 · Sources & freshness

Where these numbers come from.

1 paper · 2 results · 2 of 2 result rows marked verified.