PKU-Alignment Group @Pair-Lab (under construction)
Tianyi Qiu
Undergraduate Student
Peking University
Interests
Value Alignment
Scalable Oversight
Human-AI Interaction
AI Societal Impact
Latest
AI Alignment: A Comprehensive Survey
ProgressGym: Alignment with a Millennium of Moral Progress
Reward Generalization in RLHF: A Topological Perspective
Align Anything: Training All-Modality Models to Follow Instructions with Language Feedback
Language Models Resist Alignment: Evidence From Data Compression
PKU-SafeRLHF: Towards Multi-Level Safety Alignment for LLMs with Human Preference