Abhijeet, FNU Sudhakar (2025) Transformer-based architectures for domain-aware clinical text automation. World Journal of Advanced Engineering Technology and Sciences, 16 (2). pp. 251-258. ISSN 2582-8266
Abstract
Automating clinical text processing is essential to building a more efficient, accurate, and scalable healthcare system. Electronic health records, diagnostic reports, and unstructured clinical narratives feature dense terminology, erratic structure, and domain-specific semantics that traditional natural language processing methods could not handle. Transformer architectures have emerged as a transformative solution: their self-attention and contextual embeddings capture long-range dependencies and fine-grained word patterns. Combined with domain-aware methods such as biomedical pretraining, ontology integration, federated learning, and privacy-preserving training, these models enable automated clinical documentation, coding, decision support, and multimodal data integration with greater accuracy and regulatory compliance. This paper reviews the evolution of transformer models in clinical NLP, the domain adaptation methods they employ, approaches to scalability and observability, and future research opportunities such as benchmarking multi-region failover and cost-aware autoscaling of AI-driven health infrastructure.
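The abstract credits self-attention with capturing long-range dependencies and contextual word patterns. The sketch below illustrates scaled dot-product self-attention in PyTorch; the clinical sentence, dimensions, and random weights are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def self_attention(x: torch.Tensor, w_q: torch.Tensor,
                   w_k: torch.Tensor, w_v: torch.Tensor) -> torch.Tensor:
    """Scaled dot-product self-attention over one token sequence.

    x: (seq_len, d_model) token embeddings, e.g. for a clinical sentence.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v          # project into query/key/value spaces
    d_k = q.size(-1)
    scores = q @ k.transpose(0, 1) / d_k ** 0.5  # pairwise token affinities, scaled
    weights = F.softmax(scores, dim=-1)          # each token attends over the whole sequence
    return weights @ v                           # contextualized embeddings

# Illustrative shapes only: 7 tokens (e.g. "patient denies chest pain on exertion .")
# embedded in 64 dimensions; the projections would normally be learned, not random.
seq_len, d_model = 7, 64
x = torch.randn(seq_len, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # torch.Size([7, 64])
```

Because every token attends over the entire sequence, a term like "exertion" can directly condition the representation of "pain", which is the mechanism the abstract points to for modeling long-range clinical context.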
| Item Type: | Article |
|---|---|
| Official URL: | https://doi.org/10.30574/wjaets.2025.16.2.1280 |
| Uncontrolled Keywords: | Transformers; Clinical NLP; Domain Adaptation; Clinical Automation; Healthcare AI |
| Date Deposited: | 15 Sep 2025 05:45 |
| URI: | https://eprint.scholarsrepository.com/id/eprint/6073 |