Simon L Linwood
Overview
Explore the profile of Simon L Linwood, including associated specialties, affiliations, and a list of published articles.
Author names and details appear as published. Due to indexing inconsistencies, multiple individuals may share a name, and a single author's name may appear with variations. MedLuna displays this data as publicly available, without modification or verification.
Snapshot
Articles
5
Citations
54
Followers
0
Related Specialties
Top 10 Co-Authors
Published In
Affiliations
Affiliations will be listed here soon.
Recent Articles
1.
Boch S, Omololu S, Fan M, Murnan A, Kelleher K, Linwood S, et al.
J Health Care Poor Underserved. 2024 Aug;35(3):777-789. PMID: 39129601
Little is known about clinical documentation for youth exposed to parental justice involvement (e.g., parole, probation, jail, prison). We reviewed the electronic health records of 100 youth with at least one...
2.
Lee H, Li B, DeForte S, Splaingard M, Huang Y, Chi Y, et al.
Sci Data. 2022 Jul;9(1):421. PMID: 35853958
Despite being crucial to health and quality of life, sleep, especially pediatric sleep, is not yet well understood. This is exacerbated by a lack of access to sufficient pediatric sleep data with clinical...
3.
Sirrianni J, Sezgin E, Claman D, Linwood S
Methods Inf Med. 2022 Jul;61(5-06):195-200. PMID: 35835447
Background: Generative pretrained transformer (GPT) models are among the latest large pretrained natural language processing models; they enable model training with limited data and reduce dependency on large datasets,...
4.
Zeng X, Linwood S, Liu C
Sci Rep. 2022 Mar;12(1):3651. PMID: 35256645
The adoption of electronic health records (EHR) has become universal during the past decade, which has afforded in-depth data-based research. By learning from the large amount of healthcare data, various...
5.
Sezgin E, Sirrianni J, Linwood S
JMIR Med Inform. 2022 Feb;10(2):e32875. PMID: 35142635
Generative pretrained transformer models have become popular recently due to their enhanced capabilities and performance. In contrast to many existing artificial intelligence models, generative pretrained transformer models can perform with...