
Commit 37c5f86 (merge commit; 2 parents: 8a4d194 + 284eeaf)

File tree: 2 files changed (+9 additions, -83 deletions)

.eslintrc.json
Lines changed: 1 addition & 2 deletions

@@ -24,7 +24,6 @@
     "no-return-assign": "off",
     "react/no-unescaped-entities": "off",
     "max-len": "off",
-    "camelcase": "off",
-    "linebreak-style": ["error", "windows"]
+    "camelcase": "off"
   }
 }
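For context, dropping the `linebreak-style` rule means ESLint no longer flags line endings at all, so the repository now accepts both LF and CRLF. After the change, the tail of the `rules` object would read as below (a sketch assuming the surrounding file is otherwise unchanged):

```json
{
  "rules": {
    "no-return-assign": "off",
    "react/no-unescaped-entities": "off",
    "max-len": "off",
    "camelcase": "off"
  }
}
```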

src/Components/Main.jsx
Lines changed: 8 additions & 81 deletions

@@ -88,7 +88,7 @@ function Main() {
   We used these metrics to train machine learning classifiers to predict
   which nodes would be critical to language and speech. Example data (C–E)
   are provided from a single participant (n = 1) for each visualization.
-  Source data are provided as a Source Data file
+  Source data are provided as a Source Data file.
   <Col>
   <Carousel
   interval={null}

@@ -111,6 +111,7 @@ function Main() {
   <Image fluid src={CorticalSites5} />
   </Carousel.Item>
   </Carousel>
+
   {index1 === 0 && (
   <p>
   "A DES was used either intraoperatively (depicted) or in the

@@ -139,9 +140,10 @@ function Main() {
   learning classifiers to predict which nodes would be critical to
   language and speech. Example data (C–E) are provided from a single
   participant (n = 1) for each visualization. Source data are
-  provided as a Source Data file. "
+  provided as a Source Data file."
   </p>
   )}
+
   {index1 === 1 && (
   <p>
   "PC participation coefficient, S strength, CC clustering

@@ -168,6 +170,7 @@ function Main() {
   {' '}
   </p>
   )}
+
   {index1 === 2 && (
   <p>
   "PC participation coefficient, S strength, CC clustering

@@ -231,6 +234,7 @@ function Main() {
   {' '}
   </p>
   )}
+
   {index1 === 4 && (
   <p>
   "For within-participant classification, participants with at least

@@ -288,16 +292,7 @@ function Main() {
   significance in two-dimensional space, and can analyze much longer
   time series. We also propose a criterion for statistical model
   selection, based on both goodness of fit and width of confidence
-  intervals. Using ERC with 2D moving average to study naming under
-  conditions in which perceptual modality and ambiguity were
-  contrasted, we observed new patterns of task-related neural
-  propagation that were nevertheless consistent with expectations
-  derived from previous studies of naming. ERC with 2D moving average
-  is uniquely suitable to both research and clinical applications and
-  can be used to estimate the statistical significance of neural
-  propagation for both cognitive neuroscientific studies and
-  functional brain mapping prior to resective surgery for epilepsy and
-  brain tumors.
+  intervals.
   </p>
   <Button
   href="https://www.sciencedirect.com/science/article/pii/S0893608022000351"

@@ -319,6 +314,7 @@ function Main() {
   <Image fluid src={ERC_Naming2} />
   </Carousel.Item>
   </Carousel>
+
   {index2 === 0 ? (
   <p>
   "Results of event-related causality (ERC) estimated with 2D moving

@@ -354,75 +350,6 @@ function Main() {
   )}
   </Col>
   </Row>
-
-  <hr className="featurette-divider" />
-  <Row>
-  <Col>
-  <h2 className="featurette-heading">
-  Semi-Autonomous iEEG Brain-Machine Interfaces
-  </h2>
-  <p>
-  We developed a novel system, the Hybrid Augmented Reality Multimodal
-  Operation Neural Integration Environment (HARMONIE). This system
-  utilizes hybrid input, supervisory control, and intelligent robotics
-  to allow users to identify an object (via eye tracking and computer
-  vision) and initiate (via brain-control) a semi-autonomous
-  reach-grasp-and-drop of the object by the JHU/APL Modular Prosthetic
-  Limb MPL. The novel approach demonstrated in this proof-of-principle
-  study, using hybrid input, supervisory control, and intelligent
-  robotics, addresses limitations of current BMIs.
-  {' '}
-  </p>
-  <Button
-  href="http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6683036&tag=1"
-  target="_blank"
-  >
-  Full Text
-  </Button>
-  <Button
-  href="https://ieeexplore.ieee.org/document/6683036/media#media"
-  target="_blank"
-  >
-  Videos
-  </Button>
-  </Col>
-  <Col>
-  <Image fluid src={Hybrid_BCI} />
-  </Col>
-  </Row>
-  <hr className="featurette-divider" />
-
-  <Row>
-  <Col>
-  <h2 className="featurette-heading">Redefining Broca's Area</h2>
-  <p>
-  During the cued production of words, a temporal cascade of neural
-  activity proceeds from sensory representations of words in the
-  temporal cortex to their corresponding articulatory gestures in the
-  motor cortex. Broca's area mediates this cascade through reciprocal
-  interactions with temporal and frontal motor regions. Contrary to
-  classNameic notions of the role of Broca's area in speech, while the
-  motor cortex is activated during spoken responses, Broca's area is
-  surprisingly silent. Moreover, when novel strings of articulatory
-  gestures must be produced in response to nonword stimuli, neural
-  activity is enhanced in Broca's area, but not in the motor cortex.
-  These unique data provide evidence that Broca's area coordinates the
-  transformation of information across large-scale cortical networks
-  involved in spoken word production. In this role, Broca's area
-  formulates an appropriate articulatory code to be implemented by the
-  motor cortex.
-  </p>
-  <Button
-  href="http://www.pnas.org/content/112/9/2871.short"
-  target="_blank"
-  >
-  Full Text
-  </Button>
-  </Col>
-  <Col>
-  <Image fluid src={Brocas} />
-  </Col>
-  </Row>
   </Container>
   );
 }
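Several hunks in Main.jsx gate a caption paragraph on the active carousel slide with the `{index1 === n && (<p>…</p>)}` pattern. A minimal framework-free sketch of that selection logic (the caption strings below are hypothetical stand-ins, not the component's real text):

```javascript
// Hypothetical captions keyed by carousel slide index, mirroring the
// `{index1 === n && (<p>…</p>)}` pattern used in Main.jsx.
const captions = {
  0: 'Caption for the first slide.',
  1: 'Caption for the second slide.',
};

// Return the caption for the active slide, or null when none is defined;
// Main.jsx simply renders nothing for such indices.
function captionFor(index) {
  return Object.prototype.hasOwnProperty.call(captions, index)
    ? captions[index]
    : null;
}

console.log(captionFor(0)); // prints the first caption
console.log(captionFor(3)); // prints null: no caption registered
```

In the real component the index comes from the `Carousel`'s selection state, so only the caption matching the visible slide is rendered.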

0 commit comments
