
Press Release

April 2013, Volume 46


Dear Customers, Dear Friends of Brain Products,

In this issue of our Press Release we are introducing a new member of our family of peripheral sensors for recordings in MRI scanners: the Respiration Belt MR. The scientific community's interest in respiration sensors that can be used in the MRI has grown significantly in recent years. Respiration is the source of several artifacts in both EEG and fMRI data, which can be corrected or disentangled from the signals of interest if a respiration belt is used. This sensor joins the already large family of amplifiers (BrainAmp MR series), caps (BrainCap MR), sensors (GSR MR, Acceleration 3D MR) and software solutions which have been explicitly designed for recordings in MRI scanners, and which make Brain Products the leading company for the co-registration of ExG and other peripheral physiological signals with fMRI.

In this context we also present a video on “How to obtain high quality EEG data during simultaneous fMRI” in this issue of our Press Release. The video is the result of a cooperation between Brain Products, the University of Nottingham and the Journal of Visualized Experiments (JoVE).

Another product we are presenting here is a new version of the BrainVision Recorder software. The latter is an evergreen product for us, whose first version was compiled more than 15 years ago. The new version is Windows 8 compatible and includes some very interesting tools, allowing, for example, non-standard cap montages (equidistant or customized) to be imported in a very user-friendly manner, or different channel groups to be displayed with different time and amplitude scaling.

I hope you enjoy reading our Press Release.

Pierluigi Castellone
Brain Products’ CEO


www.brainproducts.com

Brain Products Press Release April 2013, Volume 46

IN THE FOCUS

The Respiration Belt MR: a new device for parallel respiratory measurements by Nicola Soldati

The investigation of physiological signals continues to receive a great amount of attention. The obvious reason is that such signals are deeply rooted in the nature of the subject of investigation (i.e. living beings) and they strongly interact with the organism at various levels, from pure physiology to higher-level cognitive functions.

Depending on the type of research, these signals can be useful when addressing specific questions, or they may simply add noise to the signals of interest. The latter is often the case in neuroscience, where strong physiological phenomena such as the cardiac and respiratory cycles affect the measurements acquired by different techniques (EEG, fMRI).

Respiration plays a critical role in the MR environment, where it may be not only a confounding factor but also a source of related artifacts. It can be linked to movement artifacts (due to the mechanical action of breathing; the typical respiratory rate of a healthy adult is 12-20 breaths per minute), physiological alterations (changes in BOLD signal properties), induced field inhomogeneity (the changing air volume in the lungs can affect the magnetic field locally), or interference with the experimental paradigm.

Studies using fMRI show that respiratory effects cannot be ignored, given that respiration induces great changes in terms of artifacts, and different respiratory patterns cause different oxygenation and thus change the fMRI-measured BOLD signal (Thomason et al. 2005).

For this reason, advanced signal processing techniques have been developed with the goal of eliminating these confounding factors. One proposal was the use of Independent Component Analysis (ICA) to correct and remove structured noise (Thomas et al. 2002). However, recent work has shown that ICA alone cannot completely remove physiological noise from fMRI data (Beall et al. 2010) and, moreover, that higher-order fluctuations in respiratory patterns induce detectable signal changes which can act as a confounding factor in resting-state research (Birn et al. 2008).

Even if advances in data analysis techniques can provide better results at the cost of greater complexity, these results are considerably improved by parallel, dedicated measurements of the sources of the artifacts. An efficient method which exploits parallel measurements for artifact correction uses the acquired respiratory signal to create a principal regressor, along with other derived regressors obtained from a higher-order analysis of the signal itself. This approach is known as RETROICOR (Glover et al. 2000). Clearly, higher quality and sensitivity of the acquired respiratory data will lead to improved quality of all the regressors, and finally to a higher quality of the artifact correction and of the final denoised data, independent of the strategy adopted to correct for respiratory artifacts. With the aim of obtaining the best data quality and the optimal method of artifact correction, we have developed the Respiration Belt MR, a novel device for the acquisition of respiratory signals within MR environments (Fig. 1).
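To make the regressor idea concrete, here is a minimal, illustrative Python sketch of RETROICOR-style respiratory regressors. The function name and the simplified phase estimate are our own; a full implementation, as described in Glover et al. (2000), additionally accounts for per-slice acquisition timing.

```python
import numpy as np

def respiratory_regressors(belt, n_orders=2):
    """Build RETROICOR-style respiratory regressors (simplified sketch).

    `belt` is a 1-D respiration-belt trace. The phase estimate uses
    rank-based histogram equalisation of the belt amplitude, signed by
    the derivative (inhalation vs. exhalation); the regressors are the
    Fourier expansion cos(m*phase), sin(m*phase) up to `n_orders`.
    """
    belt = np.asarray(belt, dtype=float)
    # Rank-based histogram equalisation maps amplitudes into (0, 1)
    ranks = np.argsort(np.argsort(belt))
    equalised = (ranks + 0.5) / belt.size
    # Sign of the derivative distinguishes inhalation from exhalation
    slope = np.sign(np.gradient(belt))
    slope[slope == 0] = 1.0
    phase = np.pi * equalised * slope          # phase in (-pi, pi]
    cols = []
    for m in range(1, n_orders + 1):
        cols.append(np.cos(m * phase))
        cols.append(np.sin(m * phase))
    return np.column_stack(cols)               # shape: (n_samples, 2*n_orders)
```

Each column of the returned matrix can then serve as a nuisance regressor in a GLM of the fMRI time series.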

Working in an MR environment imposes several constraints, ranging from the safety and comfort of the subject to the quality of the acquired data. Our solution offers advantages on all these counts. We decided to realize a respiratory belt because this is a non-intrusive sensor which is comfortable for test subjects, who may already be negatively affected by the fMRI procedure (Cook et al. 2007).

The compatibility and safety of the Respiration Belt MR result from its technical characteristics. One of its main features is that it is based on pneumatic technology, unlike most solutions on the market. This avoids safety issues related to the introduction of electrical devices into strong magnetic fields. In addition, being pneumatic-based,


Figure 1. Respiration Belt MR: transducer, sensor, elastic belt and pouch, and auxiliary connector cable.
Figure 2. (Respiration traces measured with the Respiration Belt MR; see text.)


the Respiration Belt MR is not a source of artifacts for the MR imaging, thus preserving the highest data quality and ensuring that no noise is induced on the MR-recorded signal. Extensive tests have been carried out with scanners from various manufacturers, with very satisfactory results.

Moreover, we developed our Respiration Belt MR with the aim of having a device with great sensitivity which is able to follow different types of respiratory acts adequately and robustly. Figure 2 shows a slow, deep respiration (black line) and a faster, shallower respiration (red line) as measured by the Respiration Belt MR. The device is able to follow the dynamics of the respiratory act over quite a wide range, demonstrating the good sensitivity of the system. This makes it a powerful and sophisticated tool for obtaining high quality respiratory signals, and thus regressors for artifact correction, and also for investigating the interrelation between physiology and brain organization more accurately. The high sensitivity of the belt to respiratory dynamics makes it easier and more effective to compute higher-order regressors describing fluctuations of respiration over time.
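As a toy illustration of what can be derived from a belt trace, the sketch below estimates the breathing rate by counting inhalation peaks. The function name and the method are our own simplification, not part of any Brain Products software; a real pipeline would use a dedicated peak detector.

```python
import numpy as np

def breaths_per_minute(belt, fs):
    """Estimate respiratory rate from a belt trace (illustrative sketch).

    Counts inhalation peaks: samples that are strict local maxima and
    lie above the median of the trace. `fs` is the sampling rate in Hz.
    """
    belt = np.asarray(belt, dtype=float)
    above = belt > np.median(belt)
    interior = belt[1:-1]
    # Strict local maxima: larger than both neighbours
    peaks = (interior > belt[:-2]) & (interior > belt[2:]) & above[1:-1]
    n_breaths = int(np.count_nonzero(peaks))
    duration_min = belt.size / fs / 60.0
    return n_breaths / duration_min
```

For a healthy adult at rest, such an estimate should fall in the 12-20 breaths-per-minute range quoted above.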

We are convinced that the new Respiration Belt MR represents a very useful instrument for advanced research over a wide range of applications, and we will be pleased to welcome any further enquiries.


Products in Practice

Best Current Practice for Obtaining High Quality EEG Data During Simultaneous fMRI by Stefanie Rudrich

Brain Products has been a market leader in amplifiers, electrode caps and software for EEG/fMRI co-registration for more than a decade now. More than 300 PubMed-listed publications have used our EEG/fMRI equipment over this period (see www.brainproducts.com/references.php), emphasizing that we not only provide the equipment but also have the experience to offer competent support in this field.

Combining EEG/fMRI safely and conveniently for the subject and the researcher, while ensuring good signal quality, requires taking many aspects into account. To give you an insight into this, we collaborated with our long-time customers Karen Mullinger and Richard Bowtell (University of Nottingham, Sir Peter Mansfield Magnetic Resonance Centre) as well as with JoVE (Journal of Visualized Experiments). The result of this project is a video and detailed protocol on the “Best Current Practice for Obtaining High Quality EEG Data During Simultaneous fMRI”.

As you know, EEG data acquired during simultaneous fMRI are affected by several artefacts. Post-processing methods for successfully correcting them require a number of criteria to be met during data acquisition. Based on their research, Karen Mullinger and Richard Bowtell describe in the video an experimental set-up which provides high quality EEG data during simultaneous fMRI while minimising safety risks to the subject.

The video will be published shortly on JoVE’s website: www.jove.com

References

Thomason, M.E., Burrows, B.E., Gabrieli, J.D.E., Glover, G.H. (2005). Breath holding reveals differences in fMRI BOLD signal in children and adults. NeuroImage, 25(3), 824-837. doi: 10.1016/j.neuroimage.2004.12.026.

Thomas, C.G., Harshman, R.A., Menon, R.S. (2002). Noise reduction in BOLD-based fMRI using component analysis. NeuroImage, 17(3), 1521-1537. doi: 10.1006/nimg.2002.1200.

Beall, E.B., Lowe, M.J. (2010). The non-separability of physiologic noise in functional connectivity MRI with spatial ICA at 3T. Journal of Neuroscience Methods, 191(2), 263-276. doi: 10.1016/j.jneumeth.2010.06.024.

Birn, R.M., Murphy, K., Bandettini, P.A. (2008). The effect of respiration variations on independent component analysis results of resting state functional connectivity. Human Brain Mapping, 29, 740-750. doi: 10.1002/hbm.20577.

Glover, G.H., Li, T.-Q., Ress, D. (2000). Image-based method for retrospective correction of physiological motion effects in fMRI: RETROICOR. Magnetic Resonance in Medicine, 44, 162-167. doi: 10.1002/1522-2594(200007)44:1<162::AID-MRM23>3.0.CO;2-E.

Cook, R., Peel, E., Shaw, R.L., Senior, C. (2007). The neuroimaging research process from the participants’ perspective. International Journal of Psychophysiology, 63, 152-158.


We are about to release a new sub-version of the popular BrainVision Recorder software, numbered 1.20.0506. This edition introduces some remarkable new features.

Windows 8 compatibility

Operating systems come and go, but Recorder stays. From now on, it supports Windows 8, the latest operating system from Microsoft. Both the 32-bit and 64-bit versions are supported when a BrainAmp, V-Amp or actiCHamp is connected; the combination of the QuickAmp and 32-bit Windows 8 is also supported.

This update does not include support for Windows RT, the special tablet version. If BrainVision Recorder 1.20.0506 has to run on a tablet-like computer, a device with the full Windows 8 operating system should be used.

‘Tab view’ or ‘Scientific view’

The ‘tab view’ is a new data visualization tool available during data monitoring. It shows single or multiple channels in a new tab, which explains why it is called ‘tab view’. The name ‘scientific view’ refers to the fact that all channels are displayed in their own coordinate system with horizontal and vertical guidelines (see also Fig. 1).

Polarity, scaling and the displayed time interval are adjustable, and the default values of these parameters can be edited. This type of visualization is therefore particularly suitable when channels should be visualized with different time scales. For instance, GSR signals typically react more slowly than EEG signals, so it is reasonable to display them with a different time scale.

Not just one but several tab views can be opened at the same time to display different subsets of the data, e.g. to show only the EMG channels or only the frontal ones. Furthermore, the tab configurations can be saved in the workspace, so that the same type of visualization will be available for future recordings.

Electrode positions

Previously, Recorder was optimized for standard caps, which follow the international 5% standard. However, this meant that the impedance topography map did not work smoothly for non-standard caps, such as equidistant or customized caps. The new version improves this situation by handling BrainVision Electrode Files (BVEF).

As in Analyzer 2, the electrode file stores channel name and position pairs in XML format. Now Recorder is also able to open such files while the workspace is created or edited.

This feature offers three advantages. First of all, it can replace the channel name table; the names no longer have to be entered manually. Secondly, the associated locations are applied to the map in impedance check mode: the electrodes are displayed topographically in their real positions (Fig. 2). Thirdly, and probably most importantly, the imported positions are written to the dataset’s header file, so an offline evaluation tool such as Analyzer 2 can automatically recognize and import the correct locations together with the data.
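To illustrate why XML-based electrode files are convenient for downstream tools, here is a small Python sketch. Note that the XML layout below is invented purely for illustration; it is not the actual BVEF schema, which is documented by Brain Products.

```python
import xml.etree.ElementTree as ET

# NOTE: this XML layout is a made-up example, NOT the real BVEF schema.
# The point is only that name/position pairs stored as XML are trivial
# to read programmatically.
EXAMPLE = """\
<Electrodes>
  <Electrode Name="Fp1" Theta="-90" Phi="-72" Radius="1"/>
  <Electrode Name="Cz"  Theta="0"   Phi="0"   Radius="1"/>
</Electrodes>
"""

def read_positions(xml_text):
    """Return a dict mapping channel name -> (theta, phi, radius)."""
    root = ET.fromstring(xml_text)
    return {e.get("Name"): (float(e.get("Theta")),
                            float(e.get("Phi")),
                            float(e.get("Radius")))
            for e in root.iter("Electrode")}
```

Any analysis tool that can parse XML can recover the montage this way, which is exactly what makes writing the positions into the header file so useful.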

Keyboard shortcuts

This version introduces keyboard shortcuts for changing the scaling and the time interval. Dedicated combinations are available for the standard, average and new scientific views. One does not have to remember all the combinations: tool tips around the icons help to recall them. Although this is a small improvement, I am sure many users will appreciate it.

Further changes

The V-Amp workspace now has a new checkbox which makes it possible to invert the polarity of the auxiliary data. Under the hood some further work was done: the new Recorder is able to handle the SafeNet-SRM dongle technology, and some small bugs have been fixed.

The new BrainVision Recorder 1.20.0506 will be available for download at www.brainproducts.com/downloads.php?kid=2 from early April.

Figure 2. Impedance measurement map with individual electrode positions


Product Development

BrainVision Recorder 1.20.0506 by Dr. Roland Csuhaj

Figure 1. New ‘scientific view’ during data acquisition. Note the different time scale of the tabs.


Support Tip

How to add methods to BrainVision Analyzer quickly and easily: an example of the interactive Matlab interface by Dr. Roland Csuhaj

New signal processing methods are published virtually every day; BrainVision Analyzer 2 cannot include all of them. However, the Analyzer is not just a powerful tool for offline EEG data evaluation; it is also a flexible framework which can integrate signal processing functions from many different sources. Besides the macros and add-ins, the Analyzer offers an exciting possibility: export your data to Matlab, do some calculations there, and import the results back into Analyzer in order to continue your work in its user-friendly environment. Today I am going to show how quickly a new method can be added to Analyzer using the Matlab transformation.

Let’s look at the fractional peak latency method, a technique for detecting the onset latency of components. The fractional peak latency marks the time point at which a certain percentage of the peak amplitude (e.g. 50%) was reached, searching backwards from the peak.

Although such a method is not yet implemented in Analyzer, we can take advantage of the existing transformations: ‘Peak detection’ can perform the first step and identify the peaks of the average nodes. The Matlab transformation can then quickly send the data into Matlab in order to do the rest of the calculation there. (In this article I am not going to introduce all the options of the transformation, only those which are important for our goal.) Open an average node where the peaks are already detected and start the Matlab transformation. In the first window, mark the ‘Calculate Data on Creation of Node’ radio button; in the second one, activate ‘Export Markers’. The transformation executes the Matlab commands typed in the first window. You do not have to enter hundreds of lines here; it is enough to call the .m file(s). The ‘Show Matlab Window’ option is pretty useful while the code is still being polished. For those who are more familiar with Analyzer, this is similar to the semi-automatic mode: the data can be checked or even modified manually in Matlab before it is imported back into Analyzer. Since we have just started to deal with this transformation, simply mark the three boxes mentioned, but not the others, and enter the following text in the Code Executed … field:

This will simply show you the familiar Matlab desktop instead of the command line. Once the data is sent to Matlab, a dialog with the text ‘Press OK to continue in Analyzer’ will appear. Try to resist, and do NOT click OK until you are sure you are ready to close Matlab and start the re-import phase. Once the Matlab main window has started, you can have a look at the data. Many different properties of the node were exported; for now only the ‘EEGData’ and ‘Markers’ variables are important.

What should be done to mark the fractional peak? We need a loop which checks all markers. If a ‘Peak’ marker is found (the type is stored in the Markers.Type property), the value of the corresponding data point is read from the EEGData variable. The fractional value can easily be calculated and used as a threshold. A second loop is then started that searches backwards for the data point at which the threshold is reached. This is the point where the new marker has to be placed. During the re-import, Analyzer looks for a variable named ‘NewMarkers’; its contents will be added to the existing marker list. It must, of course, have the same structure as the ‘Markers’ variable. So in the next run we can enter the following code into the Matlab transformation dialog:

The percentage is defined by the ‘Threshold’ variable. The rest is quite straightforward, except for one point: the ‘Markers’ variable refers to channel and data positions according to the C# convention, where all indexing starts at 0. But the EEGData matrix cannot follow this convention, because in Matlab all indexing starts at 1. This difference has to be compensated for by the code when it looks up the corresponding data value (by adding 1) and also when it creates the new marker (by subtracting 1).

Once you are sure the code is running fine, the desktop command in the last line is no longer needed and the ‘Show Matlab Window’ checkbox can be deactivated. The transformation will then run automatically. As you can see, only 21 lines were needed to add a new method to Analyzer. It is a fast and effective way to implement new functions.

The program code is also available at www.brainproducts.com/downloads.php?kid=21 as an .m file. That version contains further explanation in additional comment lines.

desktop;

Threshold = 50;
NbPeak = 1;
for i = 1:size(Markers,2)
    if strcmp(Markers(1,i).Type,'Peak')
        Pos = Markers(1,i).Position+1;
        PCh = Markers(1,i).Channel+1;
        PValue = abs(EEGData(Pos, PCh));
        Pos = Pos - 1;
        while Pos > 0
            CurrentValue = abs(EEGData(Pos, PCh));
            if CurrentValue <= PValue*Threshold*0.01
                NewMarkers(1,NbPeak) = Markers(1,i);
                NewMarkers(1,NbPeak).Description = ...
                    strcat(Markers(1,i).Description,'_', num2str(Threshold));
                NewMarkers(1,NbPeak).Position = Pos - 1;
                NbPeak = NbPeak + 1;
                break
            end
            Pos = Pos - 1;
        end
    end
end
desktop;
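For readers who like to prototype outside Matlab, the same backward-search logic can be sketched in Python. This is an illustrative translation only: the function name is our own, and it operates on a single channel array rather than on Analyzer’s exported ‘EEGData’ and ‘Markers’ variables.

```python
import numpy as np

def fractional_peak_latency(signal, peak_index, threshold_pct=50):
    """Backward search for the fractional peak latency (illustrative sketch).

    Starting from `peak_index`, walk backwards until the absolute
    amplitude first drops to or below `threshold_pct` percent of the
    absolute peak amplitude; return that sample index, or None if the
    threshold is never reached before the start of the signal.
    """
    signal = np.asarray(signal, dtype=float)
    peak_value = abs(signal[peak_index])
    threshold = peak_value * threshold_pct / 100.0
    for i in range(peak_index - 1, -1, -1):
        if abs(signal[i]) <= threshold:
            return i
    return None
```

The index arithmetic here is uniformly 0-based, so the +1/-1 conversions needed between the C# and Matlab conventions in the listing above do not arise.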



User Research

Perceiving while acting: how visual selection is tuned to action intentions, an EEG study by Agnieszka Wykowska1

1 Allgemeine und Experimentelle Psychologie, Ludwig-Maximilians-Universität, Munich, Germany

Theoretical background

Imagine you’re playing baseball and you’re just about to strike an approaching ball with your bat. How does your brain plan that action, and what parameters need to be specified to perform it efficiently? Apart from the obvious control of the motor commands, the brain also needs to adjust perceptual processing to fit the goals of the planned action [1,2]. Through lifelong experience, humans learn that different perceptual parameters are important and relevant for different actions [2]. This implies that perceptual selection can be tuned to action planning [2-4]. That is, in the baseball example, depending on whether you plan to hit the ball or catch it, different perceptual aspects of the ball will be relevant and prioritized accordingly. When catching the ball, grip aperture is important, and hence the size and shape of the ball need to be processed with priority. When hitting the ball, its location is the most important feature. In neither case is the color of the ball relevant. We postulate that the so-called intentional weighting mechanism [4,5] operates at the level of perceptual processing in order to tune perception to action plans. The idea is that planning a particular action should affect visual perception in such a way that perceptual dimensions which are potentially relevant for the intended action receive a higher weight than those dimensions that are not action-relevant. This should allow efficient delivery of perceptual parameters for online action control [2-5].

Experimental design

All figures and parts of the Methods section are as originally published in Wykowska, A. & Schubö, A. (2012). Action intentions modulate allocation of visual attention: electrophysiological evidence. Frontiers in Psychology, 3: 379. doi: 10.3389/fpsyg.2012.00379.

We conducted a study, published in Frontiers in Psychology [6], in which we examined behavioral manifestations and EEG correlates of the intentional weighting mechanism. In our paradigm (see Figure 1 for details) participants first observed a cue signaling what type of movement (pointing or grasping) they should prepare (but not execute immediately). Then they performed a visual search task (while preparing the cued movement), i.e., they detected a target item presented among other distracter items. The target of the visual search task was defined either by size or by luminance (and differed from the other items only in the respective feature). Participants were asked to respond with one key on a computer mouse for target-present trials and with the other key for target-absent trials. After completion of the visual search task (and upon presentation of a “go-signal”), they executed (with the other hand) the movement that they had planned. The movement consisted of either grasping or pointing to one of the cups positioned below the computer screen; the particular cup was indicated by the “go-signal”.

This design made participants activate a movement code while they were performing a perceptual task, and created two action-perception congruent pairs: the size target was congruent with the grasping movement and the luminance target was congruent with the pointing movement. The congruency between movements and visual search targets was predicated on the fact that size is a relevant perceptual dimension for grasping (in grasping, the grip aperture needs to be determined) while luminance is relevant for pointing (luminance is an efficient cue for localization, and pointing also aims at localizing objects). Importantly, the visual search was entirely unrelated to the movement task, both motorically (the visual search task was performed with one hand and the movement task with the other) and perceptually (the visual search display was presented on the computer screen while the to-be-grasped/pointed-to objects were located below the screen). Hence any action-related effects on perceptual processing in the search task would indicate influences from action planning at the representational level in the brain [1-5]. We expected better detection of dimensions in the action-congruent conditions relative to the other conditions,


Figure 1. An example trial sequence. A trial started with a fixation display followed by a movement cue and another fixation display. Next, a search display was presented and participants performed the visual search task immediately. Upon response to the search task and a blank screen, the go-signal asterisk was presented, which indicated one of the three cups that should be grasped/pointed to. At this point, participants executed the prepared movement, which was registered by an experimenter with a mouse key press (the experimenter observed performance with a camera outside of the experimental chamber). Following the experimenter’s button press, the trial ended. Note that catch trials (30%) differed from the standard trials only in that another fixation display was presented in place of the search display. As participants did not need to perform a search task, a blank display was presented during the time they would have responded to the search display in trials of interest. The rest of the trial following the blank display was identical to the actual trials.



given that the function of the intentional weighting mechanism is to prioritize processing of perceptual dimensions that are potentially relevant to a planned action.

While participants were performing the tasks, we recorded EEG. In the data analysis, ERPs were locked to the onset of the visual search display and thus reflected processes involved in the visual search task (occurring before movement onset). In line with the idea that the intentional weighting mechanism operates at the level of processing perceptual dimensions, we expected action-related modulation of ERP components that mirror early perceptual and attentional processes in the visual search task.

Methods

Participants and Procedure

Eighteen participants (13 women) aged 21 to 30 years (mean age: 24.3) took part. They were seated in a dimly lit, sound-attenuated and electrically shielded cabin, 100 cm away from the computer screen. Before the experimental session, participants took part in a practice session on a separate day. The experimental session consisted of 504 trials for each of the target types (luminance or size). The target type was blocked (order counterbalanced across participants), whereas the movement type (grasp vs. point) and display type (target present vs. blank) were randomized within a block. Short breaks were introduced after every 63 trials so that participants could move their eyes, blink and relax.

EEG recording

EEG was recorded with Ag/AgCl electrodes from 37 electrode sites mounted on an elastic cap (EASYCAP GmbH, Germany). Horizontal and vertical EOG were recorded bipolarly from the outer canthi of the eyes and from above and below the observer’s left eye, respectively. All electrodes were referenced to Cz and re-referenced offline to the average of all electrodes. Electrode impedances were kept below 5 kΩ. The sampling rate was 500 Hz, with a high cutoff filter of 125 Hz. The EEG signal was amplified with two DC amplifiers (BrainAmp DC, Brain Products GmbH) and data were recorded using BrainVision Recorder 1.02 (Brain Products GmbH).

EEG analysis

EEG data was processed with the use of BrainVision Analyzer

2.0.1 (Brain Products GmbH). EEG was averaged over 600-ms

epochs including a 200-ms pre-stimulus baseline, locked to

search display onset. Trials with eye movements and blinks on

any recording channel (indicated by any absolute voltage

difference in a segment exceeding 80 μV or voltage steps

between two sampling points exceeding 50 μV) were excluded

from analyses. Additionally, channels with other artefacts were

separately excluded if amplitude exceeded +/- 80 μV or any

activity was lower than 0.10 μV for a 100 msec interval. Raw

data was filtered offline 40-Hz high-cutoff filter (Butterworth

zero phase, 24 dB/Oct). Only trials with correct movement and

search responses were analyzed. Responses in the search task

deviating more than ±3 SD from the mean RT (calculated separately

for each participant and target type) were categorized as

outliers and excluded. One participant was excluded from

analyses due to extensive eye blinks, two due to extensive

Figure 2. Mean reaction times (RTs) as a function of visual search target type (luminance vs. size) and prepared movement (grasp vs. point) in target-present trials. Error bars represent the standard errors of the mean. Action-perception congruency effects are observed as faster detection RTs for each of the dimensions when presented in the action-congruent condition (size-grasp and luminance-point) as compared to the incongruent condition (size-point and luminance-grasp). The interaction of movement type and target type was significant for target trials, F(1, 12) = 16, p < .005, ηp² = .58; the difference between grasping and pointing conditions was significant in the luminance task, t(13) = 2.1, p < .05 (one-tailed), and marginally significant in the size task, t(13) = 2.1, p = .06 (one-tailed).

Figure 3. The ERP waveforms for pooled channels O1, O2, PO7, PO8, locked to the onset of the visual search display, as a function of target dimension: luminance (3A) and size (3B), and prepared movement type: pointing (solid line) and grasping (dashed line). The grey outline boxes represent the time window for the statistical analysis of the mean amplitude of the P1 component (70-130 ms, determined as ±30 ms around the grand-average peak latency). The interaction of target type and movement type was significant, F(1, 13) = 6.2, p < .05, ηp² = .32. P1 was more enhanced for the congruent movement condition, relative to the incongruent condition, for luminance targets, t(13) = 2, p < .05, one-tailed (3A), but not for size targets, p > .25, one-tailed (3B). The scalp distribution of the mean amplitude of the ERPs within the 70-130 ms time window (P1) is shown on the right. Note that the scalp distribution indicates a larger positivity at right electrode sites, independent of condition. This might be related to the fact that attentional networks are located mostly in the right cerebral hemisphere [9-10], and is in line with previous findings on attentional orienting that showed validity effects in a cueing paradigm predominantly at right lateral electrodes [11].

page 7 of 13


www.brainproducts.com

Brain Products Press Release April 2013, Volume 46

alpha waves and one due to poor performance in the movement

task (14% of errors in the pointing condition; other participants

did not exceed 7%). The analyses focused on O1, O2, PO7, PO8

electrodes for an early perceptual ERP component (P1, typically

in the time window of 100-130 ms), as well as on the PO7/8

electrode pair for the attention-related N2pc. The N2pc is

measured at posterior sites within the time window of ca. 180-

300 msec and is more negative on contralateral electrode sites

compared to ipsilateral sites relative to an attended object

presented in the left or right visual hemifield [7-8]. In

order to extract search-locked ERPs from the cue-locked

ERPs, catch trials were introduced (30% of all trials,

randomly intermixed with standard trials), in which

no search display was presented (trials consisted of only

movement task). Catch trials were subtracted from “actual” trials

on epoched data, separately for each cue type, time locked to

search display onset.
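The trial-selection rules and the catch-trial subtraction above were carried out in BrainVision Analyzer; purely for illustration, the same logic can be sketched in NumPy (the thresholds are taken from the text, while the array shapes and helper names are our assumptions):

```python
import numpy as np

FS = 500  # sampling rate (Hz); one 600-ms epoch = 300 samples

def keep_epoch(epoch_uv):
    """Apply the rejection criteria from the text to one epoch
    (channels x samples, in µV): reject if the max-min difference
    exceeds 80 µV, if any step between adjacent samples exceeds
    50 µV, or if activity stays below 0.10 µV for a 100-ms interval."""
    if (epoch_uv.max(axis=1) - epoch_uv.min(axis=1) > 80).any():
        return False
    if (np.abs(np.diff(epoch_uv, axis=1)) > 50).any():
        return False
    win = int(0.1 * FS)  # 100 ms = 50 samples
    for start in range(epoch_uv.shape[1] - win + 1):
        seg = epoch_uv[:, start:start + win]
        if ((seg.max(axis=1) - seg.min(axis=1)) < 0.10).any():
            return False
    return True

def rt_outliers(rts):
    """Flag search RTs deviating more than 3 SD from the mean
    (computed per participant and target type in the study)."""
    rts = np.asarray(rts, dtype=float)
    return np.abs(rts - rts.mean()) > 3 * rts.std()

def subtract_catch(avg_standard, avg_catch):
    """Remove cue-locked activity from the search-locked average by
    subtracting the catch-trial average (movement task only),
    separately for each cue type."""
    return avg_standard - avg_catch
```

Because catch trials contain the movement cue but no search display, the subtraction leaves only the activity time-locked to the search display.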

Results

Behavior. We observed action-perception congruency effects:

size targets were detected faster when the grasping (congruent)

movement was prepared (relative to pointing), while luminance

targets were detected faster when the pointing (congruent)

movement was prepared (relative to grasping), see Figure 2.

ERPs. The ERPs showed a larger P1 component (ca. 100 ms post-onset of the visual search display) for the luminance target in the pointing (congruent) condition, as compared to the

grasping (incongruent) movement, see Figure 3. Although an

analogous effect on the P1 was not observed for size targets,

the size-grasping (congruent) condition elicited a larger N2pc as

compared to size-pointing (incongruent), see Figure 4 for grand

averages of the N2pc and Figure 5 for scalp distribution of the

N2pc effects.
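As a sketch of how the N2pc reported above is typically quantified (contralateral minus ipsilateral at the PO7/PO8 pair, mean amplitude within the 230-300 ms window), assuming averaged waveforms sampled at 500 Hz with a 200-ms pre-stimulus baseline; the helper names are ours, not the authors' pipeline:

```python
import numpy as np

FS = 500   # sampling rate (Hz)
T0 = 0.2   # epoch starts 200 ms before search display onset

def n2pc_difference(po7, po8, target_side):
    """Contralateral-minus-ipsilateral waveform for one condition.
    po7/po8: averaged waveforms (samples,). PO7 is a left-hemisphere
    site and PO8 a right-hemisphere site, so for a left-hemifield
    target PO8 is the contralateral electrode."""
    if target_side == "left":
        contra, ipsi = po8, po7
    else:
        contra, ipsi = po7, po8
    return contra - ipsi

def mean_amplitude(wave, t_start=0.230, t_end=0.300):
    """Mean amplitude within the analysis window (seconds relative
    to display onset), converted to sample indices."""
    i0 = round((t_start + T0) * FS)
    i1 = round((t_end + T0) * FS)
    return wave[i0:i1].mean()
```

A more negative contra-minus-ipsi mean amplitude in this window indicates a stronger N2pc, i.e. stronger attentional selection of the target.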

Conclusion

The aim of this study was to investigate behavioral

manifestations and electrophysiological correlates of the

action-related intentional weighting mechanism imposed on

perceptual selection processes. We observed that perceptual

dimensions were processed with priority in the action-congruent

conditions, as compared to action-incongruent conditions,

as indicated by behavioral data and modulatory effects on P1

(luminance targets) and N2pc (size targets). This study therefore provides striking evidence that the intentional weighting mechanism operates at early stages of perceptual processing and attentional selection, biasing the processing of stimuli

with respect to action plans. Our findings support the idea that

perception and action are tightly coupled and that perceptual

selection is tuned to intended actions. In other words, what we see is tuned to how we intend to act!

Figure 5. Topographical maps of the ERP voltage distribution for the N2pc time window (230-300 ms) for size targets (upper panel) and luminance targets (lower panel) in the grasping condition (left) and pointing condition (right), presented from a posterior view (larger images) and a top view with all channels (smaller images, front plotted upwards). The voltage distribution maps represent waveforms in the respective conditions for targets presented in the left and right visual hemifields. The maps clearly show target-related laterality effects (that is, enhanced activity contralateral to the target: the N2pc) for size targets in the grasping condition (upper left), while laterality was present but less pronounced in the pointing condition (upper right). In the luminance condition (lower panel), negativity was less pronounced in the grasping condition compared to pointing. In grasping trials, there was no difference in negativity between contra- and ipsilateral sites (lower left), yet a slight (nonsignificant) difference is observed in the pointing condition for targets presented in the right hemifield (lower right).

Figure 4. The N2pc for the PO7/8 electrode pair, locked to the onset of the visual search display, as a function of target dimension: size (4A) and luminance (4B), movement type (grasping: left; pointing: right) and electrode site (contralateral to the target: solid lines; ipsilateral to the target: dashed lines). The solid grey outline rectangle represents the time window for the analysis of the N2pc mean amplitudes (230-300 ms, ±35 ms around the grand-average peak latency of the difference wave between contra- and ipsilateral channels). For size targets, the interaction between laterality (N2pc) and movement type was significant, F(1, 13) = 5.2, p < .05, ηp² = .28. No such differential effect was observed for the luminance condition, all p > .15. The dashed grey outline rectangle indicates the earlier analysis time window (160-230 ms), in which a laterality effect was found for both dimensions, F(1, 13) = 11, p = .01, ηp² = .45, but no modulation by movement type was observed. All other interactions and effects were non-significant, all p > .6.

page 8 of 13


References

[1] Hommel, B., Müsseler, J., Aschersleben, G., & Prinz, W. (2001). The Theory of Event Coding (TEC): A framework for perception and action planning. Behavioral and Brain Sciences, 24, 849-937.

[2] Hommel, B. (2010). Grounding attention in action control: The intentional control of selection. In B. Bruya (Ed.), Effortless attention: A new perspective in the cognitive science of attention and action (pp. 121-140). Cambridge, MA: MIT Press.

[3] Memelink, J., & Hommel, B. (in press). Intentional weighting: A basic principle in cognitive control. Psychological Research.

[4] Wykowska, A., Schubö, A., & Hommel, B. (2009). How you move is what you see: Action planning biases selection in visual search. Journal of Experimental Psychology: Human Perception and Performance, 35, 1755-1769.

[5] Wykowska, A., Hommel, B., & Schubö, A. (2012). Imaging when acting: Picture but not word cues induce action-related biases of visual attention. Frontiers in Psychology, 3:388. doi: 10.3389/fpsyg.2012.00388

[6] Wykowska, A., & Schubö, A. (2012). Action intentions modulate allocation of visual attention: Electrophysiological evidence. Frontiers in Psychology, 3:379. doi: 10.3389/fpsyg.2012.00379

[7] Eimer, M. (1995). The N2pc component as an indicator of attentional selectivity. Electroencephalography and Clinical Neurophysiology, 99, 225-234.

[8] Luck, S. J., & Hillyard, S. A. (1994). Spatial filtering during visual search: Evidence from human electrophysiology. Journal of Experimental Psychology: Human Perception and Performance, 20, 1000-1014.

[9] Heilman, K. M., & Van Den Abell, T. (1980). Right hemisphere dominance for attention: The mechanism underlying hemispheric asymmetries of inattention (neglect). Neurology, 30, 327-330.

[10] Thiebaut de Schotten, M., Dell'Acqua, F., Forkel, S. J., Simmons, A., Vergani, F., Murphy, D. G., et al. (2011). A lateralized brain network for visuospatial attention. Nature Neuroscience, 14, 1245-1246.

[11] Mangun, G. R., & Hillyard, S. A. (1991). Modulations of sensory-evoked brain potentials indicate changes in perceptual processing during visual-spatial priming. Journal of Experimental Psychology: Human Perception and Performance, 17, 1057-1074.

page 9 of 13

New Products

BrainVision Recorder Remote by Stefanie Rudrich

In time for Christmas 2012, we released our Remote Control tool

for BrainVision Recorder. Many of you have already downloaded

BrainVision Recorder Remote which allows you to control

selected features of BrainVision Recorder via a smartphone

(iPhone, Android based phones). If you haven’t tried it yet, go to

www.brainproducts.com/downloads.php?kid=36 and get your

Recorder Remote for free now!

BrainVision Recorder Remote is a web browser-based application

and requires only a TCP/IP connection between your smartphone

and the monitoring PC. Starting Recorder Remote on the PC

automatically calls up BrainVision Recorder. Then you can either

scan the displayed QR code with your smartphone or enter the

IP address directly into your Web browser. And there you go …

your smartphone is a remote control.

Once the smartphone is connected to the web server, you can control the following features of BrainVision Recorder remotely:

• Start monitoring or test signal

• Start impedance measurement > if the amplifier has several impedance groups - data, references, and ground - you can also switch groups with this button

• Stop all operations

• Adjust the scaling of the signal or the range of the impedance measurements > using the (+) and (-) buttons

• Confirm or terminate messages of BrainVision Recorder > using the OK button (green tick) and Cancel button (red cross)

Why BrainVision Recorder Remote?

You might already have experienced a situation like this: the computer on which BrainVision Recorder is running is located outside the (shielded) EEG cabin where you conduct your experiments. In this case you often need to run between the computer and the subject to start the monitoring, to check and re-check impedances, or to re-prepare the electrodes. That is all history now. Recorder Remote enables you to accompany the subject into the experimental cabin and start the monitoring directly from there. You can also check impedances, re-prepare the electrodes that show unsatisfactory impedances, and re-start the monitoring from where you are, using the monitor in the EEG cabin to display the Recorder software. Give it a try - it's easy and it's free! Download Recorder Remote now!


page 10 of 13

Brain Products Inside

Who is who..? - Ratko Petrovic

I’d like to introduce myself as a new software developer for

Brain Products GmbH. After obtaining a degree in electrical

and computer engineering at the University of Technical Science

in Novi Sad (Serbia), I continued to work at the Department of

Biomedical Engineering as a teaching and research assistant.

My main research area was functional electrical stimulation

and neural prosthetics, as well as biosignal processing and

analysis of ECGs and EMGs.

After receiving my Master’s degree in biomedical engineering

for a topic related to recording and analyzing ECG signals,

I accepted an offer from the German company Biosigna GmbH, a leading developer of ECG diagnostic algorithms, to work on developing their products. For almost five years I worked for the company on improving existing ECG algorithms, on the development of long-term, biomarker-based ECG diagnostic algorithms, and on leading client-specific projects. In 2010

I decided to expand my engineering

experience, and moved on to Noser

Engineering AG, where I had the opportunity to work as a software engineer in various fields, such as automotive, chemical and mobility engineering.

After two years I realized that medical engineering is my greatest interest, which brought me to Brain Products.

From the first visit to the company I was thrilled with

the products and projects that were designed and developed

within the company, and the support that Brain Products

provides in the field of BCI and neural research. Now I am pleased to become a member of the innovative team at Brain Products and to contribute my knowledge and experience to the development of new, innovative solutions.

Ratko Petrovic

Brain Products Inside

Changes in company management by Alexander Svojanovsky

Effective January 3rd 2013, Dr. Achim Hornecker is no longer

Managing Director of Brain Products and he has left the company.

In addition to managing the Freiburg branch and the execution

of smaller software projects, his main remit was controlling the

development of the BrainVision Analyzer software.

Achim had already supported the former head of development

and co-founder Henning Nordholz when the company was

started up in 1997. He then succeeded Henning and

acquired company shares. Under his management, the Freiburg

branch expanded and Analyzer 2 was developed. Achim

initiated further software projects, some of which will be completed in the coming months and years. He also established a technical writing department which produced and updated the user manuals for all company products.

Achim will continue to be a partner, external adviser and service

provider - and he will remain a welcome guest at any time!

We thank Achim for his time at Brain Products, where he has not

only left his mark on various software projects but also leaves a

legacy in the form of the team that he built up in Freiburg. He will

have a lasting place in the history of the company and we respect

him as an outstanding personality.

The software projects will now be continued under the leadership

of our Technical Manager Dr. Manfred Jaschke, while the new

head of the Analyzer project is Tomasz Kucinski. Markus

Kölble will be in charge of a further project in Freiburg.

Let me use this occasion also to welcome Jens Grunert and

Patrick Britz as new shareholders.

Brain Products Inside

Who is who..? - Manuel Hohmuth

I am Manuel Hohmuth, and since September 2012 I have been

Assistant Production Manager at Brain Products.

Previously I had worked as an electrician in various production

companies such as SGB Sächsische-Bayerische Starkstrom-

Gerätebau GmbH and Siemens AG. I am now working in

Production and Repairs, producing electrodes, testing actiCAP

ControlBoxes, actiCHamps, MOVE2actiCAP adapters, and

much more besides. In addition I carry out repairs on devices

sent in by our customers.

This work can range from simple jobs such

as changing electrodes through to carrying

out complete upgrades. And since no

production is possible without materials,

I also place all the necessary orders.

I applied to Brain Products because I

wanted a career change. That is what

I have achieved with the move to Brain Products and I am really

enjoying my job!

Manuel Hohmuth


Brain Products Distributors

Two Job Offers at Brain Vision LLC by Dr. Florian Strelzyk, Brain Vision LLC

Brain Vision is the US distributor of Brain Products GmbH. Our

products are used by scientists in neuroscience in leading

research institutes.

We know that long lasting customer relations can only be based

on trust in our solutions, service and products.

We establish and maintain this trust by providing reliable

products, matching solutions and outstanding support and

are regularly engaging with key scientists and other companies

in valuable collaboration projects.

These are our current job offers:

Scientific Consultant / Full-time

Place of Employment: Morrisville, NC / USA (Brain Vision's Headquarters)

Job description:

Our ideal candidate will have a deep understanding of EEG research but will also enjoy working with key researchers and opinion makers in our immediate and other fields of neuroscience.

Your duties will include:

• Supporting the activities of, and being part of, the consulting and support team.

• Working together with Brain Products to provide comprehensive technical and scientific support.

• Attending conferences (e.g. SPR, HBM, CNS, HCI).

• Traveling throughout the US and Canada for workshops, installations and customer trainings (total travel time 15-20%).

• Maintaining up-to-date knowledge in EEG & ERP research.

Applicant requirements are:

• Academic degree (Masters or PhD preferred) in a relevant field of neurosciences, psychology, physics, biophysics, biomedical technology or a related field.

• Experience in complex neurophysiological analyses; ideally you are a user of our hard- and software solutions.

• High level of analytical skills including quick comprehension and satisfaction in finding solutions.

• Excellent communication skills; specifically, you should enjoy frequent interaction with our customers.

• Ability to effectively communicate complex scientific topics to varied audiences.

• Confidence in your capacity to take initiative and work independently.

• Outstanding written and spoken English; proficiency in an additional language is a plus.

Assistant Support and Sales Manager / Full-time

Place of Employment: Morrisville, NC / USA (Brain Vision's Headquarters)

Job description:

Our ideal candidate will have a deep understanding of EEG research and related scientific fields and will enjoy advising on various collaborations with scientists and industry leaders.

Your duties will include:

• Finding the best solution for our customers!

• Supporting the activities of, and being part of, the support, sales and consulting team.

• Attending conferences (e.g. SPR, HBM, CNS, HCI).

• Frequently traveling throughout the US and Canada for workshops, installations and customer trainings (total travel time 20-25%).

Applicant requirements are:

• Academic degree (Masters or PhD preferred) in a relevant field of neurosciences, psychology, physics, biophysics, biomedical technology or a related field.

• Business experience is a plus.

• Experience in neurophysiological analyses; ideally you are a user of our hard- and software solutions.

• High level of analytical skills including quick comprehension and satisfaction in finding solutions.

• Excellent communication skills; specifically, you should enjoy frequent interaction with our customers.

• Confidence in your capacity to take initiative and work independently.

• Outstanding written and spoken English.

Benefits include working in a fast-growing company within a pleasant and skillful team, a competitive salary, and full benefits (medical,

dental, vision, and vacation/holiday time).

How to apply: If you meet the skills requirements and wish to explore the opportunity of joining Brain Vision LLC, please send your application

documents (cover letter, curriculum vitae) by email to Dr. Florian Strelzyk ([email protected]).

Brain Vision is an equal opportunity/affirmative action employer. We base all hiring decisions on nondiscriminatory factors.

page 11 of 13


page 12 of 13

User Workshops

ERP Workshop for beginners and Advanced EEG & fMRI Workshop in Guangdong (China), November 27-30, 2012 by Alyssa He, Hanix Shenzhen

Advanced EEG & fMRI workshops are gaining in popularity in

China. The 8th annual China workshop was attended by over 100

Brain Products users in Guangdong last November, in an event

timed to coincide with the annual national academic psychology

conference of China. The workshop was booked out within weeks

of its announcement, with workshop participants coming from

different provinces and Hong Kong.

The workshop started with scientific talks by prominent

Chinese scientists working in various areas of neuroscience.

The psychologists Yaojia Luo and Xiaolin Zhou presented

psychological studies focused on ERPs. Dr. Ruiwang Huang,

a professor mainly working with fMRI, gave a comprehensive

theoretical talk in which many applications in cognitive

neuroscience were described.

Dr. Yong Li gave a lecture on an investigation of movement disorders performed in the MRI scanner with the Brain Products BrainAmp ExG system and the GSR MR sensor.

The workshop continued with some practical demonstrations,

including a 1-day ERP workshop for beginners, a set of

measurements with the MOVE wireless system, and a 2-day

comprehensive course on EEG/fMRI co-registration. The latter

included a practical measurement in the MRI scanner at the

Affiliated Hospital of Sun Yat-Sen University as well as hands-on

sessions in which the participants corrected and analyzed the

recorded data.

At the end of the workshop, Brain Products’ CEO Pierluigi

Castellone awarded a prize to Dr. Zhiwen Tang of the

Management School JiNan University for the best scientific

poster presented by Brain Products customers at national and

international conferences in 2012. The winning poster had the

title “ERP study about intuitive processing category decisions”

and was chosen by a committee of experts selected by Shenzhen

Hanix.

The event was organized with the help of the Guangzhou

University. We are very grateful to President Haosheng Ye

and his enthusiastic team for their support throughout the

workshop. We also appreciate the organizational assistance

from Brain Products, whose help was central to the event's success. This workshop was the latest successful

collaboration between Brain Products, Shenzhen Hanix

and Guangzhou University, and we hope there will be many

more to come.

We are now collecting posters for the 9th Advanced EEG & fMRI

workshops in 2013, and again an award will be given for the

best submission. We believe that the next Advanced EEG & fMRI

workshop will once more be a real success!

Brain Products Inside

Who is who..? - Nicola Soldati

It is a pleasure to introduce myself as a new scientific

consultant for the Sales department at Brain Products.

My name is Nicola Soldati, and my background is in

telecommunication engineering, specializing in advanced

signal processing, machine learning and pattern recognition

applied to human brain neuroimaging.

Since 2007, I have been working on the acquisition of

neuroimaging data and the development of experimental

protocols for the investigation of human brain dynamics and

I hold a PhD in Cognitive Neuroscience from the CIMeC, University

of Trento, Italy. In particular my focus has been on developing

novel algorithms and software both for single modality

real-time fMRI and for multimodal data fusion of EEG and

fMRI data. While studying these topics, I became familiar with

Brain Products solutions for the simultaneous acquisition of

EEG and fMRI data, and I made extensive use of these.

While working on my PhD, I also gained experience with neurofeedback for the non-invasive study of human brain plasticity and causality. I also became familiar with state-of-the-art analysis techniques for intelligent biomedical signal processing and with nonlinear system theory approaches in computational neuroscience. My work gave me the chance to collaborate with and visit some of the most advanced and recognized laboratories in the world, such as the Mind Research Institute at the University of New Mexico (USA), the Martinos Imaging Center at MIT, and the Laboratory for Advanced Brain Signal Processing (RIKEN BSI, Japan).

I find it very stimulating to collaborate on finding optimal solutions to research problems, and I hope that my experience and my problem-solving-oriented approach can help foster scientific research with Brain Products solutions.

Nicola Soldati


This Press Release is published by Brain Products GmbH, Zeppelinstrasse 7, 82205 Gilching, Germany.

Phone +49 (0) 8105 733 84 0, www.brainproducts.com

Notice of Rights

All rights reserved in the event of the grant of a patent, utility model or design. For information on

getting permission for reprints and excerpts, contact [email protected]. Unauthorized

reproduction of these works is illegal, and may be subject to prosecution.

Notice of Liability

The information in this press release is distributed on an "As Is" basis, without warranty. While every precaution has been taken in the preparation of this press release, neither the authors nor Brain Products GmbH shall have any liability to any person or entity with respect to any loss or damage caused or alleged to be caused, directly or indirectly, by the instructions contained herein or by the computer software and hardware products described here.

Copyright © 2013 by Brain Products GmbH

page 13 of 13

News in brief: Conferences

For more information on the conferences we are about to attend, please visit our website at www.brainproducts.com/events.php

• San Francisco (USA), Apr 13th to 16th, 2013: CNS Annual Meeting, in cooperation with Brain Vision LLC

• Istanbul (Turkey), Apr 19th to 21st, 2013: Cognitive X, in cooperation with Inter Bilgisayar Elektronik San Dıs Tic.Ltd.

• Salt Lake City (USA), Apr 20th to 26th, 2013: ISMRM 2013, in cooperation with Brain Vision LLC

• Dortmund (Germany), Apr 25th to 27th, 2013: Aging and Cognition, in cooperation with MES Forschungssysteme

• Krakow (Poland), May 9th to 11th, 2013: NEURONUS 2013, in cooperation with ELMIKO MEDICAL sp. z.o.o.

• Wuerzburg (Germany), May 30th to Jun 1st, 2013: DGPA (Psychologie & Gehirn) 2013, in cooperation with MES Forschungssysteme and EASYCAP

• Baltimore (USA), Jun 1st to 5th, 2013: SLEEP 2013, in cooperation with Brain Vision LLC

• Pacific Grove (USA), Jun 3rd to 7th, 2013: International BCI Meeting, in cooperation with Brain Vision LLC

• Seattle (USA), Jun 16th to 20th, 2013: Human Brain Mapping, in cooperation with Brain Vision LLC

• Montreal (CAN), Jun 23rd to 27th, 2013: 30th Intern. Epilepsy Congress, in cooperation with Brain Vision LLC

News in brief: Downloads, Programs and Updates

All Updates and New Modules can be downloaded on our website at www.brainproducts.com/downloads.php. If you‘d like us to keep you posted on any new Update for BrainVision Analyzer 2, please register for our Analyzer 2 Newsflash at www.brainproducts.com/a2_newsflash.php

Feb 22nd, 2013 / RDA Client for MATLAB®

A 64-bit version of the RDA Client for MATLAB® is now also available in the Recorder Downloads Section: www.brainproducts.com/downloads.php?kid=2&tab=5