PhD Defense GraphTyper: A pangenome method for identifying sequence variants at a population-scale

Helmut Neukirchen, 26. June 2019

Hannes Pétur Eggertsson successfully defended his PhD thesis in Computer Science on GraphTyper: A pangenome method for identifying sequence variants at a population-scale. I had the honor of chairing this defense in my role as vice head of faculty.

As you may notice, only men appear in these pictures. We need to improve on this! More pictures can be found on flickr.

Datasets for DBSCAN evaluation

Helmut Neukirchen, 20. June 2019

For evaluating implementations of the popular DBSCAN clustering algorithm, various publications use several datasets. Pointers to these datasets and information on parameters (e.g. normalisation, epsilon, and minpts) are collected here. You are welcome to contact me if you have further (big) datasets that are good benchmarks for DBSCAN.
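For readers new to the algorithm, here is a minimal brute-force DBSCAN sketch in Python that illustrates the two parameters recurring throughout the list below. It is my own illustration on synthetic data, not one of the evaluated implementations:

```python
# Minimal brute-force DBSCAN sketch (O(n^2); fine for a small demo,
# unlike the optimised implementations discussed below) to illustrate
# the two parameters: epsilon (neighbourhood radius) and minpts
# (minimum number of neighbours for a core point).
import numpy as np

def dbscan(points, eps, minpts):
    """Return one cluster label per point; -1 marks noise."""
    n = len(points)
    # Pairwise Euclidean distances via broadcasting.
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbours = [np.flatnonzero(dists[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbours[i]) < minpts:
            continue  # already assigned, or not a core point
        # Grow a new cluster from core point i.
        labels[i] = cluster
        queue = list(neighbours[i])
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbours[j]) >= minpts:  # j is a core point too
                    queue.extend(neighbours[j])
            # Border points are labelled but not expanded.
        cluster += 1
    return labels

# Two well-separated blobs plus one isolated noise point.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(5.0, 0.1, (50, 2)),
                  [[2.5, 2.5]]])
labels = dbscan(data, eps=0.5, minpts=10)
print(len(set(labels.tolist()) - {-1}))  # 2 clusters; the isolated point stays -1
```

The benchmark papers below vary exactly these two knobs (plus normalisation), which is why the parameter settings are recorded for each dataset.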

Sarma et al.: μDBSCAN: An Exact Scalable DBSCAN Algorithm for Big Data Exploiting Spatial Locality

TODO: check the datasets used in detail; some are the same datasets as in some of the other publications below, but: "In addition, we have also used a few other real datasets: 3D Road Network (3DSRN) [32] contains vehicular GPS data; Household Power (HHP*) and KDDBIO145K (KDDB*) datasets are borrowed from UCI Repository [33]."

Gan, Tao: DBSCAN Revisited: Mis-Claim, Un-Fixability, and Approximation

Data normalized to [0, 10^5] for every dimension.

MinPts = 100, Epsilon = 5000 and higher. (Note: this epsilon is far too high, turning almost the entire dataset into a single cluster -- the mis-claim is on their side!)
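The per-dimension normalisation mentioned above can be sketched in a few lines (pure NumPy; my own illustration, not the authors' preprocessing code):

```python
# Column-wise min-max scaling to [0, 10^5], as described by Gan and Tao.
import numpy as np

def normalise(points, upper=10**5):
    """Scale each dimension independently to the range [0, upper]."""
    mins = points.min(axis=0)
    spans = points.max(axis=0) - mins
    spans[spans == 0] = 1.0  # guard against constant dimensions
    return (points - mins) / spans * upper

raw = np.array([[1.0, 10.0], [2.0, 25.0], [3.0, 40.0]])
scaled = normalise(raw)
# Every dimension now spans exactly [0, 10^5].
```

Note that such rescaling also rescales meaningful epsilon values, which matters when comparing parameter choices across papers.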

Their preprocessed datasets

  • PAMAP2 (3,850,505 4D points),
  • Farm (3,627,086 5D points),
  • Household (2,049,280 7D points)

can be obtained from their webpage.

Mai, Assent, Jacobsen, Storgaard Dieu: Anytime parallel density-based clustering

  • Same household datasets used as by Gan, Tao.
  • Also PAMAP2 is used, but claimed to comprise 974,479 39D points, whereas Gan and Tao reduced it to 4 dimensions using PCA and claim 3,850,505 points.
  • In addition, the UCI Gas Sensor dataset by Fonollosa et al. is used: 4,208,261 16D points (DETAILS NOT PROVIDED IN PAPER).

Kriegel, Schubert, Zimek: The (black) art of runtime evaluation: Are we comparing algorithms or implementations?

  • Same PAMAP2, Farm and household datasets used as by Gan, Tao (including also smaller epsilon values as these make more sense).
  • In addition, for higher-dimensional data, the Amsterdam Library of Object Images (ALOI) dataset from Geusebroek et al. is used, namely the 110,250 HSV/HSB color histograms provided on the ELKI Multi-View Data Sets webpage, in the eight-dimension variant (two divisions per HSV color component; I assume this is the 2x2x2 dataset) with epsilon=0.01 and minPts=20.

Patwary, Satish, Sundaram, Manne, Habib, Dubey: Pardicle: parallel approximate density-based clustering

PDSDBSCAN

A subsampled version of the above Millennium Run dataset has also been used in the paper "A new scalable parallel DBSCAN algorithm using the disjoint-set data structure" by the same main author as Pardicle, which describes and evaluates PDSDBSCAN. That author also published a 50,000 10D point dataset used in that paper.

Götz, Bodenstein, Riedel: HPDBSCAN: highly parallel DBSCAN

The Bremen 3D point cloud and Twitter 2D GPS locations are available as full and subsampled (small) datasets: DOI: 10.23728/b2share.7f0c22ba9a5a44ca83cdf4fb304ce44e (Note: the original publication refers to the dataset via a handle.net handle which does not work anymore).

  • Twitter (dataset t): 16,602,137 2D points (eps=0.01, minPts=40). Note that this dataset contains some artefacts (most likely Twitter spam with bogus GPS coordinates).
  • Twitter small (dataset ts): 3,704,351 2D points (eps=0.01, minPts=40)
  • Bremen (dataset b): 81,398,810 3D points (eps=100, minPts=10000)
  • Bremen small (dataset bs): 2,543,712 2D points (eps=100, minPts=312)

Neukirchen: Elephant against Goliath: Performance of Big Data versus High-Performance Computing DBSCAN Clustering Implementations

The same Twitter small dataset as provided by Götz et al. has been used with the same parameters.

Towards Exascale Computing: European DEEP-EST research project

Helmut Neukirchen, 17. May 2019

The DEEP-EST ("Dynamical Exascale Entry Platform - Extreme Scale Technologies") project is funded as part of the European Commission's Horizon 2020 ambitious Future and Emerging Technologies (FET) programme in order to create the blueprints of the next generation ("pre-exascale") supercomputer hardware and software.
The current goal in supercomputing is to reach exascale performance: 10 to the power of 18 floating point arithmetic operations per second (FLOPS), i.e. a quintillion in American usage or a trillion in traditional European usage. These are needed to drive large-scale scientific simulations and big data analytics forward. Current supercomputers achieve 0.2 exaFLOPS (i.e. 200 petaFLOPS or 200,000 teraFLOPS); for comparison, the CPU of a very high-end personal computer can maybe compute half a teraFLOPS.
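The unit conversions can be sanity-checked in a few lines of Python; the per-machine figures are the rough 2019 estimates from the text above:

```python
# Orders-of-magnitude check of the FLOPS figures (rough 2019 estimates).
TERA = 10**12
PETA = 10**15
EXA = 10**18

exascale = 1 * EXA        # the target: 10^18 FLOPS
current = 200 * PETA      # ~0.2 exaFLOPS of today's top machines
pc_cpu = 0.5 * TERA       # rough figure for a high-end PC CPU

print(current / exascale)       # 0.2
print(int(exascale / pc_cpu))   # 2000000: ~2 million such PCs for exascale
```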

Exascale computing is some sort of "wall", i.e. it is hard to reach and in particular to go beyond anytime soon. While according to Moore's law the number of transistors in a CPU doubles every two years, the performance of a CPU no longer doubles that fast (the transistors go into more cores and more caches). Currently, the only way to boost performance is to use not generic CPUs, but specialised "accelerators", e.g. graphical processing units (GPUs), and accelerators in other parts of a supercomputer, e.g. the network fabric that interconnects the many CPU nodes, or the storage. DEEP-EST therefore proposes a Modular Supercomputing Architecture (MSA) where the supercomputer is composed of multiple modules, each specialised in a particular domain: e.g. a GPU-heavy booster for computations that scale well and suit GPUs, a "normal" CPU cluster module for applications that do not scale that well, and a data analysis module with hardware specialised for machine learning.

Talking about accelerators: one of our project partners is CERN, and the project meeting took place there. We were lucky that the Large Hadron Collider (LHC) particle accelerator is currently in its maintenance/upgrade phase, so we were able to see one of the detectors (when it is running, the collisions create lots of radiation). -- Find the human in the picture below:

LHC detector

DEEP-EST has reached the middle of its project duration, and the first module, the CPU cluster module, has been installed. Since an additional barrier in exascale computing is energy -- which also means heat created by the computers that needs to be removed -- DEEP-EST is also working on novel cooling solutions, e.g. water cooling. While typical data centres use air cooling, i.e. extra energy is needed to cool down air that is then blown into the racks, the DEEP-EST water cooling uses water at normal temperatures, piped through those components that create most of the heat. This warms up the water, and the energy contained in the warm water can even be used for something else. I.e. instead of needing extra energy for cooling, the DEEP-EST warm-water cooling even allows energy to be recovered (of course, this is energy put into the system by the electrical power that the supercomputing components consume). You can see the water pipes of the newly installed CPU cluster module in the middle rack below:

Rack with water cooling

Talking about energy efficiency: another trend is field-programmable gate arrays (FPGAs), which are more energy efficient than CPUs or GPUs. These are used as well in one of the specialised DEEP-EST modules.

The downside of using accelerators is that they need special programming. As a DEEP-EST member, the University of Iceland is developing machine learning software that exploits the DEEP-EST Modular Supercomputing Architecture (MSA) as well as possible. This includes clustering (DBSCAN) and classification via Support Vector Machines (SVMs) and Deep Learning/Deep Neural Networks.

You can follow the progress of this project on the DEEP-EST web site and Twitter channel.

Scientists for Future / Fridays for Future / Protests for more climate protection

Helmut Neukirchen, 16. March 2019

Climate change is real and will affect us all. So it is good that the Fridays for Future protests have reached Iceland. Scientists in German-speaking countries issued a statement that these concerns are justified and supported by the best available science: the current measures for climate, biodiversity, forest, marine, and soil protection are far from sufficient.

I am participating in the eSTICC (eScience Tools for Investigating Climate Change at High Northern Latitudes) NordForsk-funded research project. As part of the project, an impressive (or depressing) simulation of the Greenland ice sheet and climate change has been created (the simulations ran on a supercomputer located in Iceland). It shows the surface air temperature in the Arctic and the Greenland glacier ice thickness according to the simulations, e.g. when the Arctic sea ice will be gone in summer (something we are already getting used to) and in winter (i.e. no ice at the North Pole in winter -- imagine that):

We all should act:

1st Workshop on Evaluation and Experimental Design in Data Mining and Machine Learning (EDML 2019)

Helmut Neukirchen, 22. November 2018

My experience with evaluating implementations of machine learning algorithms is that the content of many accepted research papers cannot be reproduced, in particular because the implementations used are not open-source and the authors typically do not even answer emails requesting to use their implementations. This is one aspect of the

1st Workshop on Evaluation and Experimental Design in Data Mining and Machine Learning (EDML 2019)
Workshop at the SIAM International Conference on Data Mining (SDM19), May 2‑4, 2019

Description

A vital part of proposing new machine learning and data mining approaches is evaluating them empirically to allow an assessment of their capabilities. Numerous choices go into setting up such experiments: how to choose the data, how to preprocess them (or not), potential problems associated with the selection of datasets, what other techniques to compare to (if any), what metrics to evaluate, etc. and last but not least how to present and interpret the results. Learning how to make those choices on-the-job, often by copying the evaluation protocols used in the existing literature, can easily lead to the development of problematic habits. Numerous, albeit scattered, publications have called attention to those questions and have occasionally called into question published results, or the usability of published methods. At a time of intense discussions about a reproducibility crisis in natural, social, and life sciences, and conferences such as SIGMOD, KDD, and ECML/PKDD encouraging researchers to make their work as reproducible as possible, we therefore feel that it is important to bring researchers together, and discuss those issues on a fundamental level.

An issue directly related to the first choice mentioned above is the following: even the best-designed experiment carries only limited information if the underlying data are lacking. We therefore also want to discuss questions related to the availability of data, whether they are reliable, diverse, and whether they correspond to realistic and/or challenging problem settings.

Topics

In this workshop, we mainly solicit contributions that discuss those questions on a fundamental level, take stock of the state-of-the-art, offer theoretical arguments, or take well-argued positions, as well as actual evaluation papers that offer new insights, e.g. question published results, or shine the spotlight on the characteristics of existing benchmark data sets.
As such, topics include, but are not limited to

  • Benchmark datasets for data mining tasks: are they diverse/realistic/challenging?
  • Impact of data quality (redundancy, errors, noise, bias, imbalance, ...) on qualitative evaluation
  • Propagation/amplification of data quality issues on the data mining results (also interplay between data and algorithms)
  • Evaluation of unsupervised data mining (dilemma between novelty and validity)
  • Evaluation measures
  • (Automatic) data quality evaluation tools: What are the aspects one should check before starting to apply algorithms to given data?
  • Issues around runtime evaluation (algorithm vs. implementation, dependency on hardware, algorithm parameters, dataset characteristics)
  • Design guidelines for crowd-sourced evaluations

The workshop will feature a mix of invited speakers, a number of accepted presentations with ample time for questions (since those contributions will be less technical and more philosophical in nature), and a panel discussion on the current state, the areas that most urgently need improvement, and recommendations on how to achieve those improvements. An important objective of this workshop is a document synthesizing these discussions that we intend to publish at a prominent venue.

Submission

Papers should be submitted as PDF, using the SIAM conference proceedings style, available at https://www.siam.org/Portals/0/Publications/Proceedings/soda2e_061418.zip?ver=2018-06-15-102100-887. Submissions should be limited to nine pages and submitted via Easychair at https://easychair.org/conferences/?conf=edml19.

Important dates

Submission deadline: February 15, 2019
Notification: March 15, 2019
SDM pre-registration deadline: April 2, 2019
Camera ready: April 15, 2019
Conference dates: May 2-4, 2019

Further info

Web page

11th Nordic Workshop on Multi-Core Computing (MCC2018)

Helmut Neukirchen, 19. September 2018

The objective of MCC is to bring together Nordic researchers and practitioners from academia and industry to present and discuss recent work in the area of multi-core computing. This year's edition is hosted by the Chalmers University of Technology (Gothenburg, Sweden).

The scope of the workshop is both hardware and software aspects of multi-core computing, including design and development as well as practical usage of systems. The topics of interest include, but are not limited to, the following:

Architecture of multi-core processors, GPUs, accelerators, heterogeneous systems, memory systems, interconnects and on-chip networks
Parallel programming models, languages, environments
Parallel algorithms and applications
Compiler optimizations and techniques for multi-core systems
Hardware/software design trade-offs in multi-core systems
Operating system, middleware, and run-time system support for multi-core systems
Correctness and performance analysis of parallel hardware and software
Tools and methods for development and evaluation of multi-core systems

There are two types of papers eligible for submission. The first type is original research work and the second type is work already published in 2017 or later. Participants submitting original work are asked to send an electronic version of the paper that does not exceed four pages using the ACM proceedings format, http://www.acm.org/publications/proceedings-template, to https://easychair.org/conferences/?conf=mcc2018. The same URL is to be used should you want to present an already published paper as described above. In that case, you need to clearly specify that the paper is already published and where the paper has been published.

No proceedings will be distributed. Contributions will not disqualify subsequent publications in conferences or journals. (This is a real "work"shop to facilitate discussion.)

Call for Papers (CfP).

The conference web page is https://sites.google.com/site/mccworkshop2018.

Full Paper Submission: October 8th, 2018
Author Notification: November 2nd, 2018
Registration Deadline: November 22nd, 2018
MCC Workshop: November 29th - 30th, 2018

The workshop will be held at Chalmers University of Technology, Gothenburg, Sweden.

New head and deputy head of Faculty of Industrial Engineering, Mechanical Engineering and Computer Science

Helmut Neukirchen, 2. July 2018

Starting from 1. July 2018, our faculty has a new head and deputy head (for two years):

Head: Rúnar Unnþórsson

Deputy head: Helmut Neukirchen

Feel free to contact us in case of any problems that fall into our area of responsibility.

Many thanks to the old heads, Kristján Jónasson and Halldór Pálsson for the great job they did!

Steinn Guðmundsson is still in charge of the study programmes Computer Science, Software Engineering, and Computational Engineering.

Switching to Microsoft cloud servers now putting all Icelandic state institutions at privacy risk

Helmut Neukirchen, 5. June 2018

The Icelandic government finalised a contract with Microsoft which covers using Microsoft Office365 cloud services (including email services) in all state institutions (Icelandic announcement).

At least the introduction of Office365 at the Icelandic state level led to some media coverage at Kvennablaðið, including Twitter -- well, in fact, there was already Twitter coverage in the past by the Nordic e-Infrastructure Collaboration (NeIC), but they were forced to remove that tweet (honi soit qui mal y pense).

In fact, the same concerns as for the Office365 introduction at the University of Iceland apply -- but this time at an even bigger scale: while other national governments try their best to keep their IT systems inaccessible to anyone else, the Icelandic government seems not to care at all. Many other institutions handle sensitive data; e.g. when the Icelandic Directorate of Health outsourced patient data to an Icelandic IT provider, privacy concerns were raised. With Microsoft Office365, data will be moved abroad, making it subject to wiretapping by foreign secret services or direct access via the US CLOUD Act and the European Commission's matching counterpart, E-Evidence.

2nd Nordic High Performance Computing & Applications Workshop, University of Iceland, Reykjavík, 13-15 June 2018

Helmut Neukirchen, 1. June 2018

Thanks to financial support from the Nordic e-Infrastructure Collaboration (NeIC) pooling competencies initiative, I am again able to organise together with my colleagues Morris Riedel (Jülich Supercomputing centre) and Matthias Book (University of Iceland) an HPC training workshop:

The University of Iceland is offering a free cross-national training workshop on high-performance computing (HPC) and applications at the University of Iceland in Reykjavík, Iceland, 13-15 June 2018 (noon-to-noon).

This training workshop is intended for novices (such as MSc or new PhD students) as well as for more advanced HPC users from Iceland and abroad. This time, there is some focus on data.

More information and registration on https://cs.hi.is/HPC/hpcworkshop2018.html

Note that there is another course on research software development (which is not specific to HPC), namely the CodeRefinery workshop in Reykjavik 21-23 August 2018. While it is also funded by NeIC (or part of NeIC in fact), the topics and trainers are different.

A general overview on the HPC activities of the University of Iceland's computer science department can be found here: https://cs.hi.is/HPC/html

New head of University IT department switches from open-source server to Microsoft cloud servers which is a privacy threat

Helmut Neukirchen, 1. March 2018

Updates:
While I mentioned in my original post the US Supreme Court case to decide whether Microsoft has to hand over data from European Office365 servers to the US, the Supreme Court decided not to investigate this, because the new CLOUD Act is now in force, which allows this anyway (and without any court or investigating judge involved, as would be the case for a search warrant). While you may think that this violates the new European privacy regulation, the European Commission is in fact working on a matching counterpart: E-Evidence. Time will tell whether European courts consider this legal or not. But until then, it is obvious that using cloud services means that your data is not safe. (It is anyway not safe, as I explained below: we can rely on the fact that any network traffic, including our emails, leaving Iceland, in particular when going through the UK, will be wiretapped by foreign services.)

I was asked to remove some easy-to-google links to newspaper articles concerning the new head of the University IT department, as they may violate the University's Code of Ethics ("Staff and students of the University show each other respect in behaviour, speech and in writing.").

The university administration tries to wipe away privacy concerns by referring to standards such as the information security standard ISO/IEC 27001 or European law. But it is naive to rely on non-European companies implementing European law:

"At the Office 365 launch, Microsoft U.K.'s managing director Gordon
Frazer gave the first admission that cloud data, regardless of where it
is in the world, is not protected against the Patriot Act.

The question put forward:
Can Microsoft guarantee that EU-stored data, held in EU based
datacenters, will not leave the European Economic Area under any
circumstances — even under a request by the Patriot Act?

Frazer explained that, as Microsoft is a U.S.-headquartered company, it
has to comply with local laws (the United States, as well as any other
location where one of its subsidiary companies is based).

He said: "Microsoft cannot provide those guarantees. Neither can any
other company."
"
Source: http://www.zdnet.com/article/microsoft-admits-patriot-act-can-access-eu-based-cloud-data/

For exactly that reason, many European universities and research centres forbid the use of external and foreign cloud services for critical information (see, e.g., page 5 of the University of Dublin Cloud Computing Policy and Guidelines, or the fact that German universities introduced their own private cloud, Sciebo, because external clouds are forbidden -- instead of outsourcing services (and competence) and depending on external providers as HÍ does, the German universities "insource", i.e. set up their own cloud (and gain cloud competencies)).

I think the above-mentioned request to remove critical content from my web page shows that the state of academic freedom is not the best at the University of Iceland.

My protest has in the meantime received international coverage (the tweet linked here has magically disappeared): the Nordic e-Infrastructure Collaboration (NeIC) used its Twitter account to report that I resigned on 2.3.2018 from representing the University of Iceland in the Nordic e-Science community. (How can I represent my University if the administration has a completely different view on IT? I sent the new head of the University IT department an email on 1.3.2018, but he ignored it; other university employees confirmed that he did not answer emails from academic staff. As he did not reply, I decided as a last resort to resign, hoping that he would reply to that email -- but he did not reply to that one either. Later emails sent to him via the University discussion list were not answered either.) NeIC is associated with NordForsk, and while on 2.3.2018 a NordForsk representative had a lot of understanding for my resignation, NordForsk forced NeIC on 5.3.2018 to remove the tweet. Honi soit qui mal y pense... Instead, a new tweet was posted that only refers to a workshop that I organised together with my colleagues in 2017.

By the way, the job advertisement for the position of new head of the University IT department mentions explicitly that the job requires experience in introducing changes. So this may explain why changes are being pushed through in a completely undemocratic way.

Due to protests, the introduction has been postponed by one week. But this does not address any of our general concerns. Instead of solving the problems, they are just postponed by one week. The administration still makes very clear that the changes will be implemented just one week later, without any democratic discussion.

In addition to a discussion on the university mailing list, I got many supportive personal emails, and people visited me in my office to express that they agree with me. For example, the following was pointed out in addition:

In fact, the Icelandic government, administration, and parliament have suggested an open-source software policy -- why does the University of Iceland not follow it? The following references are in Icelandic only:

Here is my original article:

Our University's IT department got a new head. As he does not answer emails from ordinary staff, such as professors like me, I decided to go public:

One of his first decisions was to switch off the old, open-source email system based on the IMAP/SMTP standards (Cyrus and Sendmail) operated at our computing centre RHÍ. Instead, we are forced to use Microsoft Office365. It seems that he wants to make an impression as the new IT head, but this is in fact a bad start:

We employees in Tæknigarður got an email that the University of Iceland will stop using its email system from today at 11:00, and we thus have to be in our offices today between 10:00 and 11:00 to give IT personnel access to our computers so that they can set up a new email program that uses a new external email provider.

This change is implemented first in Tæknigarður, but soon the University email accounts of employees in other buildings will be affected, until all email accounts (including those of all our students) are no longer provided by the University.

I am very concerned about this massive change that the administration is introducing without any discussion. The head of the University IT department does not answer my emails asking for further details, but here is what I am aware of:

1.)
We shall not use our usual email programs anymore; the University wants us to use Microsoft Outlook as the only software for email and thus refers to http://rhi.hi.is/office365/ and writes
"
- Windows notendur geta fylgt þessum leiðbeiningum til að setja upp póstinn í outlook http://rhi.hi.is/node/1184
- Mac notendur geta fylgt þessum leiðbeiningum til að setja upp póstinn í outlook http://rhi.hi.is/node/1196
- Linux býður ekki upp á outlook en hægt er að nota vefviðmótið á outlook.hi.is
"

(short English translation: Windows users shall install Outlook, Mac users shall install Outlook, for Linux Outlook is not available, but a Web interface can be used.)

To me, this is not acceptable: I want to continue using the email client that I am used to (which is not Microsoft Outlook, but Thunderbird on Linux). In addition, I cannot be in my office today between 10:00 and 11:00 (I have a meeting) to allow my email setup to be changed, and some of the colleagues in Tæknigarður are abroad, e.g. on sabbatical.

2.)
The new external email provider will be Microsoft, i.e. all email that we receive is no longer sent to our University in Iceland, but to Microsoft servers abroad, where it is stored; when we want to read our email, we therefore have to retrieve it from the Microsoft servers abroad.

As a computer scientist, I consider this a severe security problem: with the old system, where our email servers were located at our computing centre RHÍ, an email that I sent to another HÍ colleague was just sent from my office to the RHÍ building, stored there on the RHÍ email server, and that colleague retrieved it from there, i.e. the email did not leave the University and our RHnet network (Rannsókna og háskólanet Íslands).

Now, an email that I send to a colleague next door is sent to a Microsoft server abroad, stored there all the time, and when my colleague wants to read that email, she or he has to retrieve it from the Microsoft server abroad, where from now on all our email is stored.

If the Microsoft servers are located in the USA, our emails will be read by the National Security Agency (NSA) and its XKeyscore system, as revealed by Edward Snowden. In fact, as soon as our email leaves Iceland, it may be subject to XKeyscore according to this map.

So when you send a Donald Trump joke to a colleague, the NSA can read it, and it has been documented by The New York Times that two European travellers reported being denied entry to the U.S. after having made jokes about the U.S. on Twitter.

Even if the Microsoft servers to which we will have to send, and from which we will have to retrieve, our email were not located in the U.S., but elsewhere in Europe: Iceland has two submarine cables to the European mainland, and our email might go through FARICE, which lands in the UK. Snowden said "They are worse than the U.S.": the Tempora system of the British Government Communications Headquarters (GCHQ) extracts "most" internet traffic (incl. emails) going through the UK and preserves the data for three days to have enough time to search it.

Via the UKUSA Agreement, the UK and USA exchange data, and from our Scandinavian partners it is known that at least Sweden and Norway are also involved.

And even if you hope that your emails will not be sent via the FARICE cable, but via the DANICE cable directly to Denmark (and for Denmark, no UKUSA Agreement is known): our emails will be stored at Microsoft, and while Microsoft Europe claims that the European General Data Protection Regulation applies to their data centres in Europe, the US administration argues that U.S. law applies to all Microsoft datacentres all over the world (because Microsoft is a U.S. company): "The administration has the support of 35 states led by Vermont who say they routinely seek access to data stored overseas".

So in future, think twice about what you send in emails that you thought were HÍ-internal: your next visit to a scientific conference in the U.S. is in danger.

Already the fact that, starting from now, people may think twice about what they write is in my opinion a very bad thing and does not fit at all with the concept of academic freedom that should exist at a university.

3.)
Why is this change pushed through with such a short notice?
The announcement was sent to us in Tæknigarður just last Thursday after office hours, namely at 17:27, i.e. essentially one working day between the notice and the change this Monday morning. Why is this change not done outside of teaching, during summer?

Also, the style of ordering all employees to their offices today between 10:00 and 11:00 shows that the administration is getting out of hand and ignores the fact that we have academic duties due to which we may not be in our office at that time.

Administration is there to support university teachers, researchers and students in doing what is the purpose of a university: higher education and research. -- The purpose of a university is not administration, and university teachers, researchers and students are not members of the university in order to support the administration.

The faculties even pay for the IT services that we use internally. So this sounds like a free-market system, but a free market works only with competition and consumers having a choice. We have no choice, and our IT administration does not listen to us.

Why is democratic participation of the affected staff and students completely lacking? Should future directions of the University not be discussed in Háskólaráð and other committees? The University of Iceland should be a place of open discussion. Instead, the administration tries to push through this significant change.

4.)
We have a perfectly running email server at RHÍ. This software is open source and based on standards that allow us to use any email client we like for writing and accessing our emails. While all trends in academia are towards openness (the University just spent significant effort introducing open access for publications), the University of Iceland's administration is now replacing that open-source email server software with proprietary software from Microsoft. Microsoft is already dominating the market in Iceland anyway. The University of Iceland has a social responsibility to promote diversity in all fields. Instead, students will now see, when they log in to the University's email system, a Microsoft logo giving the impression that there is no alternative to Microsoft.

5.)
A justification for this significant change is missing!

Is it the price? The old email system is open-source software, i.e. it is available for free. Setting it up may take the system administrators at RHÍ a couple of days, but this has already happened and our email services are running without any flaws. Of course, from time to time the system administrators have to spend some time on it, e.g. to install security updates, but this is not an eight-hours-a-day, five-days-a-week activity.

For the costs of the Microsoft service, I have to rely on what Microsoft advertises to businesses:
I assume the "Office 365 Business Premium" plan (RHÍ mentions Skype for Business, which is only available in that plan), which is $12.50 per user and month before VAT:

Let's assume HÍ got a special price of $10 per user and month, and let's assume that while the change will affect all 1600 staff and more than 10 000 students, a special offer requires paying for only 10 000 users and, instead of 12 months per year, for only 10 months per year. This would yield

10 000 users * 10 months * $10 = $1 million per year
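This back-of-envelope estimate can be written out in a few lines of Python. Note that all figures here (user count, billable months, discounted price) are the assumptions stated above, not actual contract terms:

```python
# Back-of-envelope estimate of the assumed Office 365 subscription cost.
# All figures are assumptions, not actual HÍ contract terms.
users = 10_000            # assumed billable users (discounted from staff + students)
months = 10               # assumed billable months per year (discounted from 12)
usd_per_user_month = 10   # assumed discounted price (list price: $12.50 before VAT)

annual_cost_usd = users * months * usd_per_user_month
print(f"Estimated annual cost: ${annual_cost_usd:,}")  # Estimated annual cost: $1,000,000
```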

While I hope that HÍ will not pay one million dollars per year for this service to Microsoft, the costs will not be insignificant.

Now compare these costs to the open-source solution, which costs just a part-time system administrator (plus some server hardware).
In any case, these costs will be significantly lower than what HÍ pays to Microsoft.

6.)
As I am doing research in the field of eScience, I am in addition extremely worried about abandoning open standards and using proprietary products instead. Also from a security point of view, I am convinced that proprietary closed-source products (such as Microsoft's) are less secure due to the lack of source-code reviews by independent security experts.

I was an Icelandic delegate to a working group of the Nordic e-Infrastructure Collaboration (NeIC). Given the new head's IT policies, I am very worried about the future of eScience at the University of Iceland as provided by our computing centre RHÍ as e-infrastructure provider. I am convinced that eScience must be based on open standards and open-source software. But how can this be the case with an e-infrastructure provider whose head is not convinced of this? Notably, one of the two HPC system administrators at our computing centre RHÍ quit his job at RHÍ (and started at a different Icelandic organisation doing HPC) after the new head took over. So it seems I am not the only one who sees no eScience future at the University of Iceland. I therefore resigned as delegate on behalf of RHÍ/the University of Iceland to the Nordic e-Infrastructure Collaboration (NeIC), both because it makes no sense to serve as a delegate backed by an e-infrastructure provider with a completely different opinion, and to protest against the decisions of the new head of the University's IT department, who has experience neither with an academic environment nor with eScience and is thus likely to ruin the existing e-infrastructure.

Most academic staff are pretty upset. It seems that the new head is used to pushing through decisions against the will of the end users. He has a background in IT for banks and public administration, but he does not seem to understand that a university is completely different: a place of academic freedom and diversity. For example, the rectors of the University of Iceland have the ambitious goal of becoming one of the top 100 universities in the world (according to the Times Higher Education ranking), which means the University of Iceland needs to employ top researchers. Given that the Icelandic-speaking population is less than 400 000 people, it is obvious that these top researchers cannot all be Icelanders, but need to include international researchers who do not speak Icelandic. Still, the email concerning the change of the email infrastructure was sent to the affected employees in Icelandic only. This alone shows that the new head of the University's IT has no idea of an academic work environment, which is by definition international.

This shows what goes wrong when a non-academic person is hired into an academic workplace and is not willing to listen to his customers, the professors and students.