61 results for "Data transfers"
Search Results
2. Efficient Scheduling of Data Transfers in Multi-tiered Storage.
- Author
-
Nan Noon Noon, Getta, Janusz, and Tianbing Xia
- Subjects
DATA distribution ,PARALLEL processing ,PRODUCTION scheduling ,ELECTRONIC data processing ,DATA warehousing ,OPTICAL disks ,PARALLEL algorithms
- Abstract
Multi-tiered persistent storage systems integrate many types of persistent storage devices, such as different types of NVMes, SSDs, and HDDs. This integration provides a multi-level view of persistent storage, where each tier has a different data transmission speed and capacity. Data transfer processes operating on multi-tiered persistent storage allow for the parallelisation of data transfers among the partitions of data at the same or different tiers. This work considers the problem of efficient scheduling of parallel data transfers between the tiers of persistent storage. We consider a data processing model where several data transfer processes move or copy data from one tier to another through the buffers in transient memory. We propose a new model for data processing over multi-tiered persistent storage and new algorithms to minimise both the overall time spent on parallel data transfers and the idle time of data transfer processes. We also describe how the scheduling algorithms dynamically apply different procedures to assign data transfers to the processes. Finally, we present the outcomes from the experiments that confirm the correctness and efficiency of the scheduling algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
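The scheduling problem in the abstract above can be made concrete with a minimal sketch. This is not the authors' algorithm: it uses the classical longest-processing-time greedy heuristic, with invented transfer sizes and an assumed per-tier bandwidth, just to illustrate what minimising both overall transfer time and process idle time means.

```python
import heapq

def lpt_schedule(transfer_sizes, bandwidth, n_processes):
    """Greedy longest-processing-time assignment of data transfers to
    parallel transfer processes. Returns (makespan, total idle time)."""
    # Convert each transfer size to a duration on this tier's link.
    durations = sorted((s / bandwidth for s in transfer_sizes), reverse=True)
    # Min-heap of (current load, process id): always give the next
    # longest transfer to the least-loaded process.
    loads = [(0.0, p) for p in range(n_processes)]
    heapq.heapify(loads)
    for d in durations:
        load, p = heapq.heappop(loads)
        heapq.heappush(loads, (load + d, p))
    finish = sorted(load for load, _ in loads)
    makespan = finish[-1]
    # Idle time: how long each process waits after finishing its share.
    idle = sum(makespan - f for f in finish)
    return makespan, idle

# Hypothetical example: six transfers (GB) over a 2 GB/s tier, 3 processes.
makespan, idle = lpt_schedule([8, 6, 5, 4, 3, 2], 2.0, 3)
```

The paper's algorithms additionally handle buffers in transient memory and dynamic reassignment, which this static sketch omits.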
3. SHROUDED IN SECRECY - DOES THE COMITOLOGY PROCEDURE FOR GDPR ADEQUACY DECISIONS FIT ITS PURPOSE?
- Author
-
CZERNIAWSKI, MICHAL
- Subjects
DATA protection ,DATA protection laws ,GENERAL Data Protection Regulation, 2016 ,COMMERCIAL treaties ,TRANSBORDER data flow ,TECHNOLOGICAL progress
- Published
- 2024
- Full Text
- View/download PDF
4. Reinforcing the Accountability of Global Market Organizations: Data Transfers and the Market Power of "Big Techs".
- Author
-
Yauri-Miranda, Jaseff Raziel, Alcañiz González, Leire, and Aguado Muñoz, Ricardo
- Subjects
MARKET power ,POLITICAL succession ,EXPORT marketing ,HIGH technology industries ,INDUSTRIAL management
- Abstract
Traditionally, accountability was primarily designed to hold public bodies accountable to citizens. However, as global corporations continue to shape policy and society, these organizations must be accountable to citizens in a proactive manner as well. Therefore, this article proposes an expansion of accountability to include both public and private organizations. Drawing from political and economic studies, we propose holistic principles to increase social legitimacy in both procedural (regular) and substantive (deep) forms: responsibility, transparency, answerability, and participation. To illustrate those principles, we examine two case studies, Google and Meta, and two crucial fronts, data transfers and data market power. Through qualitative and empirical analyses, we evaluate how big tech companies are being held accountable by public authorities and society, highlighting both current developments and key limitations. The article paves the way for accountability studies and new economic theories that focus on deepening social orientation and increasing citizens' participation in business management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Automatic mapping of sequential programs to parallel computers with distributed memory.
- Author
-
Bagliy, A.P., Krivosheev, N.M., and Steinberg, B.Ya.
- Subjects
PARALLEL programs (Computer programs) ,COMPUTER storage devices ,CARTOGRAPHY software ,DISTRIBUTED computing ,COMPILERS (Computer programs) ,COMPUTER systems
- Abstract
This work is aimed at creating an automatically parallelizing compiler for distributed-memory computing systems. The conditions for the correctness of parallelization of the considered sequential programs on a computing system with distributed memory are described. The article considers the problem of placing data in distributed memory so as to minimize interprocessor transfers for a program represented by a sequence of parallelizable loops. To solve this problem, an auxiliary bipartite "statements-variables" graph (SVG) is constructed from the text of the program and analyzed. The problem of grouping many transfers of small data volumes into fewer transfers of large data volumes is also considered; solving it further reduces the total data transfer time. New chips with thousands of cores and distributed local memory offer higher performance than their predecessors, so the development of compilers targeting such chips is necessary and relevant. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
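The bipartite "statements-variables" graph idea from the abstract above can be sketched as follows. The tiny graph, the placements, and the transfer-counting rule are simplified assumptions for illustration, not the paper's actual construction: a statement accessing an array placed on a different node is counted as one inter-processor transfer.

```python
# Bipartite "statements-variables" graph: each statement (a parallelizable
# loop) is connected to the arrays it reads or writes.
svg = {
    "loop1": ["A", "B"],
    "loop2": ["B", "C"],
    "loop3": ["C", "A"],
}

def count_transfers(stmt_node, array_node):
    """Count array accesses that cross node boundaries, i.e. would
    require an inter-processor data transfer."""
    return sum(
        1
        for stmt, arrays in svg.items()
        for a in arrays
        if stmt_node[stmt] != array_node[a]
    )

# Placing everything on node 0 needs no transfers...
stmt_node = {"loop1": 0, "loop2": 0, "loop3": 0}
assert count_transfers(stmt_node, {"A": 0, "B": 0, "C": 0}) == 0
# ...while moving array C to node 1 costs two transfers (loop2 and loop3).
assert count_transfers(stmt_node, {"A": 0, "B": 0, "C": 1}) == 2
```

Minimizing this count over all placements is the optimization problem the SVG analysis supports.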
6. Algorithm for Searching Minimum Inter-node Data Transfers.
- Author
-
Krivosheev, Nikita M. and Steinberg, Boris Ya.
- Subjects
SEARCH algorithms ,COMPUTER systems ,ALGORITHMS ,SYSTEMS on a chip ,PARALLEL programming ,DISTRIBUTED computing ,MULTICORE processors ,COMPILERS (Computer programs)
- Abstract
An algorithm is proposed that finds the optimal array placement in the distributed memory of a parallel computing system. This work is a step towards the development of a new generation of parallelizing compilers for computing systems with distributed memory. Such compilers may be useful for manycore systems on chip with addressable local memory (not cache) for each core. This is especially important for systems on chip with many processor cores, where data exchange with RAM is a performance bottleneck. A special auxiliary program graph is constructed to estimate the minimum number of data transfers, and an array placement with the optimal number of data transfers is derived. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
7. The right to effective remedy in international data transfers of electronic evidence
- Subjects
data protection ,effective remedies ,data transfers ,electronic evidence
- Published
- 2023
8. The right to effective remedy in international data transfers of electronic evidence: Past lessons and future outlook
- Author
-
Kosta, Eleni, Kamara, Irene, and TILT
- Subjects
data protection ,effective remedies ,data transfers ,electronic evidence
- Published
- 2023
9. APPLICATION OF NEW TECHNOLOGIES AND THE POSSIBILITY OF ONLINE DATA TRANSFER IN GEODESY AND CADASTRE.
- Author
-
Zaoralová, Jana and Kocáb, Milan
- Subjects
COMPUTER networks ,DATA transmission systems ,SMALL business ,GEODESY ,SURVEYING (Engineering) equipment
- Abstract
The article describes the results of the research project "Development of new technologies for surveying and cadastre", carried out under the applied research and experimental development programme "ALFA" with the support of the Technology Agency of the Czech Republic. The aim of the project was to develop, in the years 2012-2015, new technologies and software support for traditional surveying work such as documentation of detailed surveys of changes, survey sketches, laying out of buildings, creation of technical maps, and laser scanning. The new technologies and software incorporate the current capabilities of new devices, computer networks, fast data transfers, and online connection with geodetic offices. In surveying practice, the new procedures provided greater security, immediate access to geodetic measurements in the field, information about the movement of survey crews in the terrain, and possible solutions to problematic situations arising during field measurements. The applied research and development demonstrated considerable rationalisation, in particular a shortening of the time needed to carry out measurements, and showed the suitability of the new procedures for small and medium enterprises in the field of surveying and land management. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
10. Schrems v Data Protection Commissioner (Case C-362/14): Empowering National Data Protection Authorities
- Author
-
Marina Škrinjar Vidović
- Subjects
case c-362/14 ,schrems v data protection commissioner ,eu-us safe harbour arrangement ,data transfers ,data protection ,national data protection authorities ,directive 95/46/ec ,state surveillance ,Law ,Law of Europe ,KJ-KKZ
- Abstract
On 6 October 2015, the Court of Justice of the European Union (CJEU) issued the final ruling in Schrems v Data Protection Commissioner (Case C-362/14). In its ruling the Court invalidated the Safe Harbour arrangement, which governs data transfers between the EU and the US. While the decision does not automatically put an end to data transfers from Europe to the United States, it allows each country's national regulators to suspend transfers if the company in the United States does not adequately protect user data. The paper analyses the most important aspects of the judgment: the Court’s definition of the competences of national data protection authorities, the Court’s interpretation of the criteria for ‘adequacy’ under Article 25(6) of Directive 95/46/EC and the reasoning of the Court for the invalidation of the Safe Harbour Agreement. Further, and in line with the findings of the Court, the paper analyses the relationship between state surveillance and data protection and examines the consequences of the Court’s ruling.
- Published
- 2015
- Full Text
- View/download PDF
11. Contribution to the public consultation on the Guidelines 05/2021 on the Interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR
- Author
-
Laura Drechsler, Svetlana Yakovleva, Faculty of Law and Criminology, Metajuridica, and Law Science Technology and Society
- Subjects
data transfers ,individual rights ,Definition ,GDPR
- Abstract
The Guidelines 05/2021 on the interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR proposed for public consultation (‘the Guidance’) deal with a topic that has long puzzled the data protection community – the identification of international personal data transfers. The General Data Protection Regulation (GDPR) does not define international personal data transfers, though it requires their identification for the application of its transfer rules in Chapter V. It is therefore principally to be welcomed that the European Data Protection Board (EDPB) has tried to address the legal uncertainty stemming from the absence of a clear concept of international personal data transfer, especially in relation to the territorial scope of the GDPR regulated in Article 3. However, as legal researchers active in this area,1 we would nevertheless like to bring three areas to the attention of the EDPB that in our opinion are not sufficiently considered in the guidelines and should be addressed in the revised version after the public consultation. These are: i) the level of detail of the definition of ‘data transfer’ provided, ii) the potential inconsistency for the application of the GDPR by excluding from the definition of a transfer direct transmissions of personal data from an individual residing in the European Economic Area (EEA) to entities outside of it (‘direct transmission from the data subjects’), and iii) measures to ensure the protection of individuals is not undermined when Chapter V is not applicable in the latter situation.
- Published
- 2022
12. Enabling Pipeline Parallelism in Heterogeneous Managed Runtime Environments via Batch Processing
- Author
-
Blanaru, Florin-Gabriel, Stratikopoulos, Athanasios, Fumero Alfonso, Juan, and Kotselidis, Christos-Efthymios
- Subjects
Memory Management ,Virtual Machines ,GPUs ,Heterogeneous Architectures ,Optimizations ,Data Transfers
- Abstract
During the last decade, managed runtime systems have been constantly evolving to become capable of exploiting underlying hardware accelerators, such as GPUs and FPGAs. Regardless of the programming language and its corresponding runtime system, the majority of the work has focused on the compiler front, trying to tackle the challenging task of enabling just-in-time compilation and execution of arbitrary code segments on various accelerators. Besides this challenging task, another important aspect that defines both the functional correctness and the performance of managed runtime systems is automatic memory management. Although automatic memory management improves productivity by abstracting away memory allocation and maintenance, it hinders the use of specific memory regions, such as pinned memory, that reduce data transfer times between the CPU and hardware accelerators. In this paper, we introduce and evaluate a series of memory optimizations specifically tailored to heterogeneous managed runtime systems. In particular, we propose: (i) transparent and automatic ''parallel batch processing'' for overlapping data transfers and computation between the host and hardware accelerators in order to enable pipeline parallelism, and (ii) ''off-heap pinned memory'' in combination with parallel batch processing in order to increase the performance of data transfers without imposing any on-heap overheads. These two techniques have been implemented in the context of the state-of-the-art open-source TornadoVM, and their combination can lead to an end-to-end performance speedup of up to 2.5x over sequential batch processing.
- Published
- 2022
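The batch-pipelining idea in the abstract above (overlapping the transfer of batch i+1 with the computation of batch i) can be sketched in plain Python. This thread-based double-buffering toy is an illustrative analogue, not TornadoVM's implementation; the `transfer` and `compute` stage functions are stand-ins for host-to-device copies and kernel execution.

```python
from concurrent.futures import ThreadPoolExecutor

def pipeline(batches, transfer, compute):
    """Process batches so that the transfer of batch i+1 overlaps
    with the computation of batch i (double buffering)."""
    results = []
    with ThreadPoolExecutor(max_workers=1) as io:
        pending = io.submit(transfer, batches[0])   # stage in the first batch
        for nxt in batches[1:]:
            ready = pending.result()                # wait for the staged batch
            pending = io.submit(transfer, nxt)      # start the next transfer early
            results.append(compute(ready))          # compute overlaps the transfer
        results.append(compute(pending.result()))   # drain the last batch
    return results

# Toy stages: "transfer" copies the batch, "compute" sums it.
out = pipeline([[1, 2], [3, 4], [5]], transfer=list, compute=sum)
```

With real accelerators, the same structure hides transfer latency behind kernel execution, which is where the paper's reported speedup comes from.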
13. Article 38: Derogations for specific situations
- Author
-
Kamara, Irene, Kosta, Eleni, Boehm, Franziska, and TILT
- Subjects
law enforcement directive ,derogations ,data transfers ,law enforcement
- Published
- 2022
14. Article 38
- Subjects
law enforcement directive ,derogations ,data transfers ,law enforcement
- Published
- 2022
15. Airline Commercial Use of EU Personal Data in the Context of the GDPR, British Airways and Schrems II
- Author
-
Voss, W. Gregory and Toulouse Business School (TBS)
- Subjects
data flows ,GDPR sanctions ,[SHS.DROIT]Humanities and Social Sciences/Law ,airlines ,Privacy Shield ,personal data ,data transfers ,British Airways ,cross-border data flows ,Schrems II ,GDPR ,Schrems ,general data protection regulation
- Abstract
International audience; This study, which focuses on the commercial use of personal data by U.S. airlines, uses actual cases to help analyze the application of the EU General Data Protection Regulation (GDPR) to the airline industry. It is one of the first studies to do so, and as such contributes to the literature. It begins by highlighting the British Airways GDPR penalty case, in which the UK regulator publicized its notice of intention to issue the highest administrative fine to date under the GDPR. When the GDPR applies to them, airlines should become fully aware of its key provisions, starting with those related to its scope and its underlying data protection principles, discussed in this study. In addition, airlines must have a legal basis to process personal data under the GDPR and, as this study shows, must have adequately prepared for data subject requests to exercise rights and for potential data breaches. Several examples of the first GDPR sanctions in the airline industry are detailed, and lessons drawn. In this context, security of data is a key element. Finally, the recent Schrems II decision invalidating the EU-U.S. Privacy Shield Decision is examined, and its potential impact on the transfer of personal data from the European Union to the United States by airlines is studied, following an analysis of their privacy policies available on the Internet in the European Union.
- Published
- 2021
16. Modeling Resolution of Resources Contention in Synchronous Data Flow Graphs.
- Author
-
Lattuada, Marco and Ferrandi, Fabrizio
- Abstract
Synchronous Data Flow graphs are widely adopted in the design of streaming applications, but were originally formulated to describe only how an application is partitioned and which data are exchanged among different tasks. Since Synchronous Data Flow graphs are often used to describe and evaluate complete design solutions, missing information (e.g., mapping, scheduling, etc.) has to be included in them by means of further actors and channels to obtain accurate evaluations. To address this issue while preserving the simplicity of the representation, techniques that model data transfer delays by means of ad-hoc actors have been proposed, but they model each communication independently, ignoring contention. Moreover, they usually do not consider delays due to buffer contention at all, potentially overestimating the throughput of a design solution. In this paper, a technique is proposed that extends Synchronous Data Flow graphs by adding ad-hoc actors and channels to model the resolution of resource contention. The results show that the number of added actors and channels is limited, but that they can significantly increase the accuracy of the Synchronous Data Flow graph. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
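As background to the abstract above: any SDF analysis starts from the balance equations, which determine how often each actor must fire per graph iteration before added actors and channels (for contention, transfers, etc.) can be evaluated. The sketch below computes that repetition vector for a connected graph; it is standard SDF machinery, not the paper's contention-modeling technique, and the example graph is invented.

```python
from fractions import Fraction
from math import gcd

def repetition_vector(edges, actors):
    """Solve the SDF balance equations: for each channel (src, dst, p, c)
    with production rate p and consumption rate c, require
    reps[src] * p == reps[dst] * c. Returns the smallest integer solution.
    Assumes the graph is connected and consistent."""
    reps = {actors[0]: Fraction(1)}
    changed = True
    while changed:                      # propagate rates along channels
        changed = False
        for src, dst, p, c in edges:
            if src in reps and dst not in reps:
                reps[dst] = reps[src] * p / c
                changed = True
            elif dst in reps and src not in reps:
                reps[src] = reps[dst] * c / p
                changed = True
    # Scale all fractions to the smallest integer vector.
    lcm_den = 1
    for f in reps.values():
        lcm_den = lcm_den * f.denominator // gcd(lcm_den, f.denominator)
    return {a: int(f * lcm_den) for a, f in reps.items()}

# A produces 2 tokens, B consumes 3; B produces 2, C consumes 1.
# Per iteration: A fires 3, B fires 2, C fires 4.
r = repetition_vector([("A", "B", 2, 3), ("B", "C", 2, 1)], ["A", "B", "C"])
```

An inconsistent graph (no integer solution) would surface here as conflicting rates, which is why consistency is checked before throughput evaluation.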
17. WSCOM: Online Task Scheduling with Data Transfers.
- Author
-
Quintin, Jean-Noël and Wagner, Frédéric
- Abstract
This paper considers the online problem of task scheduling with communication. No information on tasks and communications is available in advance, except the DAG of the task topology. This situation is typically encountered when scheduling DAGs of tasks corresponding to Makefile executions. To tackle this problem, we introduce a new variation of the work-stealing algorithm: WSCOM. These algorithms take advantage of knowledge of the DAG topology to cluster communicating tasks together and reduce the total number of communications. Several variants are designed to overlap communication or optimize the graph decomposition. Performance is evaluated by simulation, and our algorithms are compared with off-line list-scheduling algorithms and classical work-stealing from the literature. Simulations are executed on both random graphs and a new trace archive of Makefile DAGs. These experiments validate the different design choices taken. In particular, we show that WSCOM is able to achieve performance close to off-line algorithms in most cases and even better performance in the event of congestion, thanks to reduced data transfer. Moreover, WSCOM can achieve the same high performance as classical work-stealing with up to ten times less bandwidth. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
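The work-stealing baseline that WSCOM builds on can be sketched as follows. This toy shows classical work-stealing over a task DAG: each worker pops from its own deque in LIFO order, steals the oldest task from a victim when idle, and enqueues a task's children on the worker that ran the parent. That last rule is the locality effect WSCOM's clustering heuristic exploits; the sketch is not the WSCOM algorithm itself, and the example DAG is invented.

```python
from collections import deque

def work_steal(dag, n_workers=2):
    """Run a task DAG ({task: [dependencies]}) with classical work-stealing.
    Returns the completion order and which worker ran each task."""
    indeg = {t: len(deps) for t, deps in dag.items()}
    children = {t: [] for t in dag}
    for t, deps in dag.items():
        for d in deps:
            children[d].append(t)
    deques = [deque() for _ in range(n_workers)]
    for t, deps in dag.items():
        if not deps:
            deques[0].append(t)          # seed roots on worker 0
    done, owner = [], {}
    while any(deques):
        for w in range(n_workers):
            if deques[w]:
                task = deques[w].pop()   # local LIFO pop (recent = related)
            else:
                victim = max(range(n_workers), key=lambda v: len(deques[v]))
                if not deques[victim]:
                    continue
                task = deques[victim].popleft()  # steal the oldest task
            owner[task] = w
            done.append(task)
            for c in children[task]:
                indeg[c] -= 1
                if indeg[c] == 0:
                    deques[w].append(c)  # child stays local to its parent
    return done, owner

done, owner = work_steal({"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]})
```

In this model every steal moves a task away from the worker holding its parent's data, which is exactly the communication cost WSCOM's DAG-aware clustering tries to avoid.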
18. Adaptive Data Transfers that Utilize Policies for Resource Sharing.
- Author
-
Gu, Junmin, Smith, David, Chervenak, Ann L., and Sim, Alex
- Abstract
With scientific data collected at unprecedented volumes and rates, the success of large scientific collaborations requires that they provide distributed data access with improved data access latencies and increased reliability to a large user community. The goal of the ADAPT (Adaptive Data Access and Policy-driven Transfers) project is to develop and deploy a general-purpose data access framework for scientific collaborations that provides fine-grained and adaptive data transfer management and the use of site and VO policies for resource sharing. This paper presents our initial design and implementation of an adaptive data transfer framework. We also present preliminary performance measurements showing that adaptation and policy improve network performance. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
19. Feedback for the European Data Protection Board (EDPB) in response to the public consultation on ‘Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data’
- Author
-
Gonzalez Fuster, Gloria, Drechsler, Laura, Faculty of Law and Criminology, Metajuridica, Law Science Technology and Society, and Brussels Centre for Urban Studies
- Subjects
transparency ,fundamental rights ,Information ,data transfers ,rights ,EDPB ,GDPR
- Abstract
The Court of Justice of the European Union (CJEU) emphasised in the ‘Schrems II’ judgment the importance of ensuring ‘effective and enforceable rights and effective administrative and judicial redress’ for data subjects whenever personal data about them are transferred to third countries. The availability of ‘(e)nforceable data subject rights and effective legal remedies for data subjects’ is actually a necessary condition for data exporters to rely on any transfer mechanisms under Article 46 of the General Data Protection Regulation (GDPR). Data subject rights and remedies, however, can only be effectively available to individuals if the data subjects are aware of the existence of such rights and remedies, and if they know how to exercise them. The CJEU has previously stressed that information about data processing directly affects the exercise of data subject rights, and that information obligations imposed on data controllers – beyond being directly connected to the principle of transparency – are also necessary to comply with the principle of fairness.
- Published
- 2020
20. Feedback to the European Commission regarding ‘Data protection – standard contractual clauses for transferring personal data to non-EU countries (implementing act)’
- Author
-
Gonzalez Fuster, Gloria, Drechsler, Laura, Faculty of Law and Criminology, Metajuridica, Law Science Technology and Society, and Brussels Centre for Urban Studies
- Subjects
standard contractual clauses ,data transfers ,GDPR ,SCC
- Abstract
In this document we provide a series of recommendations regarding the wording of the proposed Commission Implementing Decision on standard contractual clauses for the transfer of personal data to third countries pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council and its Annex. Concretely, we focus on recommendations for the proposed Annex (Annex to the Commission Implementing Decision on standard contractual clauses for the transfer of personal data to third countries pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council), referring when relevant to the Implementing Decision.
- Published
- 2020
21. THE 'TEMPTATIONS' OF THE EUROPEAN AND NATIONAL CLOUD: AMID POLITICAL SIMPLIFICATION AND LEGAL CRITICISM
- Author
-
BOLOGNINI, Luca and PELINO, Enrico
- Subjects
data transfers ,cloud ,e-evidence ,Schrems II ,security ,national security ,GDPR ,cloud act ,privacy ,Court of Justice of the European Union
- Abstract
The race to create local cloud solutions has become a constant in the digital development programs of member States. Italy is promoting the establishment of the so-called "national cloud", and Germany and France have been working for some time on the Gaia-X project. At European level, the development of cloud computing is taking a strategic role, at least for the immediate future. The declared objective is to free us from solutions that today are almost entirely dependent on infrastructures made available by international providers. Contributing to the debate - more by superimposition than composition - there are broadly geopolitical motivations, aspirations to global technological predominance and concerns associated with personal data protection for third party interference due to the extraterritorial application of foreign legislation. The recent "Schrems II" ruling by the Court of Justice of the European Union, and previously, but with less impact in the broader public, the results of the joint EDPB (European Data Protection Board)-EDPS (European Data Protection Supervisor) study on the United States' Cloud Act, have forced the question of the international acquisition of personal data flows (and non-personal, we may add) and the associated assurances. This paper intends to offer an analysis of the subject, aimed above all at unravelling the multiple levels of the questions raised, which touch not only on legal but on political matters and give an initial, reasoned census of the various bodies of applicable law. 
Above and beyond hard-hitting declarations, we need to determine to what point the independence of a local European cloud is effectively possible or desirable compared to non-EU providers and, in more concrete terms, to what point an autonomous solution is economically and technically practicable in terms of services that are essential for the States, and that enable the exercise of other fundamental rights and freedoms, for individuals, and therefore must not be susceptible to impairment or interruption. We also have to understand to what point it would be opportune in terms of security, even if this, at first glance, may seem counter-intuitive. In this sense, at least in the overall assessment, we have to consider the high levels of service stability and the existence of high-tech measures to contrast cybercrime that the major international providers can ensure, levels it would be inadvisable to forgo if it were possible to keep the benefits, thereby significantly reducing the associated risks. Indeed, the protection of processed data from external interference must also be measured on a different, but nonetheless significant, level to that of any potential "extraterritorial" threat, represented by criminal activity and security breaches. Another factor that must be considered is the availability of solutions, already entirely actual and "in the hands" of users, which offer immediate protection, such as encryption or the segregation of strategic data sets. In other words, if there are possible measures that significantly reduce the extraterritorial risk and that retain the advantages in terms of contrasting cybercrime, these must be duly included in the overall assessment. Our analysis will also shed light on inconsistencies in terms of data protection within the European Union, and misalignments between national efforts toward localization and the principle of the free circulation of data. 
In short, we intend to offer a more articulate, less obvious outline of the subjects we are dealing with, which cannot be reduced (if not at the cost of excessive simplification) to the simple contraposition of E.U. versus non-EU, but which reveal lateral synergies and joint misalignments on fronts apparently united; rather, it would be better to think in terms of the creation of a shared ecosystem that draws the most significant advantage from the solutions available today and acknowledges the need to adopt concrete forms of protection. This does not in any way mean evading the serious, but not immediately solvable extraterritorial questions, but preferably using them, if anything, as a mechanism for obtaining a critical reconsideration of the shortcomings and lack of legal harmonization that emerge even within the European Union. Nor does it mean embarking on a path toward domestic solutions that offer little, or at least less protection than those currently available, but instead maintaining high levels of protection against unlawful activity through more advanced technological solutions, and at the same time identifying legal instruments which assure greater national control of the infrastructures and data, and reduce risk to a legally acceptable level.
- Published
- 2020
- Full Text
- View/download PDF
22. Flux transfrontaliers de données, le RGPD et la gouvernance des données (29 Wash. Int'l L.J. 485 (2020))
- Author
-
Voss, W. Gregory and Toulouse Business School (TBS)
- Subjects
transferts de données ,Conformité ,Privacy Shield ,personal data ,CCPA ,Gouvernance des données ,Safe Harbor ,compliance ,RGPD Règlement Européen sur la Protection des Données ,Localisation des données ,[SHS.DROIT]Humanities and Social Sciences/Law ,sphère de sécurité ,Bouclier de protection de la vie privée ,data transfers ,flux transfrontaliers de données ,data localization ,GDPR ,Cross-border data flows ,data governance ,Données personnelles
- Abstract
International audience; Today, cross-border data flows are an important component of international trade and an element of digital service models. However, they are impeded by restrictions on cross-border personal data transfers and by data localization legislation. This Article focuses primarily on these complexities and on the impact of the new European Union ("EU") legislation on personal data protection, the GDPR. First, this Article introduces its discussion of these flows by placing them in their economic and geopolitical setting, including a discussion of the consequences of a lack of international harmonization of law in the area. In this framework, rule overlap and rival standards are relevant. Once this situation is established, this Article turns to an analysis of the legal measures that have filled the gap left by the lack of international regulation and the failure to harmonize law: extraterritorial laws in the European Union (regional legislation) and the United States (state legislation); and data localization laws in China and Russia. Specific provisions restricting cross-border personal data transfers are detailed under EU legislation, as are the international agreements that have been invaluable in allowing flows between the United States and the European Union to continue: first the Safe Harbor, and now the Privacy Shield. Finally, in this context, the role of data governance is investigated, both in the context of data controllers’ accountability for the actions of other actors in global supply chains under EU law and under the Privacy Shield. Thus, this Article goes beyond the law itself, to place requirements in the context of the globalized business world of data flows, and to suggest ways that companies may improve their compliance position worldwide. (The record also carries a French translation of this abstract, omitted here as a duplicate.)
- Published
- 2020
23. Cross-Border Data Flows, the GDPR, and Data Governance (29 Wash. Int'l L.J. 485 (2020))
- Author
-
Voss, W. Gregory and Toulouse Business School (TBS)
- Subjects
transferts de données ,Conformité ,Privacy Shield ,personal data ,CCPA ,Gouvernance des données ,Safe Harbor ,compliance ,RGPD Règlement Européen sur la Protection des Données ,Localisation des données ,[SHS.DROIT]Humanities and Social Sciences/Law ,sphère de sécurité ,Bouclier de protection de la vie privée ,data transfers ,flux transfrontaliers de données ,data localization ,GDPR ,Cross-border data flows ,data governance ,Données personnelles - Abstract
International audience; Today, cross-border data flows are an important component of international trade and an element of digital service models. However, they are impeded by restrictions on cross-border personal data transfers and data localization legislation. This Article focuses primarily on these complexities and on the impact of the new European Union ("EU") legislation on personal data protection, the GDPR. First, this Article introduces its discussion of these flows by placing them in their economic and geopolitical setting, including a discussion of the results of a lack of international harmonization of law in the area. In this framework, rule overlap and rival standards are relevant. Once this situation is established, this Article turns to an analysis of the legal measures that have filled the gap left by the lack of international regulation and the failure to harmonize law: extraterritorial laws in the European Union (regional legislation) and the United States (state legislation); and data localization laws in China and Russia. Specific provisions restricting cross-border personal data transfers are detailed under EU legislation, as are the international agreements that have been invaluable in allowing flows between the United States and the European Union to continue, first the Safe Harbor and now the Privacy Shield. Finally, in this context, the role of data governance is investigated, both in the context of data controllers' accountability for the actions of other actors in global supply chains under EU law and under the Privacy Shield. Thus, this Article goes beyond the law itself, to place requirements in the context of the globalized business world of data flows, and to suggest ways that companies may improve their compliance position worldwide.
- Published
- 2020
24. Data transport between visualization web services for medical image analysis.
- Author
-
Koulouzis, Spiros, Zudilova-Seinstra, Elena, and Belloum, Adam
- Subjects
SOFTWARE visualization ,VASCULAR diseases ,WEBSITES ,IMAGE analysis - Abstract
Abstract: With the rapid development of IT, data-driven experiments and simulations have become very advanced and complicated, and the amount and complexity of scientific data increase exponentially. Nowadays the challenge is to efficiently support scientists who generate more data than they can possibly look at and understand. This requires not only high-performance visualization and feature extraction techniques but also efficient and intuitive management of available data and underlying computational resources. Consequently, Service Oriented Architectures and workflow management systems are becoming popular solutions for the deployment of e-Science infrastructures aimed at assisting in the exploration and analysis of large scientific data. In this paper, we compare two transport models of workflow execution for data-intensive medical visualizations that rely on web services. The image-based analysis of vascular disorders served as a case study for this project. We applied a service-oriented approach to construct distributed visualization pipelines, which allow visualization experts to develop visualizations to view and interact with large medical data sets. Moreover, end-users (i.e., medical specialists) can explore these visualizations irrespective of their geographical location and available computing resources. The paper reports on the current implementation status and presents our main findings. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
25. A framework for reconfigurable computing: task scheduling and context management-a summary.
- Author
-
Maestre, R., Kurdahi, F.J., Fernandez, M., Hermida, R., Bagherzadeh, N., and Singh, H.
- Abstract
Reconfigurable computing is consolidating itself as a real alternative to ASICs (Application Specific Integrated Circuits) and general-purpose processors. The main advantage of reconfigurable computing derives from its unique combination of broad applicability, provided by the reconfiguration capability, and achievable performance, through the potential parallelism exploitation. The key aspects of the scheduling problem in a reconfigurable architecture are discussed, focusing on a task scheduling methodology for DSP and multimedia applications, as well as the context management and scheduling optimizations. [ABSTRACT FROM PUBLISHER]
- Published
- 2002
- Full Text
- View/download PDF
26. Moving Just Enough Deep Sequencing Data to Get the Job Done
- Author
-
Ethan M. Bensman, William L. Poehlman, Nicholas Mills, Walter B. Ligon, and F. Alex Feltus
- Subjects
FASTQ format ,0303 health sciences ,Computer science ,Applied Mathematics ,RNA-Seq ,Computational biology ,high-throughput DNA sequencing ,FASTQ ,Biochemistry ,DNA sequencing ,Deep sequencing ,Computer Science Applications ,03 medical and health sciences ,Computational Mathematics ,High-Throughput DNA Sequencing ,0302 clinical medicine ,ComputingMethodologies_PATTERNRECOGNITION ,lcsh:Biology (General) ,data transfers ,Molecular Biology ,lcsh:QH301-705.5 ,030217 neurology & neurosurgery ,030304 developmental biology ,Original Research - Abstract
Motivation: As the size of high-throughput DNA sequence datasets continues to grow, the cost of transferring and storing the datasets may prevent their processing in all but the largest data centers or commercial cloud providers. To lower this cost, it should be possible to process only a subset of the original data while still preserving the biological information of interest. Results: Using 4 high-throughput DNA sequence datasets of differing sequencing depth from 2 species as use cases, we demonstrate the effect of processing partial datasets on the number of detected RNA transcripts using an RNA-Seq workflow. We used transcript detection to decide on a cutoff point. We then physically transferred the minimal partial dataset and compared with the transfer of the full dataset, which showed a reduction of approximately 25% in the total transfer time. These results suggest that as sequencing datasets get larger, one way to speed up analysis is to simply transfer the minimal amount of data that still sufficiently detects biological signal. Availability: All results were generated using public datasets from NCBI and publicly available open source software.
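The subset-then-transfer idea above can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the helper name `head_fastq` and the assumption of uncompressed 4-line FASTQ records are mine.

```python
def head_fastq(lines, n_reads):
    """Return the lines making up the first n_reads FASTQ records.

    Assumes the conventional 4-line FASTQ record layout (header, sequence,
    separator, qualities); only this prefix would then be transferred.
    """
    out = []
    for i, line in enumerate(lines):
        if i >= 4 * n_reads:
            break  # enough records collected for the partial dataset
        out.append(line)
    return out
```

Transferring only the prefix that still detects the transcripts of interest is the step that yields the roughly 25% transfer-time reduction reported above.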
- Published
- 2019
27. From Big Data to Fast Data: Efficient Stream Data Management
- Author
-
Costan, Alexandru, Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES), Scalable Storage for Clouds and Beyond (KerData), Inria Rennes – Bretagne Atlantique, Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-SYSTÈMES LARGE ÉCHELLE (IRISA-D1), Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Université de Bretagne Sud (UBS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National de Recherche en Informatique et en Automatique (Inria)-École normale supérieure - Rennes (ENS Rennes)-Centre National de la Recherche Scientifique (CNRS)-Université de Rennes 1 (UR1), Université de Rennes (UNIV-RENNES)-CentraleSupélec-IMT Atlantique Bretagne-Pays de la Loire (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Bretagne Sud (UBS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-École normale supérieure - Rennes (ENS Rennes)-Centre National de la Recherche Scientifique (CNRS)-Université de Rennes 1 (UR1), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT), ENS Rennes, Christian Pérez, Institut National des Sciences Appliquées (INSA), Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-Institut National de Recherche en Informatique et en Automatique (Inria)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), and Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique)
- Subjects
Big Data ,analyse de données ,[INFO.INFO-DB]Computer Science [cs]/Databases [cs.DB] ,gestion de données ,workflow management ,transfert de données ,[INFO.INFO-DS]Computer Science [cs]/Data Structures and Algorithms [cs.DS] ,gestion de métadonnées ,données massives ,storage ,transactions ,in-transit processing ,[INFO.INFO-IR]Computer Science [cs]/Information Retrieval [cs.IR] ,HPC ,data transfers ,traitement de flux ,data management ,metadata management ,[INFO.INFO-DC]Computer Science [cs]/Distributed, Parallel, and Cluster Computing [cs.DC] ,data analytics ,stream processing - Published
- 2019
28. Data protection certification mechanisms
- Subjects
Standards ,certification ,marks ,data transfers ,GDPR ,privacy ,accreditation ,conformity assessment ,seals - Published
- 2019
29. Data protection certification mechanisms
- Subjects
Standards ,certification ,marks ,data transfers ,GDPR ,privacy ,accreditation ,conformity assessment ,seals - Published
- 2019
- Full Text
- View/download PDF
30. Data protection certification mechanisms: Study on Articles 42 and 43 of the Regulation (EU) 2016/679
- Author
-
Kamara, Irene, Leenes, R., Lachaud, Eric, Stuurman, Kees, Lieshout, Marc Van, Bodea, Gabriela, TILT, Faculty of Law and Criminology, Metajuridica, and Law Science Technology and Society
- Subjects
Standards ,certification ,marks ,data transfers ,GDPR ,privacy ,accreditation ,conformity assessment ,seals - Published
- 2019
31. Data transfers in networks.
- Author
-
Choi, Hyeong and Hakimi, S.
- Abstract
The scheduling of the transfer of backlogged data in a network to minimize the finishing time is studied. The most complete treatment (of a version) of the problem is due to Gopal, Bongiovanni, Bonucelli, Tang, and Wong, who attacked the problem using the Birkhoff-von Neumann theorem. However, these authors do not provide a complexity analysis of their algorithm. In this paper we solve the version of these authors as well as a more difficult version of this scheduling problem by formulating them as a continuous form of the Hakimi-Kariv-de Werra generalization of the edge-coloring problem in bipartite graphs. This leads to polynomial time algorithms for these problems. Furthermore, our solution of the previously solved version has the desirable feature of having a tighter bound for the number of 'communication modes' than the solution of the above authors. In the above scheduling problem, there may be a time associated with changing from one set of simultaneous data transfers (i.e., a communication mode) to another. It is shown that if the overall finishing time of our schedule includes these times, then even very simple instances of our problem become NP-hard. However, approximation algorithms are presented which produce solutions whose finishing times are at most twice the optimal. Finally, in the above scheduling problem the interruption (or pre-emption) of the performance of each task is permitted. Essentially, the same problem when pre-emption is not permitted was studied by Coffman, Garey, Johnson, and LaPaugh. The relation between the two problems is explored. [ABSTRACT FROM AUTHOR]
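The Birkhoff-von Neumann-style attack mentioned above can be made concrete with a small sketch: repeatedly find a perfect matching on the positive entries of the traffic matrix (one exists whenever row and column sums are equal), run that matching as one 'communication mode', and subtract its duration. This is a minimal greedy illustration under that equal-sums assumption, not the authors' edge-coloring algorithm:

```python
def perfect_matching_on_support(T):
    """Kuhn's augmenting-path matching restricted to positive entries of T.
    Returns match[i] = j, meaning sender i is paired with receiver j."""
    n = len(T)
    match_to = [-1] * n  # receiver j -> sender i currently matched to it

    def try_assign(i, seen):
        for j in range(n):
            if T[i][j] > 0 and j not in seen:
                seen.add(j)
                if match_to[j] == -1 or try_assign(match_to[j], seen):
                    match_to[j] = i
                    return True
        return False

    for i in range(n):
        if not try_assign(i, set()):
            raise ValueError("no perfect matching on the positive entries")
    match = [0] * n
    for j, i in enumerate(match_to):
        match[i] = j
    return match


def bvn_schedule(traffic):
    """Greedily decompose a traffic matrix with equal row and column sums
    into 'communication modes': (duration, matching) pairs."""
    T = [row[:] for row in traffic]
    n = len(T)
    modes = []
    while any(x > 0 for row in T for x in row):
        match = perfect_matching_on_support(T)
        d = min(T[i][match[i]] for i in range(n))  # hold this mode for d units
        for i in range(n):
            T[i][match[i]] -= d
        modes.append((d, match))
    return modes
```

For such a balanced matrix the mode durations add up to the common row/column sum, and each iteration zeroes at least one entry, so the loop terminates.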
- Published
- 1988
- Full Text
- View/download PDF
32. Mobility as an Alternative Communication Channel: A Survey
- Author
-
Benjamin Baron, Prométhée Spathis, Mostafa H. Ammar, Yannis Viniotis, Marcelo Dias de Amorim, Networks and Performance Analysis (NPA), LIP6, Sorbonne Université (SU)-Centre National de la Recherche Scientifique (CNRS)-Sorbonne Université (SU)-Centre National de la Recherche Scientifique (CNRS), Department of Electrical and Computer Engineering [NC State University], North Carolina State University [Raleigh] (NC State), University of North Carolina System (UNC)-University of North Carolina System (UNC), School of Electrical and Computer Engineering - Georgia Insitute of Technology (ECE GeorgiaTech), Georgia Institute of Technology [Atlanta], Networks and Performance Analysis ( NPA ), Laboratoire d'Informatique de Paris 6 ( LIP6 ), Université Pierre et Marie Curie - Paris 6 ( UPMC ) -Centre National de la Recherche Scientifique ( CNRS ) -Université Pierre et Marie Curie - Paris 6 ( UPMC ) -Centre National de la Recherche Scientifique ( CNRS ), Centre National de la Recherche Scientifique ( CNRS ), North Carolina State University [Raleigh] ( NCSU ), and School of Electrical and Computer Engineering - Georgia Insitute of Technology ( ECE GeorgiaTech )
- Subjects
Research literature ,FOS: Computer and information sciences ,Ad-Hoc Networks ,Bridging (networking) ,Wireless ad hoc network ,Computer science ,[ INFO.INFO-NI ] Computer Science [cs]/Networking and Internet Architecture [cs.NI] ,02 engineering and technology ,Computer Science - Networking and Internet Architecture ,[INFO.INFO-NI]Computer Science [cs]/Networking and Internet Architecture [cs.NI] ,0202 electrical engineering, electronic engineering, information engineering ,Electrical and Electronic Engineering ,Disruption-Tolerant Networks ,Data Transfers ,Networking and Internet Architecture (cs.NI) ,Mobility ,business.industry ,Offloading ,020206 networking & telecommunications ,Challenged Networks ,Software deployment ,020201 artificial intelligence & image processing ,Train ,The Internet ,business ,Computer network ,Data transmission - Abstract
International audience; We review the research literature investigating systems in which mobile entities can carry data while they move. These entities can be either mobile by nature (e.g., human beings and animals) or mobile by design (e.g., trains, airplanes, and cars). The movements of such entities equipped with storage capabilities create a communication channel which can help overcome the limitations or the lack of conventional data networks. Common limitations include the mismatch between the capacity offered by these networks and the traffic demand or their limited deployment owing to environmental factors. Application scenarios include offloading traffic off legacy networks for capacity improvement, bridging connectivity gaps, or deploying ad hoc networks in challenging environments for coverage enhancement.
- Published
- 2018
- Full Text
- View/download PDF
33. Data Protection Certification in the EU
- Author
-
Kamara, Irene, De Hert, Paul, Rodrigues, Rowena, Papakonstantinou, Vagelis, Metajuridica, Faculty of Law and Criminology, Law Science Technology and Society, University of Brussels - European Criminal Law, and Fundamental rights centre
- Subjects
certification ,EDBP ,data transfers ,marks ,GDPR ,accreditation ,Data protection ,seals - Abstract
Certification and seals as a form of co-regulation have been on the EU agenda for over a decade. Enhancing consumer trust and promoting transparency and compliance are central arguments in the policy endorsement for certification. In the field of data protection, the General Data Protection Regulation has considerably substantiated these policy objectives of the European Commission. Our contribution discusses the new EU legal regime for data protection certification. Starting from the background of data protection certification and the preparatory works of the General Data Protection Regulation, the chapter analyses the legal provisions in the new EU data protection framework and reflects on the steps after the Regulation starts to apply.
- Published
- 2018
34. Cloud computing and data processing: sorting out legal requirements
- Author
-
Konstantinou, Ioulia, Kamara, Irene, Kumar, Vimal, Ko, Ryan, Chaisiri, Sivadon, Metajuridica, Faculty of Law and Criminology, and Law Science Technology and Society
- Subjects
accountability ,personal data ,cloud computing ,data transfers ,GDPR ,data security - Published
- 2017
35. Cyprus country report: Anti-doping and data protection legislation
- Author
-
Kamara, Irene, TILT, Faculty of Law and Criminology, Metajuridica, and Law Science Technology and Society
- Subjects
sensitive data ,personal data ,data transfers ,health data ,anti-doping - Published
- 2017
36. Cyprus country report
- Subjects
sensitive data ,personal data ,data transfers ,health data ,anti-doping - Published
- 2017
37. Cyprus country report
- Subjects
sensitive data ,personal data ,data transfers ,health data ,anti-doping - Published
- 2017
38. 'And [they] built a crooked h[arbour]' - the Schrems ruling and what it means for the future of data transfers between the EU and US
- Author
-
João Filipe Monteiro Marques and Universidade do Minho
- Subjects
Engineering ,business.industry ,Safe harbour ,Decision 2000/520 ,Computer security ,computer.software_genre ,Maximillian Schrems v. Data Protection Commissioner ,Data transfers ,Adequate protection ,Ciências Sociais::Direito ,business ,Telecommunications ,computer ,Direito [Ciências Sociais] - Abstract
Safe Harbour (henceforth, SH) has been the main enabler of EU-US personal data transfers since Decision 2000/520/EC came into force. Initially, Safe Harbour was seen as an innovative solution to a difficult problem. However, the problems the agreement was created to solve were never remedied. Thus, it did not come as a surprise that the Court of Justice of the European Union (hereinafter, CJEU), in Case C-362/14 (the Schrems ruling), deemed the agreement invalid. In the story "And he built a crooked house", the infamous 'crooked house', designed by Robert A. Heinlein's character Quintus Teal, mirrors SH's flawed design. It also exemplifies the fact that great innovations can fail if not thought through carefully. Although the Schrems ruling's scope does not go beyond Decision 2000/520/EC, it will force European data protection agencies to look deeper into alternative data transfer mechanisms and possibly to consider transfers to jurisdictions other than the US. Furthermore, this decision highlights the fact that if any progress is to be made on personal data transfers going forward, any solution(s) would have to be agreed at a global level. This paper provides an overview of the implications of the CJEU ruling on data transfers between the EU and the US going forward.
- Published
- 2016
- Full Text
- View/download PDF
39. Moving Just Enough Deep Sequencing Data to Get the Job Done.
- Author
-
Mills, Nicholas, Bensman, Ethan M, Poehlman, William L, Ligon III, Walter B, and Feltus, F Alex
- Subjects
- *
NUCLEOTIDE sequence , *OPEN source software , *SERVER farms (Computer network management) - Abstract
Motivation: As the size of high-throughput DNA sequence datasets continues to grow, the cost of transferring and storing the datasets may prevent their processing in all but the largest data centers or commercial cloud providers. To lower this cost, it should be possible to process only a subset of the original data while still preserving the biological information of interest. Results: Using 4 high-throughput DNA sequence datasets of differing sequencing depth from 2 species as use cases, we demonstrate the effect of processing partial datasets on the number of detected RNA transcripts using an RNA-Seq workflow. We used transcript detection to decide on a cutoff point. We then physically transferred the minimal partial dataset and compared with the transfer of the full dataset, which showed a reduction of approximately 25% in the total transfer time. These results suggest that as sequencing datasets get larger, one way to speed up analysis is to simply transfer the minimal amount of data that still sufficiently detects biological signal. Availability: All results were generated using public datasets from NCBI and publicly available open source software. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
40. If America Doesn't Set the Digital Trade Agenda, China Will.
- Author
-
Zoellick, Robert B.
- Published
- 2022
41. Data integrity in regulated bioanalysis: a summary from the European Bioanalysis Forum Workshop in collaboration with the MHRA.
- Author
-
Arfvidsson C, Bedaf DV, Doig M, Globig S, Knutsson M, Lewis M, McDougall S, Michi M, Mokrzycki N, and Timmerman P
- Subjects
- Data Accuracy, Government Regulation, Pharmaceutical Preparations standards, Quality Control, Biological Assay standards, Pharmaceutical Preparations analysis
- Abstract
In this conference report, we summarize the main findings and messages from a workshop on 'Data Integrity'. The workshop was held at the 11th European Bioanalysis Forum (EBF) Open Symposium in Barcelona (21-23 November 2018), in collaboration with the Medicines and Healthcare products Regulatory Agency, to provide insight into and understanding of regulatory data integrity expectations. The workshop highlighted the importance of engaging with software developers to address the gap between industry's data integrity needs and current system software capabilities. Delegates were also made aware of the importance of implementing additional procedural controls to mitigate the risk associated with using systems that do not fully meet data integrity requirements.
- Published
- 2019
- Full Text
- View/download PDF
42. Assessment Methodology for the General Principles for Credit Reporting
- Author
-
International Committee on Credit Reporting
- Subjects
DATA STORAGE ,FINANCIAL SECTOR DEVELOPMENT ,ELECTRONIC DATA ,GENERAL PUBLIC ,DESCRIPTION ,IDENTIFICATION NUMBER ,VERIFICATION ,BINDING ,INTERNATIONAL SETTLEMENTS ,WEBSITES ,ENFORCEMENT PROCESS ,RETENTION ,INTERNATIONAL SETTLEMENT ,CREDIT DECISION ,INFORMATION INDUSTRY ,FINANCIAL INFRASTRUCTURE ,INTERNATIONAL STANDARDS ,CONTENTS ,SITE ,ABBREVIATIONS ,LICENSES ,INSTITUTIONAL FRAMEWORK ,CREDIT REPORTING SYSTEMS ,COLLATERAL ,LEVEL OF ADOPTION ,SERVICE PROVIDERS ,CONFIDENTIALITY ,FRAUD ,RELIABILITY ,ENFORCEMENT MECHANISMS ,CONSUMER PROTECTION ,E-MAIL ,USERS ,BANK FOR INTERNATIONAL SETTLEMENTS ,RISK MANAGEMENT ,TRANSPARENCY ,PERSONAL DATA ,SERVICE CONTRACT ,COMPETITIVE CREDIT ,PROTECTION OF DATA ,REGULATORY AUTHORITIES ,DATA ACCESS ,BANKRUPTCIES ,REGISTRIES ,VALUE ADDED SERVICES ,SERVICE PROVIDER ,DISPUTE RESOLUTION ,ID ,OUTSTANDING AMOUNT ,BUSINESS PROCESSES ,FUNCTIONALITIES ,DEFAULTS ,INTERNAL CONTROLS ,SUPERVISION ,ORIGINAL AMOUNT ,DATA PROCESSING ,ARREARS ,CREDIT RISK ,COPYRIGHT ,REGULATORY AUTHORITY ,CAPABILITIES ,COLLECTION OF DATA ,OWNERSHIP STRUCTURE ,COMMERCIAL CREDIT ,CENTRAL BANK ,TELEPHONE ,DISCLOSURE ,DATA PRIVACY ,IDENTIFICATION INFORMATION ,PRIVACY ISSUES ,COMMUNICATIONS PROTOCOL ,NATIONAL CREDIT ,TECHNOLOGY INFRASTRUCTURE ,AUDITS ,BUSINESS CONTINUITY ,DEFAULT INFORMATION ,TERMINOLOGY ,CONSUMER RIGHTS ,RESULT ,ACCESS TO INFORMATION ,ARRANGEMENT ,MARKET DEVELOPMENT ,CREDIT INFORMATION ,FINANCIAL INTEGRATION ,USER ,REGULATORY FRAMEWORK ,PENALTIES ,SECURITY POLICY ,FINANCIAL SYSTEM ,LEGAL PROVISIONS ,RESERVE ,FINANCIAL INSTITUTIONS ,BEST PRACTICES ,FINANCIAL STABILITY ,CREDIT BUREAUS ,INTERNATIONAL FINANCIAL INSTITUTIONS ,CREDIT REPORTING ,EXTERNAL AUDITORS ,REGULATORY BARRIERS ,PUBLIC RECORD ,CREDIT REPORTING SYSTEM ,MARKET CONDITIONS ,FINANCIAL INSTITUTION ,HUMAN RESOURCES ,DATA ELEMENTS ,AUTOMATION ,LOAN ,BUSINESS INFORMATION ,EFFECTIVE GOVERNANCE ,INTERNAL SYSTEM ,CONSUMER CREDIT ,MATURITY ,INTERNATIONAL BANK ,PUBLIC RECORDS ,ARRANGEMENTS ,PERSONAL DATA 
PROTECTION ,MONETARY FUND ,MATERIAL ,CENTRAL BANKS ,DATA TRANSFER ,DEVELOPMENT BANK ,SUBSIDIARY ,OBJECT ,RESULTS ,DOCUMENTS ,LEGAL FRAMEWORK ,LEGAL ENVIRONMENT ,STANDARDIZATION ,LAWS ,COMMUNICATIONS PROTOCOLS ,QUERIES ,CREDIT MARKET ,DISCLOSURES ,DATA TRANSFERS ,LIENS ,PHYSICAL SECURITY ,ACCOUNTABILITY ,SYSTEM FAILURES ,INFORMATION SECURITY - Published
- 2013
43. Mathematical analysis of random telegraph noise in low-power applications of MOSFETs
- Author
-
Engert, Sonja, Toepfer, Hannes, Granzner, Ralf, and Schwierz, Frank
- Subjects
MOSFET ,random telegraph noise ,data transfers ,datové transfery ,matematická analýza ,náhodný telegrafní šum ,mathematical analysis - Abstract
Silicon MOSFETs are active switching elements that form the basis of most currently available digital circuits. Especially in information technology, the growth in circuit complexity and data throughput leads to an increase in power consumption. For this reason, the energy efficiency of transistors has become a major design issue. A common way to increase the energy efficiency of these devices is to reduce the signal level, which ultimately leads to so-called sub-threshold operation. An advantage of this approach is that it can be applied to common structures, so new device concepts are not necessarily required. However, the reduction in the signal level unavoidably degrades the signal-to-noise ratio. Random telegraph noise in particular has a significant influence on circuit behavior. In this work this type of noise is studied mathematically. A stochastic simulation model was developed, together with a method for determining how low the supply voltage can be set while keeping the noise influence sufficiently low.
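The stochastic simulation described above can be sketched as a two-state process with exponentially distributed dwell times. This is a toy model of random telegraph noise; the parameter names and sampling scheme are illustrative, not the thesis's model:

```python
import random

def simulate_rtn(t_total, tau_capture, tau_emission, dt, seed=0):
    """Simulate a two-state random telegraph noise signal sampled every dt.

    The trap alternates between empty (0) and filled (1); dwell times are
    exponentially distributed with means tau_capture and tau_emission.
    """
    rng = random.Random(seed)
    n = int(round(t_total / dt))
    state = 0                                    # 0 = trap empty, 1 = filled
    t_next = rng.expovariate(1.0 / tau_capture)  # time of the first capture
    samples = []
    for k in range(n):
        t = k * dt
        while t >= t_next:                       # apply all flips up to time t
            state ^= 1
            mean = tau_emission if state else tau_capture
            t_next += rng.expovariate(1.0 / mean)
        samples.append(state)
    return samples
```

Scaling such a trace by the trap's current amplitude and adding it to a sub-threshold signal is one way to probe how far the supply voltage can be lowered before the noise dominates.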
- Published
- 2013
44. A Motion-aware Data Transfers Scheduling for Distributed Virtual Walkthrough Applications
- Author
-
Pribyl, J., Pavel Zemcik, Burian, T., Kudlac, B., Oliviera, Manuel M., and Skala, Václav
- Subjects
pohybová funkce ,distributed virtual walkthrough ,motion function ,virtual environments ,Markov chain ,data transfers ,distribuovaný virtuální průchod ,přenosy dat ,Markovův řetězec ,virtuální prostředí - Abstract
Data transfer scheduling is an important part of almost all distributed virtual walkthrough (DVW) applications. Its main purpose is to preserve data transfer efficiency and rendering quality during scene exploration. The most limiting factors here are network restrictions such as low bandwidth and high latency. Current scheduling algorithms use multi-resolution data representation, priority determination, and data prefetching algorithms to mitigate these restrictions. Advanced priority determination and data prefetching methods for DVW applications use a mathematical description of motion to predict the next position of each individual user. These methods depend on the recent motion of a user, so they can accurately predict only nearby locations. In the case of sudden but regular changes in user motion direction (road networks) or a fast-moving user, these algorithms are not sufficient to predict the future position with the required accuracy and at the required distances. In this paper we propose a systematic solution to scheduling of data transfers for DVW applications which uses next-location prediction methods to compute download priority or additionally prefetch rendered data in advance. Experiments show that, compared to motion functions, the proposed scheduling scheme can increase data transfer efficiency and rendered image quality during scene exploration.
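The next-location prediction behind such prefetching can be sketched with a first-order Markov chain over scene zones. This is a simplified stand-in for the paper's Markov-chain predictor; the zone identifiers and API are illustrative:

```python
from collections import defaultdict

class MarkovPrefetcher:
    """First-order Markov-chain next-zone predictor (a sketch of the
    prediction step behind prefetch scheduling)."""

    def __init__(self):
        # counts[prev][next] = number of observed prev -> next transitions
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, prev_zone, next_zone):
        """Record one observed transition between zones."""
        self.counts[prev_zone][next_zone] += 1

    def predict(self, zone):
        """Most frequently observed successor of zone, or None if unseen."""
        nxt = self.counts.get(zone)
        if not nxt:
            return None
        return max(nxt, key=nxt.get)
```

The scheduler would then raise the download priority of, or prefetch outright, the data for the predicted zone.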
- Published
- 2013
45. Beyond Do Loops: Data Transfer Generation with Convex Array Regions
- Author
-
Béatrice Creusillet, Serge Guelton, Mehdi Amini, Département informatique (INFO), Université européenne de Bretagne - European University of Brittany (UEB)-Télécom Bretagne-Institut Mines-Télécom [Paris] (IMT), Centre de Recherche en Informatique (CRI), MINES ParisTech - École nationale supérieure des mines de Paris, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL), Silkan, and silkan
- Subjects
For loop ,Computer science ,Regular polygon ,GPU ,[INFO.PAR]Computer Science [cs]/domain_info.par ,02 engineering and technology ,Parallel computing ,020202 computer hardware & architecture ,Control flow ,data transfers ,0202 electrical engineering, electronic engineering, information engineering ,Leverage (statistics) ,020201 artificial intelligence & image processing ,Code generation ,While loop ,[INFO.INFO-DC]Computer Science [cs]/Distributed, Parallel, and Cluster Computing [cs.DC] ,convex array regions ,redundant transfer elimination ,Data transmission - Abstract
15 pages; International audience; Automatic data transfer generation is a critical step for guided or automatic code generation for accelerators using distributed memories. Although good results have been achieved for loop nests, more complex control flows such as switches or while loops are generally not handled. This paper shows how to leverage the convex array regions abstraction to generate data transfers. The scope of this study ranges from inter-procedural analysis in simple loop nests with function calls, to inter-iteration data reuse optimization and arbitrary control flow in loop bodies. Generated transfers are approximated when an exact solution cannot be found. Array regions are also used to extend redundant load store elimination to array variables. The approach has been successfully applied to GPUs and domain-specific hardware accelerators.
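The convex array regions idea can be illustrated with a deliberately naive sketch: over-approximate the set of array elements a loop touches by a single interval and issue one contiguous transfer for it. Real array regions are computed symbolically at compile time; the enumeration below is only for illustration:

```python
def access_region(index_fn, iteration_space):
    """Convex (interval) over-approximation of the array indices touched by
    evaluating index_fn over an iteration space; sizing one contiguous
    host-to-accelerator transfer instead of per-element copies."""
    lo = hi = None
    for it in iteration_space:
        idx = index_fn(it)
        lo = idx if lo is None else min(lo, idx)
        hi = idx if hi is None else max(hi, idx)
    if lo is None:
        return None          # empty iteration space: nothing to transfer
    return (lo, hi + 1)      # half-open interval [lo, hi + 1)
```

A single copy of the returned half-open range replaces per-element transfers, at the cost of possibly moving a few untouched elements inside the hull, which mirrors the paper's point that approximate regions are acceptable when an exact solution cannot be found.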
- Published
- 2012
- Full Text
- View/download PDF
46. NewMadeleine: An Efficient Support for High-Performance Networks in MPICH2
- Author
-
Elisabeth Brunet, Francois Trahay, Guillaume Mercier, Darius Buntinas, Laboratoire Bordelais de Recherche en Informatique (LaBRI), Université de Bordeaux (UB)-École Nationale Supérieure d'Électronique, Informatique et Radiocommunications de Bordeaux (ENSEIRB)-Centre National de la Recherche Scientifique (CNRS), Efficient runtime systems for parallel architectures (RUNTIME), Inria Bordeaux - Sud-Ouest, Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-Université de Bordeaux (UB)-Centre National de la Recherche Scientifique (CNRS), Ecole Nationale Supérieure d'Electronique, Informatique et Radiocommunications de Bordeaux (ENSEIRB), École Nationale Supérieure d'Électronique, Informatique et Radiocommunications de Bordeaux (ENSEIRB), Mathematics and Computer Science Division [ANL] (MCS), Argonne National Laboratory [Lemont] (ANL), Grid'5000, Université de Bordeaux (UB)-Centre National de la Recherche Scientifique (CNRS)-École Nationale Supérieure d'Électronique, Informatique et Radiocommunications de Bordeaux (ENSEIRB), Centre National de la Recherche Scientifique (CNRS)-Université de Bordeaux (UB)-Inria Bordeaux - Sud-Ouest, and Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)
- Subjects
020203 distributed computing ,Network module ,Performance Evaluation ,Computer science ,business.industry ,Distributed computing ,Message passing ,010103 numerical & computational mathematics ,02 engineering and technology ,01 natural sciences ,Software ,Shared memory ,Computer architecture ,SMP Systems ,Multithreading ,Shared Memory ,0202 electrical engineering, electronic engineering, information engineering ,0101 mathematics ,Layer (object-oriented design) ,[INFO.INFO-DC]Computer Science [cs]/Distributed, Parallel, and Cluster Computing [cs.DC] ,business ,Implementation ,Data Transfers ,ACM D.1.3 - Abstract
International audience; This paper describes how the NewMadeleine communication library has been integrated within the MPICH2 MPI implementation and the benefits this brings. NewMadeleine is integrated as a Nemesis network module, but the upper layers, and in particular the CH3 layer, have been modified. By doing so, we allow NewMadeleine to fully deliver its performance to an MPI application. NewMadeleine features sophisticated strategies for sending messages and natively supports multirail network configurations, even heterogeneous ones. It also uses a software element called PIOMan that uses multithreading to enhance reactivity and create more efficient progress engines. We show various results that prove that NewMadeleine is indeed well suited as a low-level communication library for building MPI implementations.
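One strategy a multirail-aware library of this kind can apply is splitting a large message across heterogeneous rails in proportion to their bandwidth. The sketch below is a hedged illustration only: the rail names and the proportional-split rule are assumptions, not NewMadeleine's actual policy.

```python
# Hypothetical bandwidth-proportional message splitting across
# heterogeneous network rails (e.g., a fast and a slow interconnect).

def split_message(size, rails):
    """rails: {name: relative bandwidth}. Returns bytes assigned per rail."""
    total_bw = sum(rails.values())
    chunks, assigned = {}, 0
    items = list(rails.items())
    for name, bw in items[:-1]:
        share = size * bw // total_bw
        chunks[name] = share
        assigned += share
    # The last rail absorbs the rounding remainder so chunks sum to size.
    last_name = items[-1][0]
    chunks[last_name] = size - assigned
    return chunks

chunks = split_message(10_000, {"infiniband": 40, "myrinet": 10})
# The rail with 4x the bandwidth carries 4x the bytes.
```

Splitting proportionally keeps all rails busy for roughly the same duration, so the transfer finishes when the slowest chunk does rather than when the slowest rail would have sent the whole message.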
- Published
- 2009
47. Data transport between visualization web services for medical image analysis
- Author
-
Elena Zudilova-Seinstra, Spiros Koulouzis, Adam Belloum, Computational Science Lab (IVI, FNWI), and System and Network Engineering (IVI, FNWI)
- Subjects
business.industry ,computer.internet_protocol ,Computer science ,Image-based medical analysis ,Construct (python library) ,Service-oriented architecture ,computer.software_genre ,Data science ,Visualization ,Data transfers ,World Wide Web ,Information visualization ,Service-oriented visualization ,Workflow ,Software deployment ,General Earth and Planetary Sciences ,Web service ,business ,computer ,Workflow management system ,General Environmental Science ,Web services - Abstract
With the rapid development of IT, data-driven experiments and simulations have become very advanced and complex, and the amount and complexity of scientific data increase exponentially. The challenge today is to efficiently support scientists who generate more data than they can possibly look at and understand. This requires not only high-performance visualization and feature extraction techniques but also efficient and intuitive management of the available data and underlying computational resources. Consequently, Service Oriented Architectures and workflow management systems are becoming popular solutions for the deployment of e-Science infrastructures aimed at assisting in the exploration and analysis of large scientific data. In this paper, we compare two transport models of workflow execution for data-intensive medical visualizations that rely on web services. The image-based analysis of vascular disorders served as a case study for this project. We applied a service-oriented approach to construct distributed visualization pipelines, which allow visualization experts to develop visualizations to view and interact with large medical data sets. Moreover, end-users (i.e., medical specialists) can explore these visualizations irrespective of their geographical location and available computing resources. The paper reports on the current implementation status and presents our main findings.
- Full Text
- View/download PDF
48. THE 'TEMPTATIONS' OF THE EUROPEAN AND NATIONAL CLOUD: AMID POLITICAL SIMPLIFICATION AND LEGAL CRITICISM
- Author
-
BOLOGNINI, Luca and PELINO, Enrico
- Subjects
data transfers ,cloud ,e-evidence ,Schrems II ,security ,national security ,GDPR ,cloud act ,16. Peace & justice ,privacy ,Court of Justice of the European Union - Abstract
The race to create local cloud solutions has become a constant in the digital development programs of member States. Italy is promoting the establishment of the so-called "national cloud", and Germany and France have been working for some time on the Gaia-X project. At European level, the development of cloud computing is taking a strategic role, at least for the immediate future. The declared objective is to free us from solutions that today are almost entirely dependent on infrastructures made available by international providers. Contributing to the debate - more by superimposition than composition - there are broadly geopolitical motivations, aspirations to global technological predominance and concerns associated with personal data protection for third party interference due to the extraterritorial application of foreign legislation. The recent "Schrems II" ruling by the Court of Justice of the European Union, and previously, but with less impact in the broader public, the results of the joint EDPB (European Data Protection Board)-EDPS (European Data Protection Supervisor) study on the United States' Cloud Act, have forced the question of the international acquisition of personal data flows (and non-personal, we may add) and the associated assurances. This paper intends to offer an analysis of the subject, aimed above all at unravelling the multiple levels of the questions raised, which touch not only on legal but on political matters and give an initial, reasoned census of the various bodies of applicable law. 
Above and beyond hard-hitting declarations, we need to determine to what point the independence of a local European cloud is effectively possible or desirable compared to non-EU providers and, in more concrete terms, to what point an autonomous solution is economically and technically practicable in terms of services that are essential for the States, and that enable the exercise of other fundamental rights and freedoms, for individuals, and therefore must not be susceptible to impairment or interruption. We also have to understand to what point it would be opportune in terms of security, even if this, at first glance, may seem counter-intuitive. In this sense, at least in the overall assessment, we have to consider the high levels of service stability and the existence of high-tech measures to contrast cybercrime that the major international providers can ensure, levels it would be inadvisable to forgo if it were possible to keep the benefits, thereby significantly reducing the associated risks. Indeed, the protection of processed data from external interference must also be measured on a different, but nonetheless significant, level to that of any potential "extraterritorial" threat, represented by criminal activity and security breaches. Another factor that must be considered is the availability of solutions, already entirely actual and "in the hands" of users, which offer immediate protection, such as encryption or the segregation of strategic data sets. In other words, if there are possible measures that significantly reduce the extraterritorial risk and that retain the advantages in terms of contrasting cybercrime, these must be duly included in the overall assessment. Our analysis will also shed light on inconsistencies in terms of data protection within the European Union, and misalignments between national efforts toward localization and the principle of the free circulation of data. 
In short, we intend to offer a more articulate, less obvious outline of the subjects we are dealing with, which cannot be reduced (if not at the cost of excessive simplification) to the simple contraposition of E.U. versus non-EU, but which reveal lateral synergies and joint misalignments on fronts apparently united; rather, it would be better to think in terms of the creation of a shared ecosystem that draws the most significant advantage from the solutions available today and acknowledges the need to adopt concrete forms of protection. This does not in any way mean evading the serious, but not immediately solvable extraterritorial questions, but preferably using them, if anything, as a mechanism for obtaining a critical reconsideration of the shortcomings and lack of legal harmonization that emerge even within the European Union. Nor does it mean embarking on a path toward domestic solutions that offer little, or at least less protection than those currently available, but instead maintaining high levels of protection against unlawful activity through more advanced technological solutions, and at the same time identifying legal instruments which assure greater national control of the infrastructures and data, and reduce risk to a legally acceptable level.
50. Article 38
- Subjects
law enforcement directive ,derogations ,data transfers ,law enforcement