  • Virtual School summer program offers computational science and engineering courses
    …processors for scientific computing. Scale your parallel code to tens of thousands of CPU cores. Deal with ginormous datasets. The Virtual School of Computational Science and Engineering offers these courses and more during its summer program for 2010. Since 2008, nearly 250 students and researchers have participated in the annual Summer School offered by the Virtual School. During Summer School, students learn new techniques for applying high-performance computing systems to their work. Due to overwhelming demand for courses in previous Summer Schools, we have added 15 sites, for a total of 21 sites, to the 2010 program in order to accommodate additional students. For each course, students attend on site in one of 10 state-of-the-art distributed high-definition (HD) classrooms located at academic and research institutions across the country. These HD classrooms are equipped with live high-definition videoconferencing technology that provides a high-quality learning experience. Students attend technical sessions presented by leading researchers in computational science and engineering and use cutting-edge high-performance computing systems provided by TeraGrid resource providers. Course participants apply the techniques learned in hands-on lab sessions, assisted by skilled teaching assistants who work one-on-one and in small groups to answer questions and solve problems posed during the sessions. This summer's courses are: Petascale Programming Environments and Tools, July 6-9, 2010; Big Data for Science, July 26-30, 2010; and Proven Algorithmic Techniques for Many-core Processors, August 2-6, 2010. The cost for each course is only $100. To participate, prospective students must first be enrolled in the Virtual School. Enrollment is free and can be completed at https://hub.vscse.org. After enrolling, students select their courses and indicate which of the distributed HD classrooms they would like to attend. Snacks and an evening …

    Original URL path: http://archive.osc.edu/press/releases/2010/compscicourses.shtml (2013-06-13)


  • Ohio Supercomputer Center serves new research groups with launch of ‘Csuri’ Advanced GPU Environment
    A new advanced service offered by the Ohio Supercomputer Center leverages the unique computing properties of the Graphics Processing Unit (GPU) to provide a robust visualization environment to researchers in fields as diverse as biomedicine, electrosciences and the animation arts. OSC recently completed deployment of the Csuri Advanced GPU environment, increasing the Center's capabilities for advanced, large-scale remote visualization and batch rendering applications, as well as GPGPU applications. This powerful computing environment is now available to all Ohio college and university faculty, students and staff, as is an upcoming workshop planned to provide insights on the effective use of the service. The high-performance computing (HPC) community is aggressively exploring general-purpose GPU (GPGPU) computing by using GPUs as many-core processors to solve scientific problems. Modern commercially available central processing units (CPUs) are multi-core processors with 2, 4 or 8 independent processors on a single chip; many-core processors, on the other hand, have hundreds or thousands of processors that are more tightly connected. "For the right kind of problems, GPGPU computing can provide revolutionary performance advantages," said David Hudak, Ph.D., program director of cyberinfrastructure and software development at OSC. "The Csuri platform is designed to support the development of both GPGPU and advanced visualization solutions. We look forward to working with our user communities to develop codes and evaluate GPU-enabled third-party applications." One example of the work that can benefit from the Csuri Advanced GPU environment is the work of its namesake, Charles "Chuck" Csuri, whose sophisticated digital art involves giant renderings of thousands of frames. Csuri is best known as the father of computer graphics, computer animation and digital fine art, creating the first computer art in the 1960s. In addition to being recognized by the Smithsonian Magazine, Csuri is seen as a pioneer of computer animation by the Museum of Modern Art (MoMA) and the Association for Computing Machinery Special Interest Group on Graphics (ACM SIGGRAPH). Csuri became interested in the digital computer as a means of imaging in 1964, when he saw a computer-generated face in a publication from the university's department of electrical engineering. While a senior professor at The Ohio State University (OSU), Csuri founded the Computer Graphics Research Group, the OSC Graphics Project and ACCAD, an academic unit dedicated to the development of digital art and computer animation. During the beta testing of the GPU system, Umit Catalyurek, Ph.D., associate professor in the Department of Biomedical Informatics and Department of Electrical and Computer Engineering at OSU, used the nodes to develop a component-based runtime system for various biomedical image analyses and synthetic aperture radar image formations. "One of my student researchers is using the new GPU system to finish his experiments for a project on automatic tuning of radar signal processing on emergent architectures," said Catalyurek. "We are now developing software systems that will enable applications to easily scale from a single CPU or GPU to a cluster of GPUs and multicore CPUs. With …" (A short data-parallel sketch follows this entry.)

    Original URL path: http://archive.osc.edu/press/releases/2010/csuri_gpgpu.shtml (2013-06-13)
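    The many-core contrast Hudak describes, a few heavyweight CPU cores versus hundreds or thousands of lightweight GPU cores, comes down to data-parallel work: the same small operation applied independently to every element of a large array. The sketch below is purely illustrative and is not OSC or Csuri code; it shows the idea in Python with NumPy on a CPU, and assumes only that NumPy-compatible GPU array libraries such as CuPy are available to run the same whole-array expression on a GPU.

        import numpy as np

        def saxpy_loop(a, x, y):
            """Element-by-element loop: roughly how a single CPU core steps through the work."""
            out = np.empty_like(y)
            for i in range(len(y)):
                out[i] = a * x[i] + y[i]
            return out

        def saxpy_vectorized(a, x, y):
            """Whole-array expression: one operation over a million independent elements.
            On a GPU (for example via a NumPy-compatible library such as CuPy), each
            element can be handled by its own lightweight thread, which is the essence
            of GPGPU computing."""
            return a * x + y

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            x = rng.random(1_000_000)
            y = rng.random(1_000_000)
            assert np.allclose(saxpy_loop(2.0, x, y), saxpy_vectorized(2.0, x, y))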

  • OSC storing vast data from European supercollider
    …speed of light around a 17-mile underground loop. The proton-proton collisions were conducted at seven tera-electron-volts (TeV), a unit of energy in high-energy physics. These are the highest-energy proton collisions ever produced in the laboratory, 3.5 times higher than the previous highest-energy proton collisions, created at the Tevatron particle collider located near Chicago at the Department of Energy's Fermilab. The ALICE collisions expel hundreds to thousands of small particles, including quarks, which make up the protons and neutrons of the atomic nuclei, and gluons, which bind the quarks together. For a fraction of a second, these particles form a fiery hot plasma that hasn't existed since the first moments after the Big Bang, about 14 billion years ago. Within the massive 52-foot ALICE detector, 18 sensitive sub-detectors measure the behavior of the expelled particles, recording up to approximately 1.25 gigabytes of data per second, six times the contents of Encyclopedia Britannica every second. [Image caption: Actual reconstructed 7 TeV proton-proton collision in the ALICE detector from the April running period. Image: CERN] The massive data sets are now being collected and distributed to researchers around the world through high-speed connections to the LHC Computing Grid (LCG), a network of computer clusters at scientific institutions, including the Ohio Supercomputer Center. The network employs the connectivity of private fiber-optic cable links, as well as existing portions of the public Internet. The LCG is composed of more than 100,000 processors at 130 organizations across 34 countries and is organized into four levels, or tiers. Tier 0 is CERN's central computer, which distributes data to the eleven Tier 1 sites around the world. The Tier 1 sites, in turn, coordinate and send data to Tier 2 sites, which are centers that provide storage capacity and computational analysis for specific tasks. Scientists access the stored data through Tier 3 sites, individual computers operated at research facilities. "Traditionally, researchers would do much, if not all, of their computing at one central computing center. This cannot be done with the ALICE experiments because of the large data volumes," said Humanic. OSC has been contributing computing resources to the project from the very beginning of ALICE's distributed computing efforts, starting in 2000. Construction of the LHC began in 1995, when much of the necessary computational and networking technologies didn't yet exist. The long-term plan for the project loosely relied upon a concept referred to as Moore's law, which describes the trend of computer processing power doubling every two years. "If CERN had brought the LHC online much sooner, computing centers would have had a problem meeting the challenges," said Doug Johnson, a senior systems developer at OSC. For quite some time, OSC has been moving to meet the needs of a different mode of research, where computers analyze the huge amounts of otherwise raw data collected from instruments such as satellites, microscopes, sequencing machines and particle colliders. A … (A brief data-volume calculation follows this entry.)

    Original URL path: http://archive.osc.edu/press/releases/2010/supercollider.shtml (2013-06-13)
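    To make the quoted detector data rate concrete, the short calculation below scales the approximately 1.25 gigabytes per second mentioned above to an hour and a day of running. It assumes continuous data taking, which real accelerator operation does not provide, so the figures are an upper bound rather than a statement about ALICE's actual schedule.

        # Scale the quoted ALICE peak rate (~1.25 GB/s) to longer intervals.
        RATE_GB_PER_S = 1.25                          # approximate rate quoted in the release

        per_hour_tb = RATE_GB_PER_S * 3_600 / 1_000   # gigabytes -> terabytes
        per_day_tb = RATE_GB_PER_S * 86_400 / 1_000

        print(f"~{per_hour_tb:.1f} TB per hour of continuous running")  # ~4.5 TB
        print(f"~{per_day_tb:.0f} TB per day of continuous running")    # ~108 TB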

  • Mathematical model provides new biological insights
    …plays a role in developmental biology similar to that of mice and fruit flies. The small size and short growing period of Arabidopsis makes it particularly well suited for genetic studies. At a specific phase of Arabidopsis leaf development, cells on the surface of the leaf receive genetic instructions to become either one of the majority pavement cells or a large hair-like cell known as a trichome. The specific function of trichomes is unclear, although they may be involved in preventing infection, protecting delicate tissues on the underside of the leaf, or reducing the amount of water lost to evaporation. To better understand how cells develop into trichomes, Siegal-Gaskins, colleague Kengo Morohashi and Principal Investigator Erich Grotewold are focusing on relationships between three proteins that figure prominently in determining a cell's fate. Most importantly, the researchers are supplementing traditional benchwork with mathematics to better understand the proteins' functional relationships. The mathematical model Siegal-Gaskins constructed consists of seven differential equations and twelve unknown factors. For his preliminary studies, he turned to OSC to choose random values for the unknowns and solve the equations for millions of different random value sets. "Due to the large range of possible parameters and the complexity of the problem, we took advantage of OSC's parallel processing capabilities and the MATLAB computing environment," Siegal-Gaskins said. "This process was repeated for five million randomly chosen parameter sets, and the set that gave us the closest agreement with experimentation was kept." To meet the challenge of processing the millions of iterations, Siegal-Gaskins accessed OSC's IBM Cluster 1350. The center's flagship supercomputer system, nicknamed the Glenn Cluster, features 9,500 cores, 24 terabytes of memory and a peak computational capability of 75 teraflops, which translates to 75 trillion calculations per second. "Dr. Siegal-Gaskins is leveraging high-performance computing (HPC) to better understand biological systems at the cellular and molecular level," said Yuan Zhang, client and technology support engineer at OSC. "His project is especially well suited for the Glenn Cluster, which is largely dedicated to research in the biosciences, and MATLAB software, which features many tools for numerical computations." The MATLAB programming software package is described as a technical computing environment for high-performance numeric computation and visualization that produces output in mathematical formats. OSC has been a leader in running MATLAB and other scripting languages in HPC environments. "Our bcMPI software, initially released in 2006, interfaces with HPC cluster technologies when executing MATLAB scripts on a cluster," explained David Hudak, director of HPC engineering at OSC. "Over the last year, we have been working to improve the accessibility of parallel MATLAB. We designed Remote MATLAB Services (RMS) to enable our users to transition MATLAB scripts developed on their laptops to HPC resources. Dan was an early adopter of OSC RMS, and we learned a lot from his feedback. It was a very good fit for his needs." With the combination of computational modeling, literature-based analyses and laboratory experimentation … (A toy version of the random parameter sweep follows this entry.)

    Original URL path: http://archive.osc.edu/press/releases/2010/siegal_gaskins.shtml (2013-06-13)
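    The workflow Siegal-Gaskins describes, drawing random values for the unknown parameters, solving the differential equations, and keeping the parameter set that best matches experiment, can be sketched compactly. The toy below is hypothetical: two equations instead of the actual seven, invented "experimental" targets, and Python/SciPy in place of MATLAB. The sample-solve-score-keep-the-best loop is the point; each parameter set is independent of the others, which is why the real five-million-set sweep parallelized so naturally on the Glenn Cluster.

        import numpy as np
        from scipy.integrate import solve_ivp

        def toy_model(t, y, k1, k2):
            """Hypothetical two-species stand-in for the seven-equation protein model."""
            a, b = y
            return [k1 - k2 * a, k2 * a - 0.5 * b]

        # Invented "experimental" steady-state values to fit against (illustration only).
        target = np.array([2.0, 4.0])

        rng = np.random.default_rng(42)
        best_params, best_error = None, np.inf

        for _ in range(2_000):                         # the real sweep used ~5 million sets
            k1, k2 = rng.uniform(0.01, 10.0, size=2)   # random values for the unknowns
            sol = solve_ivp(toy_model, (0.0, 50.0), [0.0, 0.0], args=(k1, k2))
            error = np.linalg.norm(sol.y[:, -1] - target)   # distance from "experiment"
            if error < best_error:
                best_params, best_error = (k1, k2), error

        print(f"best (k1, k2) = {best_params}, error = {best_error:.3g}")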

  • Researcher simulates Alzheimer’s ‘protein misfolding’ errors
    …America alone, according to the National Institutes of Health (NIH). Alzheimer's most often appears after age 60 and leads to progressive and irreversible memory loss, disability and, eventually, death. Disorders like Alzheimer's occur through a complex series of events that take place in the brain over a long period of time, probably beginning a decade or two before the most significant symptoms arise. In the nucleus of nearly every human cell, long strands of DNA are packed tightly together to form chromosomes, which contain all the instructions a cell needs to function. To deliver these instructions to various other cellular structures, the chromosomes dispatch very small protein fibers called oligomers that fold into three-dimensional shapes. Misfolded proteins, called amyloid fibrils, cannot function properly and tend to accumulate into tangles and clumps of waxy plaque, robbing brain cells of their ability to operate and communicate with each other, according to NIH. The strength of this research project lies in the integration of different experimental techniques with a computational approach to effectively illustrate various aspects of amyloid formation and its damaging effects, according to Jie Zheng, Ph.D., an assistant professor of chemical and biomolecular engineering at the University of Akron. "The exact mechanism of amyloid formation and the origin of its toxicity are not fully understood, primarily due to a lack of sufficient atomic-level structural information from traditional experimental approaches such as X-ray diffraction, cryo-electron microscopy and solid-state NMR data," Zheng explained. "Molecular simulations, in contrast, allow one to study the three-dimensional structure and its kinetic pathway of amyloid oligomers at full atomic resolution." Zheng's research group is developing a multiscale modeling and simulation platform that integrates structural prediction, computational biology and bioinformatics to establish a direct correlation between the formation of oligomers and their biological activity in cell membranes. This research is important for understanding the build-up of protein plaque, how it contributes to the breakdown of cells, and how the process might be prevented. "This project has broad impacts on the prevention of disease-related protein misfolding and aggregation," said Zheng. "The ultimate goal of this project is to rationally design a series of effective ligands (inhibitors) to prevent amyloid formation." Zheng's anti-amyloid project is leveraging the computational muscle of the IBM Cluster 1350 system at the Ohio Supercomputer Center. The center's flagship supercomputer system, named the Glenn Cluster, features 9,500 cores, 24 terabytes of memory and a peak computational capability of 75 teraflops, about 75 trillion calculations per second. Zheng's computational approach uses replica-exchange molecular dynamics (REMD) simulations, a computationally intensive process for analyzing protein folding. The REMD method performs a large number of concurrent simulations of amyloid formation while introducing different variables, such as temperature, that can influence the outcome of the process. "It's great to hear that OSC resources are being utilized to investigate the cause of such devastating human diseases, especially one that robs us of the very essence of … (A short sketch of the replica-exchange swap rule follows this entry.)

    Original URL path: http://archive.osc.edu/press/releases/2010/alzheimer.shtml (2013-06-13)
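    The defining step of the replica-exchange method mentioned above is the periodic attempt to swap configurations between simulations running at neighbouring temperatures. The function below implements the standard textbook Metropolis acceptance rule for such a swap; it is a generic illustration rather than Zheng's actual simulation code, and the example energies and temperatures are invented.

        import math
        import random

        K_B = 0.0019872  # Boltzmann constant in kcal/(mol*K), a common MD unit choice

        def accept_swap(energy_i, temp_i, energy_j, temp_j, rng=random):
            """Metropolis criterion for exchanging configurations between two replicas.

            With beta = 1/(k_B * T), the swap is accepted with probability
            min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
            """
            beta_i = 1.0 / (K_B * temp_i)
            beta_j = 1.0 / (K_B * temp_j)
            delta = (beta_i - beta_j) * (energy_i - energy_j)
            return delta >= 0 or rng.random() < math.exp(delta)

        # Invented numbers: a low-temperature replica and a slightly hotter one.
        print(accept_swap(energy_i=-120.0, temp_i=300.0, energy_j=-118.0, temp_j=310.0))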

  • Simulations key to research seeking faster computer chips
    …graphite, the dark gray carbon material that fills most pencils, is highly stable, visible under the right conditions even when only one atom thick, stronger than steel, and conducts electricity quickly and in exceptional ways. While several researchers have been able to deposit small samples of graphene on a base material, OSU associate professor of materials science and engineering Wolfgang Windl, Ph.D., and his colleagues wanted to develop a method for producing very accurate, very position-specific graphene patterns in a way that industry could use to manufacture computer chips from this novel material. To do this, the OSU team wanted to etch their graphite samples to create numerous miniature pillars that then would be lightly pressed on a base material, or substrate. When the graphite was pulled away, they theorized, the silicon dioxide substrate would adhere to the graphite, forcing a very thin top layer to shear off from the pillars. To confirm their theory, Windl turned to the high-performance computing systems of the Ohio Supercomputer Center (OSC) and the Vienna Ab initio Simulation Package, a software package for simulating the properties of systems at the atomic scale. "The calculations are computationally very demanding for the systems under consideration, due to their size and complexity, and they couldn't have been done without our allotment at the Ohio Supercomputer Center," Windl explained. "Based on our initial success with these computer simulations, we currently model adhesion on different substrates, along with the resulting electrical transport through the graphene, to optimize the stamping process and the resulting devices." "Dr. Windl ran his jobs on OSC's flagship supercomputer system, known as the Glenn Cluster," explains Jim Giuliani, client and technology support manager at OSC. The Glenn Cluster offers registered users 24 terabytes of memory and a peak computational capability …

    Original URL path: http://archive.osc.edu/press/releases/2010/Windl.shtml (2013-06-13)

  • Researchers publish article on tracking infectious disease
    …of the article and curator in charge of scientific computing at the American Museum of Natural History (AMNH). The Supramap tool set has broad utility, not only in tracking human disease in time and space, but historical patterns of biodiversity and global biotic changes. Janies, an associate professor of Biomedical Informatics at The Ohio State University (OSU), Wheeler and several colleagues created Supramap to calculate and project evolutionary trees in online geographic information systems, such as Google Earth. The resulting visualizations have been described as "weather maps" for disease that allow public health officials to see when and where pathogens spread, jump from animals to humans and evolve to resist drugs. "Currently, we are investigating H1N1 cases from around the world and Ohio by building evolutionary trees that discover how this strain came to be assembled and jumped from animals to humans. We are also monitoring specific viral genes for mutations that confer resistance to drugs," said Janies, an expert in computational genomics. "Using parallel programming on high-performance computing systems at the Ohio Supercomputer Center (OSC) greatly improves the efficiency and accuracy of our work." Janies and his colleagues used a small cluster computer at OSU to beta test the Supramap application, which has been developed through a grant from the Defense Advanced Research Projects Agency (DARPA). The research team also adapted the Supramap code to function smoothly on OSC's flagship IBM Cluster 1350 "Glenn" system, which features 9,500 cores and 24 terabytes of memory. They now are working with the Center's staff to finish development of a Web interface to provide easy Internet access to the application by scientists and public health officials. "OSC is at the forefront of emerging computational methods and their use by the next generation of health care researchers and providers, researchers like Dr. Janies," said Ashok Krishnamurthy, interim co-executive director of OSC. "The use of high-performance computing to help bring research results to community-accessible points of care will contribute significantly to the health and well-being of people here and abroad." Supramap uses Ohio Supercomputer Center resources and online geographic information systems, such as Google Earth, to calculate and project "weather maps" for disease. Janies' vision for Supramap involves receiving a steady stream of genomic and geographic data on pathogens from sources around the planet, analyzing the raw data nightly, and updating maps available to public health officials each morning. Policymakers, he believes, could then make better-informed decisions on crucial issues, such as determining global hotspots for the emergence of dangerous pathogens and identifying where and when antiviral drugs are useful or not. Janies was formally trained as a biologist, but as a result of the computational demands of the biological questions he was investigating, he began developing hardware and software. Janies led the design and construction of a computing cluster during a previous post with AMNH. He earned his doctorate in zoology at the University of Florida and his bachelor's degree in biology at … (A minimal Google Earth/KML sketch follows this entry.)

    Original URL path: http://archive.osc.edu/press/releases/2010/janies.shtml (2013-06-13)
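    Supramap's projection step, placing analysis results into an online geographic information system such as Google Earth, ultimately means emitting geo-referenced markup such as KML, which Google Earth reads natively. The snippet below writes a minimal KML file containing two invented pathogen-sample placemarks and a line joining them; it sketches only the general idea and is not Supramap's actual output format or data.

        # Write a minimal KML file that Google Earth can open.
        # The sample points are invented for illustration; Supramap's real output
        # encodes full evolutionary trees with far richer annotation.
        samples = [
            ("sample A (hypothetical)", -83.0, 40.0),   # lon, lat near Columbus, OH
            ("sample B (hypothetical)", -87.6, 41.9),   # lon, lat near Chicago, IL
        ]

        placemarks = "\n".join(
            f"    <Placemark><name>{name}</name>"
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
            for name, lon, lat in samples
        )
        path = " ".join(f"{lon},{lat},0" for _, lon, lat in samples)

        kml = f"""<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Document>
        {placemarks}
            <Placemark><name>inferred link (illustrative)</name>
              <LineString><coordinates>{path}</coordinates></LineString>
            </Placemark>
          </Document>
        </kml>
        """

        with open("supramap_sketch.kml", "w", encoding="utf-8") as f:
            f.write(kml)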

  • LHC research program launched with 7 TeV collisions
    …sift through the flood in search of the tiny signals that could indicate discovery. "It's a great day to be a particle physicist," said CERN Director General Rolf Heuer. "A lot of people have waited a long time for this moment, but their patience and dedication is starting to pay dividends." The DOE's Brookhaven National Laboratory and Fermi National Accelerator Laboratory are the host laboratories for the U.S. groups participating in the ATLAS and CMS experiments, respectively. Scientists from American universities and laboratories, who comprise more than 20 percent of the ATLAS collaboration and 35 percent of CMS, have played major roles in the construction of both detectors and join thousands of international colleagues as they operate the detectors and analyze the collision data that will be collected in the coming years. In addition, Lawrence Berkeley National Laboratory is the host laboratory for U.S. groups participating in ALICE, with American scientists contributing 10 percent of the ALICE collaboration. The United States is also home to major national and regional computing centers that, as part of the Worldwide LHC Computing Grid, enable scientists in the United States and around the world to access the enormous amount of data generated by the LHC experiments. Brookhaven National Laboratory and Fermi National Accelerator Laboratory, host to major Tier 1 computing centers, are the first stop in the U.S. for data from the ATLAS and CMS experiments, respectively. The data are further distributed to smaller NSF- and DOE-funded Tier 2 and Tier 3 computing centers across the country, where physicists will conduct the analyses that may lead to LHC discoveries. LHC research programme gets under way (text of the CERN press release, Geneva, 30 March 2010): Beams collided at 7 TeV in the LHC at 13:06 CEST, marking the start of the LHC research programme. Particle physicists around the world are looking forward to a potentially rich harvest of new physics as the LHC begins its first long run at an energy three and a half times higher than previously achieved at a particle accelerator. "It's a great day to be a particle physicist," said CERN Director General Rolf Heuer. "A lot of people have waited a long time for this moment, but their patience and dedication is starting to pay dividends." "With these record-shattering collision energies, the LHC experiments are propelled into a vast region to explore, and the hunt begins for dark matter, new forces, new dimensions and the Higgs boson," said ATLAS collaboration spokesperson Fabiola Gianotti. "The fact that the experiments have published papers already on the basis of last year's data bodes very well for this first physics run." "We've all been impressed with the way the LHC has performed so far," said Guido Tonelli, spokesperson of the CMS experiment, "and it's particularly gratifying to see how well our particle detectors are working, while our physics teams worldwide are already analysing data. We'll address soon some of the major puzzles of modern …

    Original URL path: http://archive.osc.edu/press/releases/2010/LHC.shtml (2013-06-13)