



  • Blue Collar Computing
    Blue Collar Computing Testimonials. Rick Markham: PolymerOhio was established in 2001 by the state Department of Development to assist the polymer industry. The polymer industry is the largest manufacturing cluster in Ohio, with about 2,800 companies and 140,000 employees. Modeling and simulation is very important in the injection molding business. Injection-molded parts are the types of things that we see all over, from fountain pens to parts on our automobiles. I was pleased to learn early on in my association with the Ohio Supercomputer Center about the Blue Collar Computing initiative. It's an important bridge between what we see as more basic, fundamental-type things and some very sophisticated approaches to science and technology, and moving that down…

    Original URL path: http://archive.osc.edu/bluecollarcomputing/transcripts/markham.shtml (2013-06-13)
    Open archived version from archive

  • Blue Collar Computing
    Blue Collar Computing Testimonials. Mike Garvey: M7 Technologies is the result of a family business that was started in 1918 by my grandfather, making me the third generation of a family business that addresses the needs and requirements of the heavy industrial manufacturing sector. So we decided to transition from the skill sets that were created for the heavy industrial complex and to cross-pollinate those with 21st-century technologies that will enable the United States to compete on a global basis in those markets. Blue Collar Computing is the application of high-performance computational analysis on the shop floor, making formerly complex problems that could only be addressed by postgraduate Ph.D.s massively intuitive to the…

    Original URL path: http://archive.osc.edu/bluecollarcomputing/transcripts/garvey.shtml (2013-06-13)
    Open archived version from archive

  • Blue Collar Computing
    Blue Collar Computing Testimonials. Shuchi Khurana: The best part of E-Weld Predictor is that it's online, on demand. Welding is a very difficult process to simulate because of the various complexities going on in the materials. For example, in the oil and gas industry, if one is fabricating a pipeline, say laying a pipeline from Houston to Columbus, you type your input parameters into the E-Weld Predictor, and within a few minutes you have an answer about what you should change in your welding parameters. The partnership with OSC has been just fabulous. We had support not only from the engineering side, the team working together just as part of one company, but also from the executive side. Blue collar computing is fabulous for manufacturers. It's like in business…

    Original URL path: http://archive.osc.edu/bluecollarcomputing/transcripts/khurana.shtml (2013-06-13)
    Open archived version from archive

  • OSC Research Report
    2008 Research Report: Research Landscape. Ohio's strengths in basic and applied research are broad and deep, spanning a multitude of academic, business and industrial organizations. The spectrum of clients served by the Ohio Supercomputer Center likewise encompasses many fields of study. This diversity attracts to Ohio eminent scholars and innovative entrepreneurs, as well as a breadth of regional, national and global research funding. A review of several of these projects yields a team of chemists and naturalists constructing computer simulations of forest fires that predict the dangers of controlled burns to wildlife. Researchers are measuring the water elevations of Amazon River tributaries to better understand the complexity of seasonal flooding. And others are conducting vital studies in fields as diverse as psychology, linguistics, economics, engineering and political science. The Center strives to assist customers with basic needs while simultaneously meeting the requirements of its most…

    Original URL path: http://archive.osc.edu/research/report/landscape.shtml (2013-06-13)
    Open archived version from archive

  • Configuring the MATLAB Parallel Computing Toolbox at OSC
    …This directory will be referred to in the following instructions as OSC_CONFIG_DIR.
    1. Start MATLAB on your desktop.
    2. Within MATLAB, use the cd command to move to the directory in which the configuration files are stored, i.e., cd OSC_CONFIG_DIR.
    3. Unzip the configuration files using the command unzip osctools.zip.
    4. Use the cd command to change to the relative directory osctools/common, e.g., cd osctools/common.
    5. Run the command oscsetup. A prompt like the one below will appear; enter your OSC username and click OK. Text similar to the following will appear: "In order to complete the setup process we need to connect to glenn.osc.edu. You will be prompted for your OSC password in order to connect. Press return to continue." Press Enter. You may be warned about an RSA fingerprint, as shown below; click Yes. A prompt for your password, similar to the one shown below, will then appear. Enter your OSC password and click OK.
    6. Test the connection by executing an ssh command with your username and the hostname glenn.osc.edu, e.g., conn = ssh('humphrey', 'glenn.osc.edu'). You may again be warned about an RSA fingerprint; click Yes. You may also be prompted for your password again; enter your OSC password and click OK.
    7. To test the configuration, run the command job = pctdemo. This runs the function pctdemo, found in the directory OSC_CONFIG_DIR/osctools/demo, which requests that a 1x1 matrix of random values, a 2x2 matrix of random values and a 3x3 matrix of random values be generated in parallel. Once the function is executed, MATLAB should output something similar to the following, followed by additional information.
    8. Check the status of your job by using the command get(job, 'State'). The result should be queued, running or finished. Continue running this command periodically until the result is finished.
    9. Get the results of the job using the command outputs = getAllOutputArguments(job). In this case, the result should be a 3-by-1 cell array. Each result can be obtained using the commands outputs{1}, outputs{2} and outputs{3}.
    10. The MATLAB Parallel Computing Toolbox creates a lot of extra files in order to run a job. Once you have the outputs from your job, you can clean up the job files on your local machine using the command destroy(job).
    11. Once you are finished using the Parallel Computing Toolbox, you can close the ssh connection using the command disconnect(conn).
    The command destroy(job) will also get rid of files on Glenn. However, you may occasionally forget to issue this command after a job is finished; in that case you can simply delete the files from finished jobs on your local machine. To delete the job files on the remote machine, log into the cluster using a file transfer utility such as the FileZilla client (http://filezilla-project.org). To use FileZilla on Glenn, run it and, under the File menu, select Site Manager. In the Site Manager window, as shown below, press the New Site button and label…
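    Taken together, the test workflow amounts to a short MATLAB session. Below is a minimal sketch, assuming the osctools wrappers (oscsetup, ssh, disconnect) and the pctdemo function behave as described above; the username humphrey is just the example from the instructions, and the polling loop is an illustrative stand-in for re-running get(job, 'State') by hand.

        % Minimal sketch of the osctools test session described above.
        % Assumes OSC_CONFIG_DIR holds the downloaded configuration files and
        % that oscsetup, ssh, disconnect and pctdemo come from the osctools package.
        cd OSC_CONFIG_DIR                     % directory containing the config files
        unzip osctools.zip                    % unpack the configuration files
        cd osctools/common
        oscsetup                              % one-time setup; prompts for username and password

        conn = ssh('humphrey', 'glenn.osc.edu');   % open a connection to Glenn

        job = pctdemo;                        % submit the demo job (1x1, 2x2, 3x3 random matrices)
        while ~strcmp(get(job, 'State'), 'finished')
            pause(10);                        % poll until the scheduler reports the job finished
        end

        outputs = getAllOutputArguments(job); % 3-by-1 cell array of results
        disp(outputs{3})                      % e.g., display the 3x3 matrix

        destroy(job);                         % remove job files locally and on Glenn
        disconnect(conn);                     % close the ssh connection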

    Original URL path: http://archive.osc.edu/research/hll/matlab/index.shtml (2013-06-13)
    Open archived version from archive

  • Detection of Moving Targets in Heterogeneous Radar Clutter Scenarios
    Computational Science & Engineering Applications Research: Detection of Moving Targets in Heterogeneous Radar Clutter Scenarios. Principal Investigator: Juan Carlos Chaves, Ph.D., Ohio Supercomputer Center. Funding Source: U.S. Department of Defense High Performance Computing Modernization Program, User Productivity Enhancement and Technology Transfer (HPCMP PET). Surveillance of the ground by air- and space-borne sensors has proven essential to military and intelligence organizations. Specifically, the U.S. Department of Defense's 2006 Quadrennial Defense Review highlights the need for a highly persistent capability to identify and track moving ground targets in denied areas. Of all the sensing technologies available, ground moving target indication (GMTI) radar has important advantages because of features such as day/night, all-weather operation and penetration of foliage, obscurants, smoke and dust. But GMTI radar data from targets also includes echoes from ground clutter, and the radar's motion strongly degrades the performance of conventional moving target detection. Space-time adaptive processing (STAP) is a signal and image processing technique that compensates for the radar platform's motion. Engineers must carefully develop and efficiently implement robust STAP algorithms, as the technique's high-dimensional vectors and matrices render it computationally intensive. To improve efficiency, Ohio Supercomputer Center experts developed technology for DoD researchers that simplifies the development of complicated algorithms such as STAP and significantly reduces simulation times by connecting to and interacting with a supercomputer while still using MATLAB software or…
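    The computational weight of STAP comes from its adaptive filtering step: for each range cell, a high-dimensional space-time covariance matrix is estimated from training data and then effectively inverted to form the detection weights. The MATLAB sketch below shows that core step; the dimensions, variable names and training-data model are illustrative assumptions, not details taken from this project.

        % Illustrative core of space-time adaptive processing (STAP).
        % All dimensions and data here are hypothetical stand-ins.
        N = 8;  M = 16;           % N antenna elements, M pulses per dwell
        NM = N * M;               % dimension of each space-time snapshot
        K = 2 * NM;               % training snapshots for covariance estimation

        X = (randn(NM, K) + 1i*randn(NM, K)) / sqrt(2);  % stand-in clutter returns
        R = (X * X') / K;         % estimated space-time covariance (NM x NM)

        v = ones(NM, 1) / sqrt(NM);   % space-time steering vector for the target

        % Adaptive weights w = R^{-1} v / (v' R^{-1} v); the backslash solve
        % avoids forming an explicit inverse. Repeating this O((NM)^3) solve
        % across many range cells is what makes STAP computationally intensive.
        Rinv_v = R \ v;
        w = Rinv_v / (v' * Rinv_v);

        y = abs(w' * X(:, 1))^2;  % test statistic for one cell under test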

    Original URL path: http://archive.osc.edu/research/cse/projects/targets/index.shtml (2013-06-13)
    Open archived version from archive

  • PVFS in the WAN
    …Devulapalli, Pete Wyckoff. Funding Source: Data Intensive Computing Environment (DICE) Program. Duration: 7/1/07 to 3/20/08. Description: Utilizing geographically separated resources via wide-area networks is a good way to take advantage of multiple computational engines and storage pools. There are numerous examples of file systems that have been adapted for use in the wide area, often relying on fast networks. A fundamental challenge of wide-area data access arises from the unavoidable communication latency. In order to alleviate the effects of limited bandwidth and high latency, we are investigating a framework for wide-area file systems that includes metadata mirroring and data caching, targeted at read-only data access at remote sites. In this common scenario, metadata access at remote sites incurs no network overhead, and frequently accessed files are cached but kept consistent with respect to the source file system. The system exerts minimal load at the data site while keeping remote sites consistent. This project was funded by the DICE program and was divided into three parts: Phase 1, utilize the DICE environment to characterize file systems in the wide-area network; Phase 2, implement a framework for loosely coupled wide-area file system access; Phase 3, gather results and publish data. Below we present our initial findings from Phase 1, using the Parallel Virtual File System (PVFS2) in the wide area. The results were gathered using two remote clusters in the DICE program. The factor most limiting performance is, as we expected, the speed of the link between the two sites, largely attributable to the high latency. The minimal 4-byte latency between the two sites was measured to be 38.6 ms. The results gathered here confirm our initial theory and will help us toward…
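    The measured 38.6 ms latency is enough, on its own, to explain why synchronous small operations dominate wide-area cost. The back-of-the-envelope MATLAB sketch below makes that concrete; it takes 38.6 ms as the network cost per request, and the 1,000-request workload is a hypothetical example, not a measurement from the project.

        % Back-of-the-envelope: effect of WAN latency on synchronous requests.
        % Only the 38.6 ms figure comes from the project; the rest is illustrative.
        latency = 38.6e-3;            % measured minimal 4-byte latency, in seconds

        rate = 1 / latency;           % ~25.9 requests/s if each waits one network delay
        fprintf('Latency-bound rate: %.1f requests/s\n', rate);

        % A metadata-heavy task issuing 1,000 sequential requests would spend
        % about 38.6 s on the wire alone, versus essentially nothing against a
        % local metadata mirror; that gap motivates mirroring metadata remotely.
        n = 1000;                     % hypothetical sequential request count
        fprintf('Network time for %d requests: %.1f s\n', n, n * latency);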

    Original URL path: http://archive.osc.edu/research/network_file/projects/dice/index.shtml (2013-06-13)
    Open archived version from archive


