  • IRIS Internship Program - David Watkins
    …but an actual paper will be different, although I feel like it could be easier in some respects, since I've actually done what I'm writing about. The writing will run through the fall semester, most likely. I'm planning on continuing my project through the school year as my senior project at IUP. It should lend itself well to that, since I work remotely a lot of the time already.

    Week Two: Technical Difficulties (June 17th, 2013)

    Last week was spent largely dealing with computer problems. We set up my own account on Harmony's computer, which involved re-installing all the random Unix tools involved with running RSQSim. Suffice it to say that things didn't always go smoothly. Lots of things had to be figured out which should have been written down, a valuable lesson for the future. Coincidentally, I also heard from my mentor from last summer, asking about some issues I had with the program I was running then. I should have done the same thing there. By the end of the week we did get down to running the actual fault model in RSQSim, and dealing with some issues there as well, of course. Basically, I will be running simulations of a shallow thrust fault representing the plate interface of a subduction zone, with a seismogenic zone, a slow-slip zone, and a creeping zone. RSQSim generates a record of relatively large events (Mw 4 or so) over hundreds to thousands of years. This will be the primary data I will be working with this summer, although of course model results are not real, so I'm not sure you can really call it that. We will of course be comparing it to real data to make sure the results are realistic. The main advantage of working from a modeling perspective for this issue is the longer time scales you can work with in a model: orders of magnitude longer than we have from actual seismic data. I will be varying different things with different runs of the model, so in that sense it will be entirely unique work. RSQSim was originally designed for strike-slip faults, but Harmony has adapted it to model subduction zones. At this point I have a working (I hope) knowledge of how to run it and mess with different parameters, and should be getting into the meat of my project in the next few days.

    Week 1 (June 10th, 2013)

    Well, I've now spent my first full week at my internship, and things are starting to fall into place for the summer. I've found myself in yet another place that will be confusing to explain to people: I go to school in the state-within-a-state town of Indiana, PA, and now I'm at the school that's supposed to be in Florida (Miami) but is in…

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/83 (2015-11-11)

  • IRIS Internship Program - Bradley Wilson
    Reflections on Hole Digging (July 27th, 2013)

    Dig, drive, dig, drive, dig, drive, dig, drive. Repeat x 7. That describes my past week surprisingly well. I've been out in the middle of nowhere, Kentucky, doing a part of field work called site prep. Essentially, before you can install a station, you need to prepare it. The vault, which holds the sensor, needs to be secured into the ground ahead of time so the sensor has a secure location when it arrives on site. Thus, we have spent the last week digging large holes and cementing the vaults into the ground with concrete. Minimum wage skills for maximum wage fun. I'm not kidding: field seismology is SO much fun. You bond with your fellow coworkers in a way that you can't in the lab. You don't actually know someone well until you've seen them fall into a vault hole (I'm looking at you, Josh). Plus, you can't really call yourself a seismologist until you've been stung by a bee while raising a post hole digger majestically into the air, as a single bead of sweat drips down your grizzled face, reflecting the 100 degree heat radiating oh so gently onto your concrete-laced skin. See Figure A below for a visual representation of the above mental image.

    Anyways, jokes aside, the week was a blast. Physically demanding, yet enriching in a number of ways. Firstly, it's given me an enormous appreciation for the organization required to pull off such an experiment. Simply put, it's a logistical nightmare. Dealing with multiple teams in multiple states, organizing an enormous amount of equipment often needed simultaneously in different places, and operating entirely within the permission of private land owners is no small task. Remaining calm when things go wrong, which they inevitably will, is even harder. Secondly, it's prompted a series of personal musings on the innate sociological aspects of seismology. Because I spend 90% of my time in a highly scientific world, whether absorbed in my geophysical studies at the technical institution that is the Colorado School of Mines, or interacting with the scientifically inclined side of my family (including my electrical engineer rocket scientist brother and digital strategy online communications guru father), I often forget that anything involving the word "seismic" soars over most people's heads. Interacting with, and hearing the life stories of, land owners from the farming and mining industries naturally raises questions and concerns about how experiments like OIINK integrate into the surrounding communities. Could we better communicate information from our experiment to those interested in the local communities in which we operate? Software limitations currently prevent that data from being easily translated into a publicly available format. How do we communicate earth structure problems in a way that is enticing and relatable for the average citizen? Why is communicating these issues to the local communities important, or is it not worth the time? I have a particular interest in the intersection of science and society, specifically regarding how we communicate scientific research. Scientific storytelling, if you will. Communication is rapidly changing, and science can't afford to fall behind. Yet when I hear the life story of a land owner who is kindly letting us use his property for our research, I can't help but ponder how experiments can be designed to communicate to an evolving technological society while retaining meaningful impact for a Midwestern farmer. These are the things I think about when digging holes.

    Finally, the amount of driving involved in between different sites has allowed me to pick my mentor's brain on a number of topics, including but not limited to hiking destinations, Farallon Plate research, and the intersection between academia and industry. It's been great to hear a wealth of knowledge on a whole slew of topics. Computers can spit out a wealth of accurate calculations, but the information they provide still pales in comparison to a conversation with a 30-year seismology veteran. I'm excited for many more conversations during this next week of site installations. I would have never thought a week digging holes would have been one of my best weeks all summer; I would not be surprised if my next week surpasses this one. Site installations are on the board for all of next week, which involves less digging, a whole lot more electronics, and even more camping. I'm recollected, well rested, and ready to go. Oh, and lastly, before I forget, I'll just leave this here.

    To Kentucky and Beyond (July 21st, 2013)

    Josh and I are busy packing currently, as we head out into the field bright and early tomorrow morning. We're going to be a part of phase III of the OIINK experiment, where the goal is to relocate some 40ish seismometers eastward into Kentucky. It's going to be two weeks of long and hot days, but it will be an exciting opportunity to experience field seismology first hand. Someone had to install the instruments to collect the data we are using, so it's only appropriate that we help the next phase of the project along. Field seismology is no walk-in-the-park assignment. Preparation for our next couple weeks has been going on for months: scouting locations, creating recon reports for sites, prepping field crews, etc. Josh and I will be on a team that prepares the stations for install, while another team is in Missouri collecting the seismometers that are to be moved. At some point during the next couple of weeks, these two teams will meet and actually bring the new stations online.

    I am leaving my tomography work somewhat in a state of limbo during these weeks, although I have produced a new model that I will be using for my AGU abstract, the deadline for which is rapidly approaching. Ultimately my model will be a starting point for additional refinement, but hopefully my mentor and I can glean some interesting information out of the model I've produced. We will have a computer in the field to start to analyze my results, as it will be the first time my mentor and I have had face-to-face contact since June. I haven't yet selected the appropriate damping for my model, but I'm hoping that's something I can get done before AGU as well. Here's a video of the current model, with a damping of 20. It looks quite a bit better than the previous iteration. Also, this is my last blog post for the next couple of weeks while I'm out in the field, but don't worry, I will return with lots to share in August. Bradley

    Damping Tests (July 17th, 2013)

    Sadly, my mentor and I have not yet cracked the "out of bounds entry point" error that has been plaguing the tomography process since day one. I have been running some debugging tests this week, however, and found out where exactly the error is generated from. Unfortunately, because the level of coding is way beyond my current knowledge, I don't know how to fix the problem. But identifying it will hopefully help my mentor streamline the process of getting rid of the errors once and for all. In the meantime, I've been working with my model constructed from the fragmented data. Tomography models solved via a least squares method include a parameter known as damping. Essentially, damping varies how smooth a model looks. Depending on what you set your damping parameter to be, you are minimizing different quantities. If your damping is high, your result minimizes error but doesn't necessarily reflect the complexities of the undetermined parameters in the solution. If your damping is low, your results will attempt to best represent the undetermined parameters in the problem, but won't ensure small degrees of error. To display this visually, I've included three images below with different degrees of damping. The images are of the same slice through an identical model; the only difference between them is the damping parameter: (1) Damping = 200, (2) Damping = 20, (3) Damping = 2. As you can see, the three images show varying degrees of similarity; features that are evident in one image are completely missing in another. After seeing these images, you may be wondering how you know which one is most accurate. Well, the answer to that question is actually fairly simple: you can create what is known as a damping trade-off curve to analyze how the damping levels compare to each other. This graph plots the variance in your data (the misfit) against the variance in the model (its roughness). In general, you want to pick the smoothest model that still fits your data. Ideally you test all the models on multiple iterations as well. I will hopefully be producing a damping trade-off curve for my model before the end of the summer.
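    For illustration, here is a minimal sketch of that trade-off in Python with NumPy and Matplotlib. The matrix G, data d, and damping values below are invented toy values, not the actual tomography code's setup: for each damping value lam, the damped least squares solution minimizes ||Gm - d||^2 + lam^2 ||m||^2, and plotting data misfit against model norm traces out the trade-off ("L") curve.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Toy linear tomography problem: d = G m + noise (all values invented).
    rng = np.random.default_rng(0)
    G = rng.normal(size=(60, 40))          # stand-in for a ray-path sensitivity matrix
    m_true = rng.normal(size=40)
    d = G @ m_true + 0.5 * rng.normal(size=60)

    misfits, norms, dampings = [], [], [200.0, 20.0, 2.0, 0.2]
    for lam in dampings:
        # Damped least squares: minimize ||G m - d||^2 + lam^2 ||m||^2,
        # solved by stacking lam*I under G and zeros under d.
        A = np.vstack([G, lam * np.eye(G.shape[1])])
        b = np.concatenate([d, np.zeros(G.shape[1])])
        m, *_ = np.linalg.lstsq(A, b, rcond=None)
        misfits.append(np.linalg.norm(G @ m - d))   # "variance in the data"
        norms.append(np.linalg.norm(m))             # "variance in the model"

    # Trade-off curve: pick the smoothest model that still fits the data.
    plt.plot(norms, misfits, "o-")
    for lam, x, y in zip(dampings, norms, misfits):
        plt.annotate(f"damping={lam}", (x, y))
    plt.xlabel("model norm ||m||")
    plt.ylabel("data misfit ||Gm - d||")
    plt.show()
    ```

    High damping lands at the low-norm, high-misfit end of the curve; low damping at the opposite end, matching the three images described above.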
    Error: Skipping this Station (July 11th, 2013)

    This past week has been fairly frustrating, in the best possible way. While I did post a first-iteration tomography model video earlier in the week, I wasn't quite satisfied with the results. When running the inversion code, at least 60% of the events generate the following error: "Error: this station has an out of bounds start point, skipping this station." Now, this error isn't preventing the code from completing, as the stations are just skipped. However, with so many stations being skipped, the results are less than optimal. The reason this is particularly frustrating is that there doesn't seem to be a predictable pattern for how the error messages are generated. For some events, every station is skipped and the residual value doesn't even calculate; for others, only a few stations skip and the rest read in just fine. There is no evident pattern to why stations are skipping, or why certain events work and others don't. My mentor is also out of the office this week, making it hard for him to help me debug the errors. Hersh, Josh's mentor here at Purdue, has helped me dig into the code a little bit, but to no avail. I've gone back into the database of events and searched for patterns related to event distance and event location. Still random. I've spent hours looking in the Fortran code itself to try and find out what is setting the error off, but to no avail. I was able to find the condition that causes it to fail, but can't trace it back to anything related to the input data. It's been a discouraging process, as progressing further on more iterations is sort of futile if this error remains prevalent.

    I've spent all of today microscopically dissecting the example code, specification files, and data inputs to see where mine differ. It's been quite a laborious process, as there are so many parameters that aren't being used in my model and can simply be ignored. However, deciding which ones to ignore isn't always the easiest decision. After engrossing myself in the depths of the specification files, I think I may have found the root of the problem. I'm hesitant to say it will fix anything at this point, but it may be a start. The code starts the process on a basic 1D earth model and then uses future models as the bases for additional iterations. When looking at my spec file, I saw that our 1D model was set up for my mentor's previous tomography run in Alaska, thus not meshing well with the new grid for the OIINK model. To fix this, I downloaded 1D travel time tables from IRIS and built a new 1D model file that matched the geometry of the OIINK experiment. I'm now in the process of reconstructing the grid files and travel time files to see if I can fix the inversion errors. I don't ultimately know if this will fix anything, but it seemed like a potentially significant problem to me, as I've learned that the geometry of tomography is extremely important. Fixing this issue is especially important for me, as Josh and I head out into the field after next week. We may potentially have one week after fieldwork to finish up the last bits of our project, but it isn't realistic to bank on that time for significant work, especially considering AGU abstracts are due at the end of that week. It's become quite clear to me how completing something like a fully iterated tomography model isn't done in 10 weeks, unless you have a perfectly organized data set with first arrivals already picked sitting right in front of you. But let's face it, what is the fun in that?

    Tomography Model, Version 1.0 (July 8th, 2013)

    I have officially completed my first tomography model. Click HERE for a link to the video. Now for the extremely large disclaimer: this model is EXTREMELY rough. In fact, it doesn't actually represent the updated picks I made a couple of weeks ago to add the EarthScope Transportable Array stations to the list of picks. These will help immensely in filling out the model, as the array itself is centered in the middle of the model and doesn't come close to the edges. Additionally, first iterations are usually just immediately used as the new input for a second iteration. However, I've decided to share the first iteration with you, as it's been a long process to get to this point and I'm proud of the model despite its current flaws. I am beginning to feel comfortable with the workflow of the tomography code. I can't say I know how to fix all the errors it throws, but I know how to run the process from start to finish by myself. I have also spent many hours looking into the specifics of the Fortran code, in order to better understand exactly what is going on behind the scenes. My next steps are to merge the new data with updated picks and rerun the model. I'm excited to see how the model improves with each iteration. Bradley

    Purdue Progress (July 1st, 2013)

    I'm writing this post anxiously awaiting my five-hour code to finish running. As I've mentioned previously, the tomography code is not yet working for my project. That being said, I've made what is hopefully a huge step forward this morning. I guess moving to Purdue provided a lucky spark. The essence of the problem was that our geometry was not working out correctly. Because we are dealing with P-wave tomography, one of the complications is that we need all of our waves to enter through the bottom of our model, not the sides. It's not a straightforward calculation to compute this, and you can't just overestimate, because the larger your grid becomes, the slower your program runs. Without this geometry behaving correctly, the travel times for the stations are computed incorrectly. To visually display this information, Dr. Pavlis wrote some code that turns the P-wave arrival time files into a format that is viewable by Paraview, a 3D visualization software. When the calculations are correct, the arrivals are supposed to be sphere/ellipsoid in shape. When they are not correct, well, I'll just let you take a look. Needless to say, not spherical in shape. However, after making a few adjustments to the buffer zone on the depth grid geometry, I have images looking like this: spheres! Not stair-step messy jumbles. Now, this is just the travel times for one specific station. I decided to test just one station this time, before running the four-hour code that computes every station, which I made the mistake of doing last time. Now that I have this image, however, the code for all the stations is currently running. After this is finished, I'll try the inversion code one more time. If it works this time, I may actually do a little victory dance. This is potentially a huge, huge step towards completing tomography model version 1.0. Needless to say, I'm a little excited. Because apparently clean geometric shapes do that to you when you are a scientist. Hopefully I'll be reporting back later this week with even more successes. C'mon, Purdue, I know you have more luck in you. Bradley

    Next Stop: Purdue (June 28th, 2013)

    Today marks my last official day at Indiana University. Josh and I will be heading this weekend to Purdue University, where we will spend the rest of our internships before heading out into the field in late July. This isn't quite the half-way point of my summer, but it marks a good time for me to reflect on some of the things I've learned to date and outline my goals for the next segment of my internship. While Bloomington has been an awesome town to explore, it will be fun to head out yet again to another new location.

    Things I've completed:
    1. I can safely say I know my way around dbxcor and dbpick, the two tools I used for processing the raw data. I've now used these tools for a month and have improved drastically in my knowledge of the programs, how to filter and pick arrival times accurately, and why the tedious work is necessary. It feels good to be whizzing around the interface of a scientific program I'd never used before a month ago. Tasks that would have taken me 10 minutes at the start now take less than a minute.
    2. My shell scripting and GMT coding abilities have vastly improved. They aren't perfect and I still have a lot to learn, but the task of making a map or editing a shell/GMT script is no longer daunting. This is good, seeing as both these tools are used so often in the work I'm doing, and will apply to work I'll be doing in the future, even for my classes at school. It is also nice to feel comfortable working in a terminal setting, as I'll be connecting to all my data remotely from Purdue for the rest of the summer.
    3. I've learned how a tomography model is computed, and why it isn't always as easy as it looks. There is a whole lot of code behind the pretty maps, and I now have much more respect for the labor of love that goes into them.
    4. I understand the context and larger scale of the problem I'm working on. After absorbing so much new information the first few weeks, I've finally grasped a sense of the larger problem that OIINK seeks to address. Between helping perform field work, working on my analysis, and seeing Josh's work, the problem has truly come alive.

    Things needing completing:
    1. The tomography model still doesn't work. We've spent countless hours adjusting geometry and parameters, to no avail. I'm crossing my fingers that Dr. Pavlis and I can figure out the root of the problem next week. We took a look at the velocity models in a 3D visualization software today and have made some progress, but there are still some bizarre effects happening. Since my project relies on this model working, I'm anxious to have it start behaving correctly. In the meantime, I've been learning where we went wrong, which has really helped me grasp how tomographic models are constructed.
    2. The 3D basin model I've talked about in some of my previous blogs still isn't constructed yet. Once this data is formatted into something useful for Josh and me, we will have to begin applying corrections to our data and make sense of the geometry of the Illinois basin. This may be something that we just begin to look at during our internship.
    3. I need to start thinking about and working on the details of my AGU presentation: poster and abstract. Since we are heading out into the field during the end of July and early August, we have to get a jump on preparing some of this information. I am so excited to have the opportunity to share my work, and want to take the time to represent myself and my work well.

    That about sums up my time at IU and my goals for Purdue. You can wish me well as Josh and I move locations this weekend, and hope that everything goes smoothly. I look forward to sending my next update from West Lafayette. Bradley

    The Joy of Research Code (June 25th, 2013)

    I've finally got my hands on the tomography code I will be using to make my P-wave tomography model. It's written almost entirely in Fortran and is what my mentor describes as "research code," essentially meaning that it is far, far, far from bug-free and tends to need rough fixes and minor adjustments ALL the time. It's been an interesting experience these past couple of days working through the procedures, as I feel entirely ill-equipped to deal with Fortran/Unix error messages. I have found myself saying, "No, terminal, sorry, I don't know what caused a segmentation fault. Better go get some help with that." And since this code is essentially just borrowed from a fellow seismologist, there isn't a wealth of information regarding error messages. I am slowly learning why things are breaking and where I should be looking to fix them, but it's largely a huge puzzle with very few pieces put together for me currently. Here are a couple things I have learned/completed so far:

    1. Tomography geometry is complex. Tomographic models rely on the geometry you determine to run correctly. There are a thousand ways to mess this up, and a thousand places you have to go in and update the model. The geometry of this model relies on both a coarse and a fine grid. I calculated the size of these grids based on the locations of our stations and a guess at the size of the boundaries needed to contain all the waves. Because seismic waves aren't coming in on straight lines, your model has to be a bit bigger than the area of your stations. This is really just a guess, and can be edited later if something is incorrect. The fine grid spacing that fits within the coarse spacing can be as complex as you want to make it, although mine is pretty simple. It's also important to note that the more detailed your geometry becomes, the larger the output files are. And for a program that already takes over five hours to run, size matters if you want to run the code multiple times. My final models won't be perfect, but they will hopefully improve as I run the code additional times. (A small sketch of this station-based bounds calculation follows this post.)

    2. Workflow is so, so important. This seems self-explanatory, although it is so easy to become too relaxed about. The tomography code I'm running consists of roughly 6-8 steps, depending on what you count as an actual step. Each of these steps comprises multiple subsets, and each of these subsets probably consists of manipulating data or files. Needless to say, it's quite easy to get lost in code. That is, if you can manage to not get confused in the Finder window. Knowing what steps have to be completed in what order, and where the necessary files are for each step, is something that I am currently trying to keep track of. As I worked through the process the first time with my mentor, we chugged along happily. Now that we have hit a significant snag in the process, I am finally taking a breather and retracing my steps to ensure that I can repeat them when I head to Purdue next week and won't have Dr. Pavlis down the hall from me. It's also tricky because there is a balance to be struck between fixing inefficient workflow and leaving things be because they already work.

    3. Research code is nasty because it works. After three days of consistent debugging, editing, and fixing problems, I finally asked my mentor why someone didn't take the time to update the code. Not that Fortran is completely outdated, but its rigid tendencies can be frustrating. I learned that there are three primary reasons for keeping the code as is. Firstly, it's flexible. There are so many inputs and different ways to approach the problem that a rough-around-the-edges code is in some ways better. It isn't pretty, but you can fanagle with it until it works for what you want to do. (I don't really know if "fanagle" is a word, but it adequately describes the process.) Secondly, the program is massive. The code I am using has been worked on longer than I've existed, or close to it. It is tens of thousands of lines of code, at least. To rewrite this in a different language would be a massive undertaking, and beyond the scope of something you could stick an undergraduate intern on. Simply put: if it works, don't fix it. Finally, it's fast. The code has been designed to some level of efficiency. It isn't always feasible to update code if the efficiency goes down, and in this case it would plummet.

    4. You will mess up, probably critically mess up. Obviously, as an undergraduate intern working with code way, way beyond my comprehension level, it was inevitable that problems would arise. However, even simple mistakes are made all the time that force you to backtrack and redo your work. It turns out all the data I'd analyzed for the first two weeks was missing all of the EarthScope TA stations, because of a simple database error my mentor made. Neither of us caught it until I had finished analyzing the 2013 data, and as a result I have to go back and reanalyze roughly 90 events. Such is life working with such large data sets. For me, it's a great opportunity to check my work before it is inputted into the tomography model. Thankfully, a mode exists in dbxcor that allows me to quickly sort through and find the relevant events to fix. This will cut the time spent analyzing by over half. It also works out because I am able to reanalyze the events while Dr. Pavlis attempts to debug the tomography code.

    Those are just four lessons that I've learned over the past couple of days, along with many more that I could spend hours typing out. My work moved from something I understood fairly well to something I know nothing about in a matter of hours. It's been a blast working outside my standard comprehension level. Maybe I'll come out as a Fortran debugging wizard, who knows. Probably not, but I'll at least be an apprentice. Until the next segmentation fault, Bradley
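    As a rough illustration of that first lesson, here is a minimal sketch of sizing a model box from station locations plus a buffer for non-vertical ray paths. The station coordinates, padding, and grid spacings below are hypothetical placeholders, not the real OIINK geometry:

    ```python
    import numpy as np

    # Hypothetical station coordinates (lon, lat in degrees); not the real OIINK array.
    stations = np.array([
        (-88.2, 37.1), (-87.6, 37.9), (-86.9, 37.4), (-88.8, 38.0), (-87.1, 38.6),
    ])

    # Pad the station footprint so steeply incident (but not vertical) rays
    # can still enter through the bottom of the box rather than its sides.
    pad_deg = 2.0                                   # buffer: a guess, revisable later
    lon_min, lat_min = stations.min(axis=0) - pad_deg
    lon_max, lat_max = stations.max(axis=0) + pad_deg

    # Coarse grid over the padded box, with a finer grid nested under the array itself.
    coarse_lons = np.arange(lon_min, lon_max + 0.5, 0.5)
    coarse_lats = np.arange(lat_min, lat_max + 0.5, 0.5)
    fine_lons = np.arange(stations[:, 0].min(), stations[:, 0].max() + 0.1, 0.1)
    fine_lats = np.arange(stations[:, 1].min(), stations[:, 1].max() + 0.1, 0.1)

    print(f"coarse grid: {len(coarse_lons)} x {len(coarse_lats)} nodes")
    print(f"fine grid:   {len(fine_lons)} x {len(fine_lats)} nodes")
    ```

    The tension the post describes is visible even in this toy: doubling the padding or halving the spacing multiplies the node count, and with it the size of every output file.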
    Residual Successes (June 18th, 2013)

    Today I've accomplished a fairly large achievement of the past couple weeks of work: I've successfully summed the residual maps for all OIINK stations for the 2013 data, which I analyzed entirely by myself. To add a cherry on top, all the processes and code to sum the residuals across all the stations were done this week, while my mentor was out of town. It feels good to have struggled through something and come to the answers by yourself. To explain a little bit more about what I've done, I'll take you back to my first blog post, when I showed you the color map. The map I talked about in that post represented one specific earthquake event. The map I've just finished creating, and will show below, comprises roughly 100 events. Looking back at what I've done, the task doesn't necessarily seem particularly difficult, yet I learned much more in taking the slow route by myself. Here's the gist of what I had to do (a sketch of the summing and averaging steps follows this post):

    1. Separate the individual events into azimuth ranges. I don't really know why I had to do this yet, because my mentor hasn't been here, but the 2012 data was separated this way, so I just followed the same process.
    2. Isolate the files containing the lat, long, and residual data that GMT uses for mapping.
    3. Combine these files into a single file.
    4. Search through these files for matching stations.
    5. Sum the residuals for each station. Not trivial with 80 stations.
    6. Calculate the averages for each station. Also not trivial, considering each station had a different number of events as a result of my initial editing of the waveforms.
    7. Record these averages, along with the lat and long values, in a new file.
    8. Create a GMT code that would map the averages.
    9. Run all the procedures for each azimuth range.
    10. Repeat the entire process to combine every data point.

    Now, some of this was trivial, and some of it was quite difficult for me. I had to look up and learn multiple new Unix commands, as the easiest way to do this was in a shell script. I had to significantly adjust the GMT code, as our existing code pulled data directly from our database, and I needed it to pull the data from the new files I was creating. There were also some complications stemming from how I had initially organized the files. I definitely learned that the way you think you want the files organized is not always the way you want them organized for the final product. My finished code isn't long or particularly complex, but to see it work just as I intended is satisfying. It's also nice to see the visual results of all the data I've been processing since the start of my internship. And alas, here is the picture. The figure clearly shows that P waves arrive quicker than normal in the southwest region of the figure (Missouri). Adjusting the scale to be a bit more color sensitive might show even more trends, although you have to be careful with how you set color scales; they can become quite deceptive if you aren't careful. When my mentor comes back later this week, he'll hopefully have some adjustments to make to improve the figure. Who knows, I may have done everything wrong. Regardless, it's fun to see the averages across all the events and all the stations. Some stations aren't visible on single events because of noise or poor signals, and the averages allow you to see the larger picture with every station. That's all for now. I'll update when my mentor returns, to see how my figures have changed. Hopefully I've done at least part of the work correctly. Bradley
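    Steps 4-7 above boil down to a per-station running sum and count. Here is a minimal sketch in Python; the file layout (whitespace-separated "station lat lon residual" lines) and file names are assumptions for illustration, since the original was done with Unix commands in a shell script feeding GMT:

    ```python
    from collections import defaultdict

    # Accumulate (sum, count, coordinates) per station across all event files.
    # Assumed line format: station lat lon residual  (whitespace-separated).
    sums = defaultdict(float)
    counts = defaultdict(int)
    coords = {}

    event_files = ["event_001.txt", "event_002.txt"]   # placeholder file names
    for path in event_files:
        with open(path) as f:
            for line in f:
                sta, lat, lon, res = line.split()
                sums[sta] += float(res)
                counts[sta] += 1      # stations appear in differing numbers of events
                coords[sta] = (lat, lon)

    # Write "lon lat mean_residual station" for a GMT script to map.
    with open("station_averages.txt", "w") as out:
        for sta in sorted(sums):
            mean = sums[sta] / counts[sta]
            lat, lon = coords[sta]
            out.write(f"{lon} {lat} {mean:.3f} {sta}\n")
    ```

    Dividing by a per-station count, rather than the total event count, is exactly the subtlety flagged in step 6: each station recorded a different number of usable events.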
    Collecting Data and Deer Ticks (June 14th, 2013)

    These past three days I've been out in the field retrieving data from the OIINK standalone stations. We recovered data from 21 stations, which coincidentally was about the number of ticks I had to pick off my body throughout the three days. Another day in the life of a Midwest field seismologist. Despite our country's wealth of cellular networks, the current location of our array lies in an area with mediocre cell service at best. Thus, the stations that cannot be relied upon to provide real-time telemetry data are left to themselves to record their data all alone. How depressing it must be for the standalone DASes, only getting attention once every couple of months. For the other interns who are reading this: our stations are almost identical to the ones we set up during orientation. Same setup, same equipment, and similar procedures.

    The general process for recovering data from a station is relatively simple, if nothing is awry. You head to the station, record the status of the instruments, check to make sure the masses are aligned correctly, dump all the data to the disk, remove the disk (or disks) and replace them with empty ones, and ensure the new disks are recording properly. This only takes about 20-30 minutes if everything looks okay. However, that isn't always the case, as I'm sure you can imagine. The most common problem is for one of the mass channels to be off center. For those readers who haven't worked with such instruments: the masses that move when the ground shakes are controlled by electronics that send pulses of electricity to keep each mass in the center, and these pulses are translated into the amplitude of ground motion. There are three separate channels that measure different components of motion: vertical, east-west, and north-south. There is a standard voltage range which represents the mass being centered and everything working correctly. If the mass voltages are outside this range, a centering command needs to be sent to the sensor to realign the masses into the correct positions. This is done automatically every five days, and can be sent remotely via telemetry for many stations, but it has to be done manually for the standalone stations. Sometimes the auto-centering command sent every five days works, and other times it isn't enough. There is a whole slew of reasons why the masses can get off center, on both the hardware and software sides. A few of our stations ended up having dead channels that were unfixable within the time frame of this servicing run. The sensors were not recognizing where the center was supposed to be, and would instead just slide themselves along the whole range of voltages, only to return to their original position. As I had been told many times, there are so many things that can go wrong in field seismology, and I was definitely able to see quite a few of them first hand. Although it takes longer when you have to open up the actual sensor vault to check the level, and to check whether the mass balance is a hardware issue, I enjoyed the experience of having to troubleshoot a little bit. Below is a nifty little picture of one of the troublemaker stations that we tried to open up and fix.

    Considering we were recovering raw data from our stations, I figured this is probably a good time to detail the data set I am working with a little bit. I've given bits and pieces over the past couple of weeks, but I don't know if I've outlined the data set in specifics. The OIINK Flexible Array experiment has already been going for a couple of years now. I am jumping into the project somewhat in the middle, although the conclusions on the data are far from being reached. Data has been collected continuously as the array is moved to different locations in the Midwest. The specific…

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/84 (2015-11-11)

  • IRIS Internship Program - Ryan Armstrong
    …the man who predicts medals. Cheers, Ryan

    First GMT figure and looking forward (July 19th, 2012)

    Thankfully, I think I am finally getting closer to having sorted through all of the necessary data to start getting interpretations. "Close" is definitely relative, but I feel closer to the end of it than the start now. After cross-correlating about one and a half more fault segments, I will just need to finish picking my arrivals to calculate locations and the initial results. On Tuesday I produced my first GMT figure, which is simply a fault map overlain by the station map used for my project. Fortunately, I don't think many of the figures I will need to produce will be too much more complicated than this one, but that remains to be seen. The weather is finally cooling off here. It rained almost all of last night, and it was about 71 degrees for my bike ride to work. After checking the temperatures for the last month, I can confidently say that it has been over 3 weeks, maybe even more, since it has been that cool in the morning. Hopefully this will last a while. Back to picking arrivals. Cheers, Ryan

    Picking Earthquake Arrivals (July 12th, 2012)

    This week has been good so far. I've picked earthquake arrivals to find locations for two of my now eight families of repeating earthquakes; by the end of the day I hope to be done with the third and fourth. The weather is finally cooling off a bit here. It's gotten down to the mid 80s a couple times, but is supposed to heat up to around the hundreds again over the weekend. I realized that a few days ago I didn't specifically address the goals that I set forth at the beginning of the internship, which involved understanding the codes I am using, beginning to understand the limitations of the theory and what it means for Christchurch, and being able to make good figures for the project. So here it goes. I am understanding how to use the codes, and how they work, very well for the most part. I will probably have a couple more to understand before it is all said and done, so I will have to revisit this later. I haven't yet had to make any figures with scripting, only GISMO-created plots. I should begin doing this in the next week or two with GMT, however, so I will update you all on that. I think, of these goals, I have made the most progress on understanding the theory better. There have been a few instances where I have seen weaknesses in the theory's slip estimates, just due to the methods of finding repeaters. But for me to gain a full understanding of how this affects the end result will require more knowledge about how the final data set turns out, after I get the locations and focal mechanisms. At this halfway point, I think I've made pretty solid progress on these three goals, and I am hopeful that the remaining weeks of my project will wrap them up nicely. Cheers, Ryan

    Recognizing my successes (July 9th, 2012)

    Sorry I've been M.I.A. for a few days; the past week has been a rewarding, albeit stressful, one. I spent a good portion of the week continuing my cross-correlations on a new fault segment. To date, I have discovered about 7 different repeating earthquake families on 4 different segments of the Greendale fault system. During a meeting with my advisor, we decided that since many of these earthquakes don't show up in the GeoNet catalog as events, I will need to find an earthquake location for most of them over the later portion of the internship. This will allow me to further determine which earthquakes are actually repeaters. Unfortunately, I haven't had much time to work with GMT yet to make figures, but once I have the new earthquake locations, I'm sure this will be one of the next logical steps.

    On Friday afternoon we had an interesting seminar with Tiffany Lohwater about the troubles of communicating with non-science readers or listeners. The two areas that struck home most strongly with me, after a couple of poster presentations in Colorado, were not necessarily knowing the knowledge of your audience, and dealing with an audience of mixed experiences. These are issues that I had never consciously thought about, so this will hopefully help me out a lot in future presentations. On the personal side, my parents came into town for the 4th of July. They were staying at a hotel only about a block from my house, so travel to and from Capitol Square with them was very convenient. We got lots of good food, went to a great botanical garden, and even got to watch a great fireworks show right alongside the lake. I'll be back to share more later this week. Cheers, Ryan
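    Repeater identification of the kind described above generally works by cross-correlating waveform pairs and grouping events whose peak correlation exceeds a threshold. Here is a minimal sketch with NumPy; the synthetic traces and the 0.9 threshold are placeholders, not the actual GISMO-based workflow or its tuned parameters:

    ```python
    import numpy as np

    def normalized_xcorr_max(a: np.ndarray, b: np.ndarray) -> float:
        """Peak of the normalized cross-correlation between two traces."""
        a = (a - a.mean()) / (a.std() * len(a))
        b = (b - b.mean()) / b.std()
        return float(np.max(np.correlate(a, b, mode="full")))

    # Synthetic example: a "repeater" is a noisy, shifted copy of a template event.
    rng = np.random.default_rng(1)
    template = np.sin(np.linspace(0, 20, 500)) * np.exp(-np.linspace(0, 5, 500))
    candidate = np.roll(template, 30) + 0.05 * rng.normal(size=500)
    unrelated = rng.normal(size=500)

    THRESHOLD = 0.9   # placeholder; real studies tune this per data set
    for name, trace in [("candidate", candidate), ("unrelated", unrelated)]:
        cc = normalized_xcorr_max(template, trace)
        tag = "repeater" if cc >= THRESHOLD else "not a repeater"
        print(f"{name}: cc={cc:.2f} -> {tag}")
    ```

    Because nearly identical waveforms imply nearly identical source locations and mechanisms, a high correlation coefficient is the first filter; relocations like the ones described above then confirm which candidates are true repeaters.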
    Business as usual (July 7th, 2012)

    This week has mostly been a continuation of cross-correlating earthquakes on the second fault segment to try to find repeaters. I'm hoping that by this afternoon I will be able to move on to the next fault segment. Making progress! On Monday I attended an awesome thesis defense, which analyzed the evolution of a Japanese subduction zone using an amazing set of seismic reflection data. After that I had a meeting with Cliff and some of his grad students, so that we could all bounce ideas off each other. I was able to show a couple of the graphs that I made the week before in GISMO. This week has been rough watching Colorado Springs be threatened so thoroughly…

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/47 (2015-11-11)

  • IRIS Internship Program - Greg Brenn
    …a much higher gain setting than the two horizontal component amplitudes. Rotating this unnormalized data proved to be difficult when having to use different gain settings. Three of us went out into the field yesterday to fix these gain settings for the future data sets, but for the data sets that I hoped to use, the gain settings obviously could not be adjusted in the field. Hopefully we can adjust these settings using code, but the question also arises of whether these gain settings actually affect the receiver function application, because in the deconvolution process the impulse response of the seismometer is cancelled out of the equation when isolating only the earth response. This week we'll troubleshoot these receiver functions, almost as a last-ditch effort, to see if we can produce anything. What's even more frustrating is that even when we shipped our data off to another receiver function resource to see what they produce, similar bad receiver functions result, as shown in the image below. Hopefully we can produce something significant, but filling that AGU poster with no interpretation of good data is really starting to haunt me.

    Well, that's no Receiver Function (July 9th, 2012)

    Today was a somewhat frustrating day. Simply put, the receiver function code did not seem to create a decent model, as seen in the attached figure. After changing the water level parameter, as well as adding in the Gaussian filter, I still could not clean up the signal-to-noise ratio enough to let me see the P-S conversions. On the bright side, my advisor and I have a working code that we can build off of and stare at for the next couple of days, to see what we can do that may help fix our problem. This deconvolution method is known as the water level deconvolution method, working in the frequency domain. So tomorrow and the rest of this week, if we do not figure out what is wrong with our present code, I will investigate new receiver function deconvolution methods, such as working in the time domain or using an iterative deconvolution method. I have been doing a lot of reading of numerous journal articles, and I am definitely understanding the process of creating receiver functions; it's just applying that process in a new computer program that is becoming very difficult and frustrating. I guess this is all part of this IRIS learning experience: finding ways to troubleshoot through problems to hopefully come up with a solution. More updates as this issue is hopefully resolved. Greg

    [Figure: "NOT A RECEIVER FUNCTION I WANT"]
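    For context, here is a minimal sketch of frequency-domain water-level deconvolution with NumPy. The synthetic traces, the water-level fraction, and the Gaussian width below are placeholder values, not the parameters from the actual code: the radial spectrum is divided by the vertical spectrum, with the denominator floored at a fraction of its maximum power, and the result is smoothed by a Gaussian filter.

    ```python
    import numpy as np

    def water_level_rf(radial, vertical, dt, water_level=0.01, gauss_a=2.5):
        """Receiver function via water-level spectral division (a textbook sketch)."""
        n = len(radial)
        R = np.fft.rfft(radial)
        Z = np.fft.rfft(vertical)

        # Denominator is the vertical power spectrum, floored ("water level")
        # at a fraction of its maximum so near-zero bins don't blow up.
        denom = (Z * np.conj(Z)).real
        denom = np.maximum(denom, water_level * denom.max())

        # Gaussian low-pass stabilizes the result and sets the RF's frequency content.
        freqs = np.fft.rfftfreq(n, dt)
        gauss = np.exp(-((2 * np.pi * freqs) ** 2) / (4 * gauss_a ** 2))

        return np.fft.irfft(R * np.conj(Z) / denom * gauss, n)

    # Synthetic demo: vertical = impulse-like wavelet, radial = delayed Ps conversion.
    dt, n = 0.05, 1024
    t = np.arange(n) * dt
    wavelet = np.exp(-((t - 5.0) ** 2) / 0.1)
    vertical = wavelet
    radial = 0.5 * np.roll(wavelet, int(4.0 / dt))   # Ps-like arrival 4 s later
    rf = water_level_rf(radial, vertical, dt)
    print("RF peak at ~", t[np.argmax(rf)], "s lag")   # expect ~4 s
    ```

    The water level is the knob the post describes changing: too low and noisy spectral holes produce ringing like the figure above; too high and real conversions get smeared away.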
    Beginning Figures of Receiver Functions (July 9th, 2012)

    Hi everyone, I'd like to update you all on this past week of progress regarding receiver function analysis. Kasper (my advisor) and I have been working with the Python program ObsPy, new to the both of us, but we finally finished the first step and created a figure that contains the vertical, radial, and transverse components of a seismogram (see figure below). This is a great step forward, because now I can use these three components in the next process, known as source equalization. To put it simply, and to practice writing my research for the layman to understand: three components are needed to isolate the receiver response from the other effects that create seismic waveforms. The next step is to use deconvolution, a somewhat complicated mathematical operation using Fourier transforms, to create the receiver function seismogram. Deconvolution takes a signal and removes these effects from the signal, to leave only the pieces of the signal we want. Now all I need to do is add in some filters, both water level and Gaussian, as also mentioned in Erin's blog. Next week I hope to get as far as I can on the process for one receiver function, so I can hopefully copy this process for more than one seismogram. The process is going a little slower than I was expecting, but I guess this is what happens when one is working with a new code and a new program. More to come next week.

    The Third Week (June 28th, 2012)

    Hi! My third week at Boise State has included both collection of data and becoming more familiar with what I plan to do through July. The seismic data we collected from NHS last Wednesday is finally downloaded and turned into the proper file format for Antelope. This data set will be extremely important for seeing whether there has been seismicity related to fluid-flow injection in the geothermal system. I've only briefly looked at this data, but next week I plan on attempting to find events and characteristic waveforms that correspond to these injections. There are a couple of steps that need to be accomplished before identifying these events:

    1. Can I flag and catalog events during the time of data collection, and match these events up with the events in the NEIC database? If so, these events should be removed from my data set, to make the data more manageable in finding local events that were not catalogued in the NEIC database.
    2. Once these larger regional/global events are removed from Antelope's Datascope, I will need to figure out what exactly I'm attempting to look for. I will need to answer the question: what kind of characteristic waveforms are generated from fluid injections in a geothermal system? I am going to search through previous research to see if there is any information on what these waveforms may look like.
    3. Lastly, I will need to figure out which filters are suitable to pick out the clearest waveforms.

    Additionally, I have been reading up on receiver function concepts (Fourier transforms, deconvolution, etc.) to attempt to understand the theory behind how…

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/48 (2015-11-11)

  • IRIS Internship Program - Leah Campbell
    …long to fix most spots, but of course, once you change one set of picks, you have to go through to make sure it corresponds with all the following picks. The next step was to invert the first picks for the refraction data and turn them into a suitable velocity model. Making the inversions isn't too difficult, but it takes the program (written by fellow IRIS interns Amanda and Rachel's mentor) a while to run. One also has to create multiple versions, each with slightly different inversion parameters: initial velocity model, grid size, smoothing parameters, etc. Unfortunately, the GPS readings from the line are all off because of tree cover, so I haven't been able to input the real geometry for the line yet. But because the inversions need elevation data, I had to trace out my line the best I could on Google Earth, record the elevation at points 5 m apart, convert the elevation from feet into inches, and then turn the raw elevation data for each point into distance above the lowest point. It's not a perfect method, but it does let one add a sense of the topography into the inversions. (A small sketch of this relative-elevation step follows below.) Once the inversion program finishes running, I can run a GMT program that turns the inversions into an actual image of the velocity model. Each version produces 30 or so iterations, and so I've had to go through and choose the best iteration for each version and compare them to get the ideal velocity model. However, I realized last Friday that I had reversed the elevation data, so it ran East-West while my shot point data runs West-East. That meant I had to go through and recalculate the elevations on Google Earth and rerun the inversions. I would have then gone through all of my velocity models on Friday, but I had to leave that until today, since I left work early on Friday to drive up to Yosemite for the weekend with my family. I've been in California 3 years and have never made it up, but needless to say it was an AMAZING weekend. Of course, it made getting up early again this morning for a Monday at work quite difficult.
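    Here is a minimal sketch of that elevation bookkeeping, with invented numbers standing in for the Google Earth readings taken every 5 m: convert each reading and rebase it to height above the lowest point on the line.

    ```python
    # Hypothetical elevations read off Google Earth every 5 m along the line, in feet.
    elevations_ft = [412.0, 415.5, 421.0, 419.2, 430.8, 444.1]

    FEET_TO_INCHES = 12.0   # the post converts feet to inches before rebasing

    elev_in = [e * FEET_TO_INCHES for e in elevations_ft]
    lowest = min(elev_in)

    # Distance along the line (meters) paired with height above the lowest point.
    profile = [(i * 5.0, e - lowest) for i, e in enumerate(elev_in)]
    for x, h in profile:
        print(f"x = {x:5.1f} m   relative elevation = {h:7.1f} in")
    ```

    Rebasing to the lowest point is what lets the inversion see relative topography even when the absolute GPS elevations are unreliable under tree cover.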
    Week Seven (July 20th, 2012)

    I can't quite believe seven weeks of my internship are already done. I still have four weeks left, but I feel like the half-way point of the summer just kind of flew by. Things have been pretty slow this week, though looking at my action plan for the rest of the summer, I know it's sure to pick up by next week. Since you can't remotely access the USGS server unless you're on a USGS laptop, I have to get all of my processing, interpretation, and figure making done by the end of the summer. I'm not going to worry at all about my poster until I'm back at school in the fall. Abstract deadlines are also coming up, so I really want to get through a lot in the next two weeks to help add substance to my abstract.

    As I mentioned, this week was quite slow: I was picking the entire week, essentially. I've never picked first arrivals before, so it was a pretty tedious week, spent mostly going over my own mistakes. I finally got through all of my shots this morning, but now I'm just going through them again to make sure they all make sense as a consecutive series. Hopefully then, this afternoon, I'll be able to learn a bit about the next step in processing: reciprocity. To improve and check the first picks, we use the basic physical fact that the time taken to travel between two points is the same no matter which direction you come from. We can check the first picks for one shot by looking at the time it takes the energy to get from geophone A (the shot) to geophone B, and comparing it to the time it takes energy to get from the later shot at geophone B back to geophone A. Hopefully, between that and killing any traces I couldn't pick, I will weed out most of the bad picks, because there are a few. (A small sketch of this reciprocity check follows this post.)

    In the meantime, I realized I never looked back at the goals I mentioned in the first two weeks. Now that the half-way mark has come and gone, I decided it might be a good idea to go back briefly to some of my primary goals.

    I knew I was going to be using a program called ProMax as well as a lot of basic Unix, so a large goal I had for myself was just to become comfortable and self-sufficient in both, given that I don't have a strong programming background. I've definitely worked on this one a lot, and I feel much more comfortable with both Unix and ProMax. I've learned to find mistakes and fix them myself, and I have a better grasp on what specific flows and codes mean.

    On my final product, I knew Adobe Illustrator and GMT would be involved, so I hoped I could figure those out as well. I haven't touched Illustrator or GMT all summer, but I know I'm still going to have to. One of the guys in our office has codes in C that will create GMT scripts, plus Dulcie has been willing to let me look through some of her scripts to get a better grasp on GMT, so hopefully I'll be okay on that front. With Illustrator, this may be something I have to figure out when I'm at school and trying to throw together some maps for my poster.

    I wanted to become more comfortable with doing fieldwork and learn to understand all of the little details that go into both planning and doing real fieldwork. I think I've also had success with this one. We went out 4 or 5 days, and may be going out a bit more in August, and I've become familiar with all of the tools we use, what each instrument does, and how to take care of and install the instruments. Plus, for two of those days I was considered the PI, so I had the opportunity to put together and organize a field team, which was an interesting experience when everyone is years older than you.

    Finally, I wanted to be able to understand how this kind of work can actually advance our understanding of seismic activity and fault hazards. I haven't gotten to interpretation yet, but looking through past papers and posters, and making an action plan for the last few weeks, has helped me understand why we do each step in the processing. I've also gotten a better grasp, I think, on why the USGS does this and what results they get from these surveys that they deem worthy of being released to the public.
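    Here is a minimal sketch of that reciprocity check; the pick times and the tolerance are hypothetical placeholders, not values from the ProMax workflow itself. It pairs the pick for the shot at A recorded at B with the pick for the shot at B recorded at A, and flags pairs whose times disagree by more than the tolerance:

    ```python
    # Hypothetical first-break picks in milliseconds, keyed by (shot, receiver).
    picks_ms = {
        ("A", "B"): 41.8, ("B", "A"): 42.1,   # consistent pair
        ("A", "C"): 67.0, ("C", "A"): 58.4,   # one of these picks is bad
    }

    TOLERANCE_MS = 2.0   # placeholder threshold for an acceptable mismatch

    checked = set()
    for (src, rec), t_forward in picks_ms.items():
        if (src, rec) in checked or (rec, src) not in picks_ms:
            continue
        checked.update({(src, rec), (rec, src)})
        t_reverse = picks_ms[(rec, src)]
        # Travel time should be direction-independent (reciprocity).
        if abs(t_forward - t_reverse) > TOLERANCE_MS:
            print(f"re-pick {src} <-> {rec}: {t_forward} vs {t_reverse} ms")
    ```

    The check can only flag the pair as inconsistent; deciding which of the two picks is wrong still takes a human looking at the gathers.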
    Week 6 (July 13th, 2012)

    This week I've continued with the pre-processing of my data set, and I'm finally starting to get the hang of using the Vi editor and ProMax. I'll admit the week did get off to kind of a slow start, and I ran into a few problems, mostly caused by my own carelessness. On Monday we had a power outage, and then the madmax server, which we need to get onto ProMax, went down, so I had to wait a few hours for that to get rebooted. I did spend some of the time talking to my advisor, though, about creating a working plan, now that the preprocessing is finishing up, of things that I want to do to the data and what I hope to get out of it. I've been going through a few more of my advisor's papers, and papers published by the team here, as well as AGU abstracts they've written, to get a sense of the usual processing steps. It first involves getting through the pre-processing and then through first break picking, but from there I will have to migrate and stack the data, use reciprocity to make up for bad and skipped shots, and possibly use deconvolution to improve the resolution of my images. I think I understand all of the concepts fairly well, but we'll see how it goes when I actually have to start applying them using ProMax flows. It was useful, though, to get a sense of what I need to get done in the next five weeks, and also to start thinking about what kinds of images and text I'm going to want on my poster.

    Finally, I was able to get back onto madmax and learn the next few pre-processing steps. On Tuesday I ran into a wall when one of the flows kept freezing on me, but it was soon pointed out to me that the file I was trying to import was merely empty: the program I used had malfunctioned and not copied anything to the file. Then I kept getting errors on another flow, but soon realized it was only because I was using the wrong instrument number. And then, of course, I spent a while looking at the output file of one of the flows responsible for removing repeated shots, before remembering that I didn't have any repeated shots and that the output file was supposed to be essentially empty. Those little silly mistakes have been a lot of what has been slowing me down, but once I got through those, I was able to get the fake geometry all set up on ProMax and then get rid of any bad shots from my data. Technically, once the fake geometry was done and the necessary files were imported onto the system, I was supposed to upload the real geometry, which is based on the actual coordinates of the geophones. But since I haven't gotten that file yet from our GPS guy, I had to upload a new, less fake geometry. It still uses the spacing between the geophones as the coordinates, but this time it bases the geometry on all 118 stations, rather than a standard 60 stations. I then had to go through all of the same uploading and set-up steps from before, and this time run a flow that combined and resorted the data from both RXs together. At that point, I needed to artificially move up every shot to the same time, to compensate for any variations in shooting. To do this, I just had to run a few more flows and ProMax code to create a spreadsheet of the amount of time each shot needed to be shifted; we wanted every shot to be as if it had taken place at 2 ms. I then had to read this spreadsheet onto ProMax so it would shift all the shots. (A small sketch of this spacing-based geometry and shot-time shift follows this post.) In the same flow, I was also able to kill all of the bad traces I had found previously. This flow also did automatic first break picks on the trace each shot originated from. With this, I could make sure that the first picks looked good, that the polarity (the direction the geophones were facing) of the traces was all the same, and that each shot was recorded at the correct geophone. Once all of these checks had been done and the shots had been shifted, I was able to begin the next flow: first break picking. I have quite a lot of shots, and my data is quite noisy, but hopefully I'll have picked all my first breaks for the P-wave line by early next week. Of course, I ran into lots of little problems along the way, but it's amazing how much better I've become at actually identifying and solving the problems myself. I'm also a lot better at understanding what the actual issue is when someone points out a problem to me. So it's still slow going, but I would say I've definitely made progress.
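    Here is a minimal sketch of those two bookkeeping steps: the spacing-based "fake" geometry, and the shift of every shot to a common time. The 5 m spacing, 118 stations, and 2 ms target come from the posts; the pick times are placeholders, and the real work was done with ProMax flows, not Python:

    ```python
    import numpy as np

    N_STATIONS = 118        # all stations on the line (from the post)
    SPACING_M = 5.0         # geophone spacing along the line
    TARGET_MS = 2.0         # every shot shifted as if it fired at 2 ms

    # "Fake" geometry: first geophone at (0, 0), each subsequent one 5 m down x,
    # to be replaced later by real GPS coordinates.
    geometry = {sta: (sta * SPACING_M, 0.0) for sta in range(N_STATIONS)}

    # Placeholder actual shot times in ms; the real values came out of a ProMax flow.
    shot_times_ms = np.array([3.7, 1.9, 2.4, 5.1])
    shifts_ms = TARGET_MS - shot_times_ms   # the per-shot "spreadsheet" of shifts

    for ffid, shift in enumerate(shifts_ms):
        print(f"FFID {ffid}: shift traces by {shift:+.1f} ms")
    ```

    Keeping the shift table as its own artifact mirrors the spreadsheet step in the post: the shifts are computed once, inspected, and then applied to every trace in a later flow.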
Week Five (July 10th, 2012)

Last week I continued with the pre-processing of my data set in ProMax. Because we shot the line backwards (they like them to run W to E), and the line was long enough that we had to use two RXs and two cables of takeouts, there are a lot of little things I need to get my head around while using ProMax, which makes the process move a bit slowly.

Again, at the beginning of the week I sat down with the main ProMax guy on our team and took notes as he went through his own data set. We also went through the log files I had made the week before to confirm that there were no mistakes. In the files I have to keep track of which stations are on which cable, how the new FFID numbers match up to the old, and how many takeouts we want to be read by each RX. When we switch over to the other cable we also switch the placement of the RXs and what they're recording, because one of them is apparently much better than the other. That process was relatively straightforward, but it involved a lot of thinking about how the RXs were placed and how the switch midway down the line would have affected which stations were being recorded.

Once the log files looked good, I was able to get onto ProMax and go through all of our shot gathers. For each old FFID, which corresponds to each shot, I had to go through the shot gather and find bad traces, which would correspond to bad takeouts. I had to do this for all 135 shots on the one RX, and then for all the shots around the middle point for the other RX; because it is less sensitive, the data from this instrument is not nearly as good. Once I had a sense of the bad traces, I began looking at receiver gathers. These look a lot like shot gathers, but instead of showing every trace for each individual shot, they show every shot recorded by each individual trace. I then went through all the traces I had thought were bad from the shot gathers four times: on each instrument I had to look at the receiver gathers from before and after the switch independently. From these receiver gathers I could get a sense of the range of shots over which the traces were bad, and make note of which traces we were going to ignore for which shots.

Once I had made another log file with all of this information, I was able to begin putting the information from the log files into ProMax. To begin with, I just have to use a fake geometry, which calculates the coordinates of each geophone from the geophone spacing. The first geophone is arbitrarily taken as (0, 0), and then each subsequent geophone moves 5 m down the x-axis (a small sketch of this follows at the end of this post). Eventually I will get a file with the real coordinates of each shot point, found using GPS, and input this into ProMax. In the meantime, I used a few programs already on the system to create a few basic shot files that take note of how many old FFIDs, stations, and new FFIDs I have. I then input this onto the ProMax server using a series of flows that have to be run independently for each RX instrument. Once this information has been fully input (right now there are a few input commands I need to find and enter into the ProMax system which should have been set automatically), I will be able to check that there are no strange offsets recorded on any geophone, which could mean the geophone moved during shooting, and I will be able to remove any bad traces and shots, as well as put in blank signals for any skipped shots.

And then, of course, happy late 4th of July to everyone! I can't say I did anything too exciting besides a barbecue in the backyard with my family, but it was definitely nice to have the day off.
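For what it's worth, the "fake" geometry above amounts to a one-line calculation. Here is a hedged MATLAB sketch, with the station count and spacing taken from the post and everything else assumed:

    % Fake geometry: station coordinates from spacing alone (not real GPS).
    % First geophone at (0, 0); each subsequent one 5 m down the x-axis.
    spacing = 5;                 % geophone spacing in metres (from the post)
    nsta    = 118;               % stations on the line (from the post)
    x = (0:nsta-1)' * spacing;   % x coordinate of each geophone
    y = zeros(nsta, 1);          % straight line along the x-axis
    geom = [(1:nsta)', x, y];    % columns: station number, x (m), y (m)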
Week Four (July 2nd, 2012)

Last week was interrupted by more fieldwork right in the middle, but I did finally start looking at real data. On Wednesday we went back to Point Año Nuevo to do another seismic survey on the San Gregorio Fault. We were slightly further north than last week, and this time we were actually on private land, on the historic Cascade Ranch. The farmers let us run the survey right through their land, partly because they're hoping to use the subsurface images we produce to precisely locate the water table in order to dig more water wells for the farm. We used the same procedure as last week, and our line was again approximately 600 m, but this time we did the entire data collection process, S and P waves, in one very long day.

On a side note, the day before we went out I had another learning experience when one of the government trucks we were driving to the warehouse died on us right in the middle of an intersection. Between the survey site and the warehouse we had to get out two or three times to push the truck and jumpstart it. Most of Tuesday afternoon was just spent getting the truck to the warehouse, loading it up, and waiting for it to be fixed so we could get back

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/49 (2015-11-11)

  • IRIS Internship Program - Lily Christman
about everything I have done, I am probably most excited about my work with MATLAB. When I actually think about the different things I have used MATLAB to do, I realize how much I have learned about it. I have a whole folder of MATLAB scripts, written mostly by myself, to do the various tasks I needed to do this summer. While they are not the most efficient scripts, I have definitely learned a ton, and I hope that in the future I can learn more tricks and ways to improve my coding so that what I write is more efficient.

Hello Again, World (July 18th, 2012)

Sorry I haven't posted in a while; I took the week off to head home for the 4th of July and am working to get back into the swing of things. It was a very nice, relaxing week with lots of good food, company, swimming, and sun.

Last week back at work, I read more thoroughly through a few papers which analyze electromagnetic data and conclude that they see precursor anomalies, in order to come up with a list of the processes they used that I can replicate on our data. The first analysis I am going to focus on is looking for a certain type of pulsation characterized in Bleier et al. (2009). What I understand so far is that the pulsations have amplitudes greater than twice the site background noise, are distinctive in duration (1 to 30 seconds), and exhibit odd single-polarity and bipolar waveforms. I am going to look in our data for pulses like this.

At the end of last week and into this week, I have been figuring out how to download my raw data and filter it. Caroline, another intern in my lab, taught me the basics of SAC so I could determine the times at which a bunch of earthquakes, chosen using my earthquake graph, arrived at our stations, so that I would know where to look in our EM data. I had to use some filters on the seismic data to see the arrivals more clearly. I figured out how to use what is called a bandpass Butterworth filter. The code I type in SAC is "bandpass butter corner 0.33 10", where "corner" and the numbers indicate the range of frequencies I want to KEEP in my data. (I am very entertained every time I type "butter".) I also figured out how to open our raw EM data in MATLAB and have written a bandpass Butterworth filter in MATLAB to filter our EM data in the same way as the seismic data, because the frequencies of the seismic and EM data are about the same (a sketch of both steps follows at the end of this post).

Simon also has me beginning to write up and organize my abstract and the things that will go on my poster for AGU, like an introduction, any background, and the graphs I think I will have. This is making me realize that I am halfway through my internship, which is pretty intense. Since I haven't produced any concrete graphs or results recently, I have been feeling slightly like I am not moving forward, but when I take a step back, like I did when writing this blog post, it makes me realize how many different things I have been working on to get me ready to produce actual graphs and results. I'm excited to see what I find!
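Here is a minimal MATLAB sketch of those two steps: the 0.33-10 Hz Butterworth bandpass, then a crude scan for pulses more than twice the background level lasting 1 to 30 seconds. This is a reconstruction under stated assumptions, not the intern's actual script: the sample rate, the variable raw (assumed to hold the raw EM samples as a column vector), the filter order, and the use of the median as the background estimate are all assumptions. The functions butter and filtfilt require the Signal Processing Toolbox.

    % Bandpass the raw EM trace the same way as the SAC command above.
    fs = 50;                                  % sample rate in Hz (assumed)
    [b, a] = butter(4, [0.33 10] / (fs/2), 'bandpass');
    x = filtfilt(b, a, raw);                  % zero-phase bandpass filtering

    % Crude pulse scan in the spirit of Bleier et al. (2009): samples more
    % than twice a rough background level, grouped into runs of 1-30 s.
    bg     = median(abs(x));                  % rough site background level
    above  = abs(x) > 2 * bg;                 % mask of loud samples
    d      = diff([0; above(:); 0]);
    starts = find(d == 1);                    % first sample of each run
    stops  = find(d == -1) - 1;               % last sample of each run
    durs   = (stops - starts + 1) / fs;       % run durations in seconds
    keep   = durs >= 1 & durs <= 30;          % the 1-30 s duration window
    pulses = [starts(keep), stops(keep)];     % candidate pulse sample ranges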
Poison Oak or Mosquito Bites? (June 26th, 2012)

I had a really great time out in the field for two days with Leah and about 20 other people from the USGS this week; Eva was there too for one of the days. We were working on a seismic survey down south along the coast in Año Nuevo Park, on a 600 meter array along a service road with a beautiful view of the ocean off in the distance. Leah was the PI on the project, so it was pretty neat to see all these people, many of them adults, working to get data for her to analyze. I worked on putting in and taking out geophones, using a hand auger for the P wave shot source, and avoiding the poison oak plants that were EVERYWHERE. I currently have some itching on my elbow, but I'm pretty sure it's a couple of mosquito bites (fingers crossed). As exhausted as I was from two very long and active days, it was nice to get out of the office and see a beautiful coastal area and some cool geophysics field techniques. Thanks, Leah!

Back to work at my cubicle wasn't too bad, however. I am still working on figuring out which earthquakes make sense to look at in our data. I have a pretty nice looking graph that gets more and more colorful as I add earthquake events to it. In response to Katie's question on my previous post: I will most likely start by working off previous work, in the sense that I will look at how other people have analyzed their data and begin by doing that to some of our data. It's looking like we really haven't had any earthquakes recently that would be large enough to show us anomalies over the noise of the area on our stations. I will most likely not see any anomalies associated with earthquakes, but even that will be good to report, so we can begin to establish the size and proximity of earthquake needed to see any possible anomalies (essentially a magnitude and distance cut; see the sketch at the end of this post).

Here is what my graph is looking like. I talked with Simon about a good range of earthquakes to examine based on the graph. I first have to catalog a little more, so that I can make sure we have good data from our stations on the dates of the earthquakes I would be looking at. Then my plan is
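Picking which catalog events are "large enough and close enough" is essentially a magnitude-distance cut. Here is a hedged MATLAB sketch of that selection step, where the catalog layout, the station coordinates (slat, slon), and both thresholds are purely illustrative placeholders:

    % Keep catalog events big enough and close enough that an EM anomaly
    % might rise above site noise. catalog columns: [lat lon depth_km mag].
    rad     = @(deg) deg * pi / 180;
    dlat    = rad(catalog(:,1) - slat);
    dlon    = rad(catalog(:,2) - slon);
    h       = sin(dlat/2).^2 + ...
              cos(rad(slat)) .* cos(rad(catalog(:,1))) .* sin(dlon/2).^2;
    dist_km = 2 * 6371 * asin(sqrt(h));              % haversine distance
    keep    = catalog(:,4) >= 4.0 & dist_km <= 100;  % placeholder thresholds
    events  = catalog(keep, :);                      % candidate events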

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/50 (2015-11-11)

  • IRIS Internship Program - Erin Cunningham
have a bit more to share at the end of this week.

Receiver Functions with Albert (July 15th, 2012)

I spent this week running the MATLAB code for the receiver function parameters. It has taken me much longer than predicted. Unfortunately, I have been having problems creating Illustrator plots, but resolving this will be part of my goals for next week. While the codes have been running, I have taken the time to write up the general goal of this part of my project, as well as what changing each parameter does. The basic idea is that getting good receiver functions that can be easily interpreted requires knowing which parameters can be changed and how changing each one will affect the data. When changing parameters, the goal is to minimize the noise without changing the integrity of the deconvolution (a generic sketch of the deconvolution itself follows at the end of this post). For the parameter tests, only the parameter being tested will change; all other variables will remain constant.

Three stations have been chosen to undergo parameter changes. The first is CBKS, a permanent US-network station located in Cedar Bluff, Kansas. This station has been active for 17 years and therefore has a lot of data to offer; underneath the station lies basement geology of consolidated sediments. The second station is MPH, from the New Madrid (NM) seismic network, located in Memphis, Tennessee. This station has been collecting data since 1998 and, unlike the US station CBKS, it lies on unconsolidated sediments along the Reelfoot rift. The final station is V42A of the Transportable Array (TA) network; it has only been around since 2011, so it has not collected much data. Like the NM station MPH, it also lies on unconsolidated sediments around the Reelfoot rift. The goal is to compare each station with itself as different parameters are changed, as well as to compare the receiver functions of the stations against one another. Things to consider when the stations are compared include the length of time each station has been active and the geology underneath it. After all of this has been analyzed, the next task is to determine the minimum magnitude event that can still provide a significant number of viable seismograms for each station.

Like Greg mentioned in his blog, I'm really glad that there are so many good papers on receiver functions. Although I was getting bored of reading papers at the beginning, it has really helped me gain a background in receiver functions, and it's exciting to know exactly where to look when you aren't sure about something. On that note, I'm going to take the rest of this wonderful Sunday afternoon to go read about the crustal structure along the North Anatolian Fault zone with Albert down the street. Al's such a great listener. Until next week!
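For readers unfamiliar with the parameters being tuned, here is a generic frequency-domain, water-level receiver function deconvolution in MATLAB, a textbook-style sketch rather than the code actually used here. R and Z are assumed to be equal-length radial and vertical component traces (column vectors), fs the sample rate, wl the water-level fraction, and gauss_a the Gaussian width; all of the names are stand-ins.

    % Generic water-level receiver function (frequency domain). The water
    % level wl floors the vertical-component power spectrum so the spectral
    % division stays stable; the Gaussian of width gauss_a sets the bandwidth.
    n   = 2^nextpow2(numel(Z));
    Rf  = fft(R, n);
    Zf  = fft(Z, n);
    f   = [0:n/2, -(n/2 - 1):-1]' * (fs / n);       % two-sided frequency axis
    den = abs(Zf).^2;
    den = max(den, wl * max(den));                  % apply the water level
    G   = exp(-(2*pi*f).^2 / (4 * gauss_a^2));      % Gaussian low-pass filter
    rf  = real(ifft(G .* Rf .* conj(Zf) ./ den));   % receiver function in time

Re-running this with different wl and gauss_a values (say, 0.01 versus 0.001, or 2.5 versus 1.0) is exactly the kind of one-parameter-at-a-time comparison described above.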
The Beginning of the End for Single Receiver Functions (July 6th, 2012)

I hope everybody enjoyed a short week due to the Fourth of July and was able to see some awesome fireworks. During this past week I have moved back to IRIS headquarters for a while as my host at Maryland travels around Europe (not all for fun and games, apparently; some seismology is involved as well). I spent the beginning of the week updating the receiver function codes on the computer at IRIS so that they matched the changes I made in the codes at the University of Maryland. Much of the rest of this week has been spent working on and starting a number of small projects. A few that I started working on are creating a KML file of the TA stations I will be using (a sketch of what that might look like is at the end of this entry) and deciphering the CCP stacking code I will be using on the TA stations in the future.

I felt as if I was starting too many new things before finishing anything, so at the very end of this week I decided to finish up single receiver functions for the permanent stations. Throughout the process of calculating receiver functions, I have had to make decisions on what the parameters should be, and have spent time figuring out which parameters will work best for my data. Some of these parameters include what the signal-to-noise ratio is, what frequency range is included, what the minimum allowed spectral amplitude (the water level) is, what Gaussian is used in the deconvolution, and whether to use the time or the frequency domain in the deconvolution. I wanted to wrap up this part of my project by producing some final figures outlining the effects of changing each parameter, and by writing a small paragraph explaining what changing each parameter does. I hope to add this as a small part of my end project. My goal for next week will be finishing all of this up and having some great, hopefully poster-ready, figures.

As for the weekend, I plan to attend a free jazz-in-the-garden concert at the sculpture garden just a few blocks from IRIS headquarters, and otherwise to just stay out of the heat as much as possible; it's supposed to be 105 F tomorrow.

Red, White, and Back Azimuth (June 29th, 2012)

Happy Fourth of July from the nation's capital! The title of this week's blog is not only for the Fourth, but also for overcoming the first problems I encountered with the MATLAB code during the second week of the project, when the azimuth was being calculated incorrectly. Luckily, since then I have had no major problems figuring out the code and running it for other stations. This week has been challenging, as I have run the same calculations many times, but I am glad to finally be done, for now, with calculating the receiver functions for the permanent stations.
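Since the post mentions building a KML file of the TA stations, here is a hedged MATLAB sketch of the bare minimum that involves. The file name and the staNames/lats/lons arrays are invented for illustration, and the real code may well have used an existing tool instead:

    % Write a bare-bones KML file of station locations.
    % Note KML wants coordinates in lon,lat order.
    fid = fopen('ta_stations.kml', 'w');          % output name assumed
    fprintf(fid, '<?xml version="1.0" encoding="UTF-8"?>\n');
    fprintf(fid, '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n');
    for k = 1:numel(staNames)                     % assumed cell array of codes
        fprintf(fid, ['  <Placemark><name>%s</name>' ...
                      '<Point><coordinates>%.4f,%.4f,0</coordinates></Point>' ...
                      '</Placemark>\n'], staNames{k}, lons(k), lats(k));
    end
    fprintf(fid, '</Document>\n</kml>\n');
    fclose(fid);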

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/51 (2015-11-11)

  • IRIS Internship Program - Noor Ghouse
weeks turn out. Noor

The Road So Far (July 5th, 2012)

Hey everyone, sorry for the late blog; there's just been so much going on lately. OK, so last time I blogged, I left off on mapping earthquake events by latitude and longitude. I had to make multiple lat/lon plots to review the accuracy of picks associated through Antelope. After deliberating with Mike, we came to the conclusion that the parameters that don't pick too many events, yet keep a trend in the mapped data, would be the most suitable to run through six years of seismic data. Since then, we've been running the parameters that best locate earthquakes, and I have been reorganizing the arrivals.

As I progress through the internship, some of my goals have been accomplished:

1. As the weeks move on, I have gotten better at GMT and scripting. I still have a long way to go before I can call myself the Master of GMT, but I'm now comfortable and capable working with them.

2. The amount of progress I make in a day has increased as I get familiar with coding and analyzing earthquakes.

Similar to David and Greg, I've been analyzing earthquakes through Antelope. Antelope is very familiar to me because I've used it for previous research. The only difficult task that comes with Antelope is uncompromising residuals from beautifully picked arrivals.

Since the internship started, I've been enjoying my summer at my home university. A bunch of the undergrads in the geology department are still in Oxford, so the summer has consisted of burrito nights and fun times. I also went to my hometown a couple of weekends ago for a family friend's wedding, which was a nice break from the lab. I'm really excited to see where my work will take me in the next couple of weeks. Back to work! Noor

I've Got a Feeling We're Not in New Mexico Anymore (June 10th, 2012)

Alas! After a long week of unwinding from orientation, unpacking, and starting up research, I can finally sit down and blog about my first week. Being based at my home university is amazing because I can move into the apartment I'll be living in for the next year. It's also good because I'm familiar with the campus and the grad students working in the department. Before Monday, I showed my fellow IRIS buddy and labmate Calvin the life and sights of Oxford. I also get the privilege of working near my two apartment mates for the summer, Katie and Sarah. The weather in Oxford is usually sunny and warm, so it's perfect weather to eat out on the front steps of Shideler while watching the eager incoming freshmen leave the student center every day during their orientation. After doing further reading on the details and importance of

    Original URL path: http://www.iris.edu/hq/internship/blogs/user/52 (2015-11-11)


