Linux/UNIX and FTOOLS Tutorial
THE LINUX OPERATING SYSTEM
(I) An Introduction
All manner of research begins with one thing: raw data. Only from raw data can phenomena be defined, models built, and theories and laws deduced. Because of the nature of the sources you will be researching, your raw data will come in the form of x-rays collected by satellites in orbit. These data are conveniently catalogued in what are known as Flexible Image Transport System files, or FITS files for short.
However, as raw data, FITS files in their original form are as useless to us as long rows of random numbers. Their useful information must be extracted and put in a form which is (relatively) easy to study. This is where a handy-dandy software package known as FTOOLS comes in. FTOOLS is specifically designed to view, study, and analyze astronomical image files such as FITS files.
So now, you say, all we have to do is learn how to use FTOOLS and we’ll have all the tools to commence research, right? Not so fast. Before we even begin to discuss FTOOLS, we need a well-rounded understanding of the operating system FTOOLS was designed to run on: Linux. And finally, there is the question of where we get these amazing FITS files in the first place.
That’s where this tutorial comes in.
(II) A Note About This Text
The emphasis of this tutorial is on the “how,” not the “what,” of astrophysics research. In other words, the information you will find contained here is mainly procedural. At most points, an attempt will be made to explain what you are doing, what you are seeing, and the reasons behind both in simple terms. However, it is likely that not all of your questions will be answered.
This is by design, so don’t panic. We felt it better to introduce you to the nuts and bolts of astrophysics research before giving you the meat and potatoes. The meat and potatoes will come later, so I ask you to accept what we will be doing provisionally.
Background on Linux/UNIX
First, a little history about UNIX. Brian Kernighan was the first computer researcher to use the term UNICS, meaning Uniplexed Information and Computing System, in 1969. This term somewhat aptly described the multitasking operating system Dennis Ritchie and Ken Thompson developed for the DEC PDP-7 machine at Bell Labs to run a solar system simulation game called Space Travel. This UNICS also had a simple command interpreter and file system. The name stuck after a slight change to the present day UNIX a year later. With more support from Bell Labs in the early 70's, UNIX grew to include a simple text editor and typesetter, a programming language (C), and several utilities/tools such as pipes. System V UNIX was first commercially released in 1983 by AT&T, while a somewhat different flavor had been developed at Berkeley since 1974. Known as the Berkeley Software Distribution (BSD), it was widely distributed and became the foundation for some operating systems like SunOS and Solaris.
While there are/have been efforts to standardize UNIX, there are several distributions or flavors of the operating system. The variant gaining popularity in the past few years is Linux, developed initially by Linus Torvalds in 1991 and designed to run primarily on IBM-compatible PCs. Linux also distinguishes itself because it has always been free (except for user's manuals or CD installations).
The overall UNIX philosophy has been to develop a relatively small, flexible operating system. The operating system consists of drivers operating the hardware (which we won't worry about) called a kernel, a command interpreter called a shell, a file system used to store or display information, and a set of compact utility programs called commands with which the user interacts with the shell. Typically, a shell command is converted to simple system calls, like enabling a keyboard, interpreting a keystroke, preparing a printer, and so on, so that one could think of the shell and kernel as a bridge between the user and hardware.
Finally, UNIX is unique in that it was always designed to be used in shared environments. This means that several people share the system resources, like disk space, processing time, and so on. More crucially, this means that you can share information, and observe the same file with someone else simultaneously. This is also risky, so UNIX has a few built-in safeguards.
We will spend most of our time using the rather cryptic commands for which this operating system is known. This somewhat intimidating feature of UNIX is a direct result of its flexibility, but distinguishes it sharply from other popular operating systems you would see in home computers, like Windows 95/98 or MacOS. However, Windows 98 still allows one to use DOS, the pre-Windows operating system for IBM-compatible PCs popular in the 80's, which has many striking similarities to UNIX (in addition to annoying, subtle differences). The most difficult thing about UNIX (or DOS) is that there are no/few pictures and no/few pull-down menus, so it has the reputation of not being user-friendly. (While the shell concept is still very much alive in UNIX, there are now many XWindows or OpenWin applications and even desktops which are menu- or icon-driven, such as the Common Desktop Environment that we're using for the terminals in this room.) Because there is no guarantee that you will encounter such UNIX environments at school or at home, and because the version of Linux we are using really isn't very windows-oriented, we'll stick to learning the UNIX shell commands.
(III) Getting Started
The terminals you see around the room have no independent hard drive themselves, so no data is stored in them. In order to begin using Linux, you’ll have to log on to a host.
1. Most of the time when the terminals are not in use, you’ll see a screen saver which is usually a simple star field. Wiggle the mouse or hit the space bar to get out of the screen saver.
2. You should see a gray window titled “Default Hosts” in the middle of a blue screen. Double-click on “physics.”
3. The next screen should have a black background with a gray welcome prompt. You should see the message, “Welcome to remote host physsunN,” where N is a number between one and nine. For example, your message might say, “Welcome to remote host physsun5.”
4. In the prompt, type your username and hit enter. Now type your password and hit enter again. Note that no evidence of your typed password will appear on screen, not even the usual asterisks you’re probably used to. Just be careful, and if you think you’ve messed up, you can always start over.
5. If this is your first time logging on, you will be prompted for what kind of desktop you wish to use. Choose the “Common Desktop Environment,” as this is the more user-friendly of the two, and proceed.
6. After another welcome screen (in quite a few languages; see how many you know), you will see your desktop area. Several windows may open. As most of them are rather useless to you right now, you can close all of them down by clicking on a window and hitting Alt-F4 or double-clicking in the top left-hand corner of each window.
7. Now that you’re officially logged in, time to introduce you to the command line interface you’ll be using throughout the program. Right-click on the desktop; a small sub-menu should open up. Click on “Tools” and then in the new sub-menu that appears, click on “Terminal.”
8. This will open a terminal window running a shell. A prompt should appear in the form “physsunN%,” where N is a number between one and nine. If you are familiar with DOS, then this is equivalent to the C:\> prompt in that operating system.
9. At the end of each day or each session on the computers, you should log out. To do this, right-click on the desktop and click on “Log Out.” Confirm that you wish to log out. You should be returned to the original blue screen you started with.
(IV) A More Detailed Look at Linux/UNIX
Files
Data in any operating system is stored as a set of discrete files. These files are strings of binary code, ones and zeros, which when run through the correct program can output everything from text documents and images to light curves and energy spectra. To distinguish one from the other, each file is given a name followed by a dot “.” and then an extension. The extension is like the “genus” of the file, denoting what type of file it is, either text, image, or whatever. The more common file types you will be dealing with are listed here along with their appropriate extension:
.fit Image files commonly used for data taken at national telescope laboratories.
.gz A file compressed with the gzip command.
.ps Image files in postscript code.
.tar A tape archive file; groups of many files stored together for easy transfer.
.txt Text document.
.Z A file compressed with the compress command.
Directories and Subdirectories
Files in UNIX are separated into specific groups known as directories. These directories can have their files further divided into more specialized groups within them called subdirectories. It is this organization of files, directories and subdirectories, that makes up the UNIX file system tree (see Figure 1).
The best way to picture this file system tree is with the old “filing cabinet” analogy. The computer is an old fashioned filing cabinet. The filing cabinet as a whole is represented by the root directory “/” above. Directories one level below the root directory, such as usr, home, sbin, and spool, are like each drawer of the overall filing cabinet. One level below that and you have individual file folders. These are the subdirectories matilsky, etkina, vjacobs, and others. Inside these file folders, you can have files or other folders. In the real world, these would be grade reports, official records, etc. In computers, these can range from tao.txt to w3browse-13986.tar.
Paths
Now that we have an organization of files, we need a way to specify the location of these files. That’s where paths come in. Paths are, quite simply, the “coordinates” of a file in the UNIX file system tree, and give the sequence of directories and subdirectories that need to be traversed in order to reach a certain subdirectory or file.
There are two types of paths, absolute paths and relative paths. Absolute paths indicate the path from the root directory. Since the root directory is represented by a slash, an absolute pathname always starts with a slash. For example, /home/matilsky/ is an absolute path because it shows that the subdirectory matilsky is under, a) the root directory “/”, and b) the directory home (a subdirectory of root).
Relative pathnames, on the other hand, give the location relative to the current working directory. Thus, if the current directory was home, then the path to matilsky would simply be matilsky/.
For example, look at Figure 1 below. Imagine for a second that in the directory etkina there is a file called astrophysics.txt. How would we write the path to this file? Here it is: /home/etkina/astrophysics.txt.
Figure 1: Typical Linux/UNIX file system tree.
(V) Putting It All Together
Now that you have a basic understanding of the UNIX file system, it’s time to learn how to manipulate it, navigate through it, and use it to your advantage. All of this is done through special commands you input that the operating system can understand and which will eventually be passed on to the system hardware.
Introduction to Commands
Commands are simple text statements that are typed into the prompt in the shell which can be followed by modifiers and arguments. The typical format for a command is as follows:
% command -modifiers arguments
The command is usually a short, one-word statement, often an abbreviation of a longer statement that is (usually) easy to remember and easy to type. Modifiers always follow a minus sign “-” and are usually single letters that slightly alter the behavior of the command. And finally, arguments are the basic input the commands require. What this boils down to is that commands are simply coded instructions for the operating system.
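For example, in the command below, ls is the command (we’ll meet it properly in a moment), a and l are modifiers, and /home is the argument:
% ls -al /home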
Notation
Before we get too far, let’s point out that this tutorial uses a language all its own, and you have to be able to understand it as fast as I dish it out. The most common of this text’s “jargon” are listed here.
%
This is an abbreviated version of the prompt you see in your terminal shell. The actual prompt looks something like physsunN%, where N is a number from one to nine. To save my aching hands from typing that string over and over again, I use a simple % in this document. This is also useful when looking at the various examples, and you’re wondering what you’re supposed to type. The answer is, any line beginning with a %. Any other lines are example output from the shell and should not be typed. They are the lines you should see after a command has been correctly inputted.
command -modifier argument
This is the basic format for any command that will be presented to you in this tutorial. There is only one small thing to note. The “command” part of this statement is static. It can’t be changed because UNIX will understand the instruction only if it is typed correctly. The argument, however, can be changed and can be thought of as a variable in algebra. Arguments can have restrictions such as just being an integer, just being a word, or anything else. I will note these where appropriate. The argument part will always be written in italics so there won’t be any confusion.
[Ctrl+d], [Up-arrow], [Enter]
Anything in brackets calls for the appropriate keys to be typed on the keyboard. In the above example, you would press the Ctrl key and the d key at the same time. An easy way to do simultaneous button presses is to hold down one button and hit the other. The brackets are to distinguish key strokes from output text, two things which are mixed together in the examples offered later.
The Good Stuff: Basic Shell Commands
(1) whoami
The question philosophers have been asking for ages. UNIX, however, offers a more practical answer. It will output the username you are currently logged on as, in case you forget.
% whoami
twang
(2) pwd
Print Working Directory. That’s easy enough to understand. This command will output the path of the directory in which you are currently located.
% pwd
/import/thoth/h4/twang
(3) ls -alr path
Will output a listing of files. Typing ls alone will output the contents of the current working directory, including subdirectories, in rows and columns. The modifiers, a, l, and r, alter the style in which the listing appears. The argument path will allow you to see the listing of directories other than the current working directory.
-a Will list all files, including hidden ones.
-l Will list the files one line at a time and also print relevant file information.
-r Will list all the files in reverse alphabetical order.
Modifiers can be combined in any combination after the minus (“-”). In UNIX, hidden files are denoted by a dot “.” preceding their filenames. If the path argument leads to a file instead of a directory, ls will simply output that one file with the appropriate modifiers in place.
In the examples that follow, note the string of ten characters at the beginning of each listing for ls -al. We will discuss this string in more detail later. For now, simply remember that a “d” at the beginning of this string specifies that the listing is a directory. A dash “-” signifies a file.
% pwd
/import/thoth/h4/twang
% ls
asi cool.ps tao.txt
% ls -a
. .. asi cool.ps tao.txt
% ls -al
drwxrwxr-x 6 twang physgs94 1536 July 15 2001 ./
drwxrwxr-x 2 twang physgs94 512 July 15 2001 ../
drwxrwxr-x 1 twang physgs94 256 July 13 2001 asi/
-rwxrw-r-x 1 twang physgs94 3072 July 12 2001 cool.ps
-rw-rw-r-- 1 twang physgs94 4332 July 14 2001 tao.txt
(4) file filename
Outputs the file type of any file specified as an argument. Recall the discussion on file extensions. What if a file didn’t have an extension? How would you be able to tell what type it was? Why, with this command of course!
% file tao.txt
tao.txt: ASCII text
% file cool.ps
cool.ps: PostScript document
(5) cd path
Change Directory. This will change the working directory to the directory specified in the path argument. The cd command offers a few shortcuts to reduce the time you spend typing. A single dot “.” represents the current directory. Double dots “..” represent the parent directory, the directory one level above the current one. And finally, a tilde “~” in the path argument brings you back to your home directory, the one that is the same as your username.
% pwd
/import/thoth/h4/twang
% cd asi
% pwd
/import/thoth/h4/twang/asi
% cd ../..
% pwd
/import/thoth/h4
% cd ~
% pwd
/import/thoth/h4/twang
It’s important to note that the cd command will look for new directories within the current directory. For example, if I was in the directory /import/thoth/h4/twang/asi and typed in cd twang, UNIX will try to find a twang subdirectory inside the asi directory instead of returning me to twang one level above. To get around this, use “..” or “~” to move up the file system tree.
(VI) More Commands Than You Can Shake a Stick At
To really be able to use this operating system for productive work, you’ll need to be familiar with the following additional commands.
Fun With Text Documents
(6) cat filename
cat dumps the contents of a certain file to the screen. This is fine if the file (usually a .txt file) is less than one screen page long. However, longer files will be dumped all at once, so parts of the file may be hidden from view by the limits of the window.
Look in your directory and you should find the subdirectory tutorial/texts. Change the working directory to tutorial/texts. Note the file riddle.txt contained therein.
% cat riddle.txt
At night they come without being fetched,
And by day they are lost without being stolen.
What are they?
%
cat > filename
cat >> filename
These are two ways in which the cat command can be used to create text documents instead of simply showing you their contents. The single greater than sign will tell cat to write data to a new file designated in the filename argument. If filename already exists, cat will overwrite any existing data. The exact data to write to the file will be given by you. After the command is inputted, the shell will leave the prompt and give you blank lines to type in your document. A [Ctrl+d] key press at the end will save the file you’ve typed and return you to the prompt.
For example, return to your home directory and create a file called Bright.txt and type in the following poem:
There was a young woman named Bright
Whose speed was much faster than light.
She set out one day
In a relative way,
And returned on the previous night.
When you’re done, hit [Ctrl+d] to exit. Your screen should look like this:
% cat > Bright.txt
There was a young woman named Bright
Whose speed was much faster than light.
She set out one day
In a relative way,
And returned on the previous night.
%
The double greater than signs “>>” tell cat to write data to the end of an existing file. All previous text in the file is saved and new data is added on. Data is typed in the same way as with cat >.
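For example, you could append a line of your own choosing (the attribution below is just made up for illustration) to the end of the Bright.txt file you created, again finishing with [Ctrl+d]:
% cat >> Bright.txt
-- author unknown
%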
(7) more filename
The more command is an alternative to the cat command. more filename will dump the contents of filename to the screen one page at a time so it’s ideal for longer files. Hitting [spacebar] will scroll down through the document one whole page at a time. [Enter] will scroll one line at a time.
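For example, to page through the tao.txt file sitting in your home directory:
% more tao.txt
Hit [spacebar] until you reach the end of the file, or type q to quit early.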
(8) tail -N filename
tail is another command in the vein of cat and more. This time, tail will output only a certain number of lines of text. Specifically, it will only show the last few lines. If tail is used without any modifiers, you will see the last ten lines. The -N modifier will change that number to any other N you specify.
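For example, to see just the last two lines of the Bright.txt file from earlier:
% tail -2 Bright.txt
In a relative way,
And returned on the previous night.
%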
(9) grep string filename
grep is perhaps the most specialized of this group of commands that outputs the content of files. Instead of outputting the whole file, grep will search the designated filename for a word that is specified in string. Each line containing that word is then displayed on screen.
For example, remember the file riddle.txt?
% grep stolen riddle.txt
And by day they are lost without being stolen.
%
File and Directory Management
(10) cp filename destination
cp copies the contents of a file exactly to a single destination. The original file is left intact and a new copy is created at the new location. If destination is a path, then a file with the exact same name as the original is created in the new directory. If destination is another filename, however, that file is created (or overwritten, if it already exists) with the contents of the original.
For example:
% cp cool.ps cold.ps
will create a new file cold.ps with the exact same content as cool.ps. While,
% cp cool.ps asi/
will create a new cool.ps inside the asi/ directory.
(11) mv filename destination
The mv command works in almost the same way as the cp command except that the original file is deleted after the copy process. In essence, the file is “moved” from one location to another. Note that the same rules that apply to cp also apply to mv. Thus, setting destination as another filename, in effect, renames the original file.
% mv cool.ps asi/
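And, for example, to rename the cold.ps copy you made with cp a moment ago (chilly.ps is just a name invented for this example):
% mv cold.ps chilly.ps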
(12) rm filename
This is the UNIX command to delete a file. Deleting a file is permanent and cannot be undone so be careful!
% rm asi/cool.ps
(13) mkdir directory
The command to create a new directory.
% mkdir images/
(14) rmdir directory
This is the delete command, only specially designed to remove directories. Note that in the UNIX shell, all files and subdirectories within a certain directory must be deleted before the operating system will allow you to delete the designated directory. This is simply a safety measure to prevent you from accidentally destroying a lot of data that might be important. It can get pretty annoying sometimes, though.
% rmdir images/
Archive Management
(15) compress filename
(16) uncompress filename
Compression is a systematic way of shortening file sizes by removing redundancies in the file. When file sizes get too big, compression can make storage or transfer over the Internet much easier. Compressed files are given a .Z extension in addition to their normal filenames and extensions.
% compress tao.txt
% ls
tao.txt.Z
In their compressed state, however, files cannot be normally read by other programs. The uncompress command restores files to their original configuration.
% uncompress tao.txt.Z
% ls
tao.txt
(17) tar -xvf filename
The tar command takes many files and combines them into a single file. This eases bookkeeping and file transfer because .tar files can be “un-tarred” at some later date to retrieve all of the original files. Whole directories can be stored in a single .tar file.
We will mostly be dealing with un-tarring files in this program. Tarring files is not that much harder. But to save space, I won’t put anything about it here.
To un-tar a file, the tar command requires three modifiers. They are listed below:
-x Extract archive.
-v Verbose listing of archive contents.
-f Use the next argument as the name of the archive file.
A little confused? Perhaps, but the simple thing to know is that -f and -x are required to extract an archive, while -v is just plain useful in seeing what files you are extracting, so you can just remember the tar command as: tar -xvf filename. Here’s an example using the w3browse-13986.tar file found in your tutorial/examples directory.
% tar -xvf w3browse-13986.tar
me/rates/a/a56910.lc.Z, 11127 bytes, 22 tape blocks
me/rates/b/b56910.lc.Z, 11417 bytes, 23 tape blocks
Securing
(18) chmod field1field2field3 filename
Remember the string of ten characters at the beginning of a file listing? If you don’t, go to any directory and run the ls -al command. You might see something like this:
% ls -al
drwxrwxr-x 6 twang physgs94 1536 July 15 2001 ./
drwxrwxr-x 2 twang physgs94 512 July 15 2001 ../
We’ve already established that the first character tells you whether the listing is a file or a directory. Dashes “-” are files; the letter “d” represents directories. The other nine characters tell who can or cannot do what with these files or directories. Recall also that Linux/UNIX was designed to run in shared environments. That means that the person next to you can access your directory (quite easily) and read all your files. Because this represents quite a security problem on large networks, UNIX offers you an easy way to modify the security of your files.
Each three-letter block in the nine-letter sequence after the first character is the setting for a different type of user. The first three letters are the settings for the user: you, the owner of the file. The second three are the settings for the group: other users who belong to the same group as you (physgs94 in the listings above). And finally, the last three are the settings for everyone else on the system, including anyone logged on remotely.
The letters in each three letter chain stand for a different type of action. They are:
r: Read (User is able to view the contents of the file)
w: Write (User is able to make edits to the file)
x: Execute (User is able to run the file if the file represents a program)
and always appear in the order, rwx. A dash in place of the letter in the sequence means that action is forbidden to the appropriate user. Let’s look at an example.
-rw-rw-r-- 1 twang physgs94 4332 July 14 2001 tao.txt
This is the listing for the file tao.txt. The current user is allowed to read and write to the file. Users in the group can do the same, whereas other users can only read it. No one can execute the file, since tao.txt is not a program.
To modify these user settings, the chmod command is used. chmod is a bit tricky so read carefully. The three arguments, field1, field2, and field3, are typed one after the other without any spaces to make up one long argument. field1 can take the following values:
u: User
g: Group
o: Other
in any combination and represents which type of user you wish to modify settings for. field2 can either take a plus “+” or a minus “-” depending on whether you want to give the selected users an allowed action (plus) or take it away (minus). field3 takes any combination of rwx discussed above and depends on which specific action you are modifying.
% chmod ug-w tao.txt
% ls -al
-r--r--r-- 1 twang physgs94 4332 July 14 2001 tao.txt
This command took away the ability to write to the file tao.txt from the main user and the group.
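To give that permission back, use a plus instead of a minus:
% chmod ug+w tao.txt
% ls -al
-rw-rw-r-- 1 twang physgs94 4332 July 14 2001 tao.txt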
Miscellaneous
(19) clear
Completely wipes all output from the current shell and returns you to a single prompt at the top of the window.
(20) history
Outputs the most recent commands you have executed in the current shell and the times at which you executed them.
% history
1 22:24 pwd
2 22:24 ls -al
3 22:25 cd asi
4 22:25 history
(21) lpr filename
Sends the contents of a file to the printer.
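For example, to print the tao.txt file (exactly which printer the job comes out of depends on how your system is set up, so check before sending anything large):
% lpr tao.txt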
(22) man command
man is short for manual and is the command to access Linux’s online user’s guide. This command will give you complete documentation on any system command specified in the command argument. This tutorial has offered only a cursory glance at most of the operating system’s commands. If you wish to see a more in-depth instruction manual, use the man command.
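For example, to read the full documentation on the ls command:
% man ls
Use [spacebar] to page through the manual and q to quit when you’re done.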
(23) which command
Will give the path of the executable file in the command argument if it exists. Helpful in determining if you have the correct software necessary to run certain operations.
% which lcurve
/usr/local/ftools/bin/lcurve
% which cd
cd: shell built-in command
(VII) Slightly More Advanced Topics
Combining
Many commands can be "combined" into a single command through the use of wildcards. In UNIX, a wildcard is represented by an asterisk *. The asterisk can stand for any combination or sequence of characters when used in a command. Its use is actually best shown through examples instead of explained.
% rm *.txt
This command deletes all files in the current directory with a .txt extension. Because the asterisk stands for any possible sequence of characters, the rm command simply looks for any filename with a .txt extension and deletes them.
A similar command is this:
% rm data*
This command will remove all files beginning with the word data that have any possible sequence of characters after it. Any command that requires an argument can take advantage of the wildcard in some way.
The second combining method is a simple way to type more than one command at each prompt. Its uses are limited, although it does save your pinky finger from hitting the [Enter] or [Return] keys too often.
This is done with a semi-colon between the two commands you wish to type on a single command line, as shown here:
% cd asi/;cat > astro.txt
The above command will switch the current working directory to asi/ and then immediately allow you to type in new data into the file astro.txt, which will be created in your new working directory.
Pipelines
Pipelines are in the format of:
% command1 | command2
and take the output of command1 and set it as the input of command2. The simplest way to understand pipelines is to look at this example, which again uses Bright.txt which you typed a while back.
% cat Bright.txt | grep night
And returned on the previous night.
command1 is cat Bright.txt. The output of this command is the five-line poem. This output is then sent to the second command, grep night. Recall that grep looks for some word, in this case night, in some file. That file is provided by the output of cat Bright.txt.
Now, this example isn't particularly useful since the above pipelined command is exactly the same as the less complicated:
% grep night Bright.txt
And returned on the previous night.
However, pipelines are the easiest way to do small "screen dumps," taking the output of commands on screen and saving them as a file. For example:
% ls -al | cat > directory.txt
% more directory.txt
total 12
drwxr-xr-x 11 milawren visitor 512 Jul 17 09:04 ./
drwx------ 17 milawren visitor 1024 Jul 17 08:54 ../
drwxr-xr-x 2 milawren visitor 512 May 18 2000 bin/
drwxr-xr-x 2 milawren visitor 512 Jun 26 2000 data/
drwxr-xr-x 2 milawren visitor 512 May 18 2000 emacs/
drwxr-xr-x 3 milawren visitor 512 Jul 16 14:40 examples/
drwxr-xr-x 5 milawren visitor 512 May 18 2000 ftools/
drwxr-xr-x 2 milawren visitor 512 May 18 2000 pix/
drwxr-xr-x 2 milawren visitor 512 May 18 2000 project/
drwxr-xr-x 2 milawren visitor 512 Jun 28 14:01 ps/
drwxr-xr-x 3 milawren visitor 512 Jun 28 13:58 texts/
-rw-r--r-- 1 milawren visitor 16 May 18 2000 zzz.ps
Cool, huh?
Other Shortcuts
There are two more time-saving measures you should know about. Command line completion allows you to hit [Tab] to have Linux automatically complete the filename you are currently typing into the prompt. Of course, Linux can complete a filename only so far. If you have two files such as data1.txt and data2.txt, typing in “d” at the prompt and hitting [Tab] will only complete the filename up to “data.” You must provide the distinguishing character.
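For instance, with the two hypothetical files above sitting in your directory:
% cat d[Tab]
would complete itself to
% cat data
and typing 1 or 2 followed by another [Tab] would finish the filename.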
Also, hitting the [Up-arrow] or [Down-arrow] will allow you to scroll through previously typed commands in order to easily execute a single command many times or edit previous commands without typing out a whole long line again.
HEASARC
Data Reduction Using the X-Ray Archive and ftools
downloading files from HEASARC
The High Energy Astrophysics Science Archive Research Center will be your most important source for x-ray astronomy data. Please familiarize yourself with the site, since it provides information about x-ray telescope missions past and present, satellite instruments, data imaging and analysis software among other things. It also has an extensive archive of data sets from these past missions, which we will use a lot. First, let’s become comfortable with downloading important files from the archive.
• Open Netscape.
• Go to the HEASARC web page at https://heasarc.gsfc.nasa.gov/.
• Click on the Archive Tab
Select the Browse option.
• Search for a specific object
In the box at the top labeled Object Name or Coordinates enter AM Her.
Under What missions and catalogs do you want to search? Find Past X-Ray Missions and click on the box to the left of EXOSAT.
Scroll down to Types of Information and only check the box for Archived Data and Observations
Scroll down and click on Start Search.
• Select an observation
Scroll down to the section labeled EXOSAT ME Spectra and Lightcurves (me).
Select the observation with sequence number 17 and a quality flag (qflag me) of 5.
Slightly below the table of observations there is a box asking Are you interested in data products? The different types of data that are collected in each observation are listed. Make sure that all checkboxes are selected and click on Preview and Retrieve.
On the next page titled Data Products Listing, you can download the observation files individually, examine pre-made GIF images, or choose to combine any number of data products into a tarfile, and then download them together.
Besides the GIF images, there are the spectrum and response matrix files (.pha and .rsp, respectively), which together will allow you to create energy spectra, and the lightcurve (.lc) files in band 1, 2, 3, broadband, and background varieties. Different types of lightcurves are distinguished by a file prefix (a, b, c, d, or r). This difference will be important to your research as well as later when we learn about concatenating a sequence of files.
Select files for download by clicking on the checkboxes to their left. Choose the single files under EXOSAT ME Spectra and EXOSAT ME Response Matrices and both files under EXOSAT ME Band 3 Lightcurves. Scroll down and click on TAR Selected Products.
• Download TAR files
The next screen that appears should list the files you chose as well as two links (FTP and HTTP) to the tarfile that contains the listed files. Click on either link.
Choose the directory in which you wish to put the tarfile. Make note of it and click on OK. Netscape will copy the tarfile to the directory specified. The tarfile is named in the form w3browse-#.tar, where # is usually a random string of 4 or 5 digits. This file is an archive and cannot be used until it is unpacked properly.
Open a new shell, go to the directory where you downloaded the tarfile and untar it by typing:
tar -xvf w3browse-#.tar
This command should unpack the files into subdirectories, normally me/spectra and me/rates/c.
• Organize and uncompress the files in your directory
You will find the files you downloaded from HEASARC in their new subdirectory by typing:
cd me/spectra or
cd me/rates/c
Then type ls to list the files. Notice that the filenames end with .Z. This means the files are still compressed and cannot be used yet.
Uncompress the files by typing:
uncompress *.Z
The * is a wildcard character so that your command above says “uncompress anything in this directory that ends with .Z”.
Then type ls again. The .Z should have disappeared from the filenames, making the files usable.
Remove the w3browse-#.tar file now that you have what you need by returning to the directory to which you downloaded it and deleting it with the rm command.
EXOSAT
Light Curves, Power Spectra, and Energy Spectra
The European Space Agency's X-ray Observatory, EXOSAT, was operational from May 1983 to April 1986. During that time, EXOSAT made 1780 observations of a wide variety of objects, including active galactic nuclei, stellar coronae, cataclysmic variables, white dwarfs, X-ray binaries, clusters of galaxies, and supernova remnants.
Further information can be found on the EXOSAT pages of the HEASARC web site.
light curves
The light curve will show the total energy received per second as time passes. This may help to establish which type of emitted radiation is prominent, the source’s luminosity, whether eclipses are present (indicative of a binary system), and possible periodicities within the system.
• At the prompt type:
lcurve
• You will be asked for the number of time series for this task. A time series is an observation contained within a single FITS file. lcurve lets you plot up to four time series together on the same plot. For now, we only want to plot one time series so type:
1
If you had hit [Enter] without typing any value, the value in the brackets (which is usually either the default value or the last value that was selected) would have been chosen by the program.
• Ser. 1 filename +options (or @file of filenames +options)
If you ran lcurve from the directory in which you saved the .lc files, you can simply type the filename here: c02468.lc . Otherwise, specify the relative path from where you ran lcurve.
• Name of the window file ('-' for default window)
After printing some basic information about the file such as the source’s coordinates and the start and stop time of the observation, the program will ask you for a window file. If any plots are going to be made, ftools requires a window file that contains various parameters for drawing and graphing. Normally, we do not need to worry about how to configure a window properly for plotting because the default window suffices for most tasks. Type - to specify the default window.
• Newbin Time or negative rebinning
Satellites conduct their observations in bins, periods of time in which photons collected from sources are counted up. The intuitive notion is that all bins are of length one second, which would make determining counts per second rather easy. However, for a number of reasons – mainly detector sensitivity and source strength – one second bins are not always reasonable. Instead, a satellite such as EXOSAT may use larger bin times in order to increase the accuracy of its observations.
As a researcher, there are other reasons for varying the bin time. Smaller bin times may reveal more patterns in the variation of the source at the cost of precision while larger bin times increase precision but may hide periodicities shorter than the bin time itself. Of course, the researcher is limited by the original bin time of the satellite.
Note the Minimum Newbin Time listed among the data that was printed out after the last command. This is the smallest subdivision of time allowable by this observation. It is also a good starting point for analyzing the light curve. Later, you may want to come back to this graph and bin it at different lengths to see what patterns they may reveal. For now, take the minimum.
• Number of Newbins/Interval
An interval, in ftools terminology, is the length over which an analysis is carried out. This length is measured in the number of bins. Different bin times will yield different numbers of bins per interval. Note the Maximum Newbin No. printed after the last command. If the entire observation in this file were split into bins of the size specified in the step above, there would be that many bins spread across one interval. Choosing a number half the Maximum Newbin No. will yield two graphs of roughly half the length of the observation.
Take the maximum number. It is always best to begin by looking at the entire observation and later narrowing the focus.
• Name of output file
The results of creating a light curve will be plotted to the screen and saved into a file with an extension of .flc. The output file may be used later for producing power spectra, as we will see in the next section. The default option here gives the output the same name as the input file only with the modified extension. Usually, this allows you to keep track of which light curves produced which output files.
• Do you want to plot your results?
Answer yes to see your light curve.
• Enter PGPLOT device
A device (program) is needed by ftools in order to draw the graph to the screen. This is usually /xw in Linux environments, though some may require /xterm. If one doesn’t seem to work, try the other, or type ? to see a list of legal devices. Remember, both are case-sensitive.
the PLT environment
Once the plot appears on the screen, the command prompt will change to PLT> which tells you that you are now dealing with PLT, an interactive plotting and fitting facility. Here, you are given access to a variety of commands that will allow you to manipulate the image you have in front of you.
Type exit to leave the PLT session at any time.
Type iplot to return to the PLT environment from, for example, the xspec environment.
• help
The help> prompt gives you access to the documentation for PLT. Here, typing the name of any PLT command (or ? for a list of commands) will give you the correct syntax as well as a brief description of what the command does. [Ctrl+d] will exit the help> prompt at any time and return you to PLT>.
• plot
Refreshes the graph on the screen and applies any changes you’ve made to its properties.
• marker Z on/off
Markers are the individual points labeled on the graph. Z is an integer between 1 and 16 that alters the style of the marker.
• marker size Z
In addition to changing the style of the markers, you can alter their size. In this case, Z represents a real number between 1.0 and 5.0.
• line on/off
Switches between a line graph and a scatter plot.
• error on/off
Toggles the error bars at each data point.
• rescale x xmin xmax and rescale y ymin ymax
Rescales the plot to zoom in on a part of your data set in the interval [xmin, xmax] or [ymin, ymax]. The entire observation is plotted by default but focusing on a particular portion may prove useful (especially later when we deal with power spectra). Typing rescale by itself will reset the plot to the original view.
One notable feature of the PLT environment is that you may use an abbreviation of a command, so long as it is the longest unique string that identifies the command. In this case, you can type r x xmin xmax to get the same results. Other abbreviations are available in help.
• label position string (You can use this to “name” your plot)
Place a label. The available positions include x, y, and top or ox, oy, and otop. The first set places string where you currently see labels for the x-axis, y-axis, and the title, while the positions preceded by an o place string below, to the side of, or on top of the current labels.
• hardcopy filename/ps
Dumps the image on the screen into a file in PostScript format. The resulting file can then be printed or viewed again later (with a program such as Ghostview). If a filename is not specified, hardcopy dumps the image into a file named pgplot.ps.
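For example, to save the current plot under a name of your own choosing (amher_lc.ps is just a name made up for illustration) and then leave PLT:
PLT> hardcopy amher_lc.ps/ps
PLT> exit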
concatenation
Recall that there were two band 3 light curves that you downloaded from the AM Her observation. Both .lc files (they should be c02468.lc and c02491.lc) are part of this same observation and are broken up for purposes of convenience. ftools gives you the option of joining these files and looking at a single longer light curve. This is called concatenation.
• Open the text editor of your choice and name the file something recognizable such as amher.txt.
% cat > amher.txt
• Type the files you want to concatenate into the file. These files MUST all have the same bin time AND be in chronological order of observation:
file1
file2
file3
…
• Run lcurve. This time, when the program asks you for a filename, input the text file you created (i.e. amher.txt) preceded by an @:
@amher.txt
• Specify the rest of the information as you would normally. The concatenated light curve will be plotted, you will be given the option to customize the plot at the PLT> prompt as usual, and a .flc output file will be created for the entire concatenated light curve.
power spectrum
The .flc files resulting from the created light curves contain enough information for ftools to determine a power spectrum for a time series.
The power spectrum is a plot of power against frequency. It can be used to show prominent mechanical frequencies within the system (not the frequencies of radiation emitted). These frequencies may be used, over time, to establish (via the Doppler Effect) how fast a source is moving relative to us.
• At the prompt type:
powspec
• Ser. 1 filename +options (or @file of filenames +options)
Input a .flc file. The program will use it to determine a power spectrum for the time series (i.e. the corresponding .lc file) that created the .flc file.
Note that powspec, like lcurve, gives you the option to input an @file. You may concatenate power spectra as well by editing a text file with the names of .flc files, one per line.
Unlike light curves, which can only be added, there are two ways to concatenate power spectra.
1) You can add the data sets together to increase the amount of signal. This may or may not also increase the noise levels. If the noise levels are due to some systematic error, then the noise level will also increase. If the noise is due to random error, then the noise may have a much lower level than the signal. If the noise levels are tolerable, then you will get a much better idea of the data set’s characteristics.
2) There are situations where the light curve you’re studying has so much noise that the peaks in the power spectrum that are genuine signal are hard to distinguish from those due to random noise because they are about the same height. One way to fix this problem is to combine the data sets while averaging different times. The resulting power spectrum is the average of the power spectra from each data set. A real signal will likely appear in all sets, and so the average for that signal will be strong. Random errors will appear in some, but not all data sets, and will have different frequencies in each, so the average at any one frequency will be weak. Therefore, the signal will look significantly stronger relative to the noise after concatenation than it would before. You will specify which method you would like to use in a moment.
• Name of the window file ('-' for default window)
As with lcurve.
• Newbin Time or negative rebinning
As with lcurve. Note that the Minimum Newbin Time for the power spectrum may be different than it was for the light curve depending on what bin time you chose for the light curve that produced the .flc file you’re working with.
• Number of Newbins/Interval
As with lcurve.
• Number of Intervals/Frame
A frame consists of the average of the results of the analysis of one or more intervals. Here is where you can specify which concatenation method (detailed above) you would like to use.
1) Choose 1 (usually the default) if you’d like a true concatenation (adding) of the data. This is used to increase the amount of data (which may include both signal and noise) so that you can more easily visualize the data.
2) Choose N where N is the number of files in the data set. This concatenates by averaging.
• Rebin results?
Choose 0 (zero) so you do not rebin the results into different time segments.
• Name of output file
As with lcurve, only this time the default output file has an extension of .fps.
• Do you want to plot your results?
Choose yes.
• Enter PGPLOT device
As with lcurve.
• The power spectrum will be displayed and you will be given the PLT> prompt. You may want to rescale the x-axis to get a better look at the most prominent spikes in the plot.
fdump
You may have noticed that the various files we’ve been working with are not in ASCII format. If you have tried to view them with a text editor such as emacs, or attempted to output their contents to the screen via more or cat, you will have seen that part of the file can be read but the rest (the important information concerning data points) is a jumbled mess.
That’s where the fdump tool comes in. It will allow you to view the guts of a FITS file, specifically the actual numbers used to plot the graph. For example, in the power spectrum you created in the previous section, there probably was evidence of one or two extremely prominent spikes. You can keep scaling the x-axis down further and further to get a better look at it, but you’ll have little idea of the exact coordinates of the spike. That data point is stored in the file and can be accessed.
• At the prompt type: fdump
• Name of FITS file and [ext#]
To view the data points for a power spectrum, give fdump the name of the output file from powspec (usually a .fps file). Similarly, .flc files give information about light curves.
• Name of optional output file
The contents of the FITS file will be dumped, in ASCII, to another file you specify. This can be the standard output (STDOUT, usually your monitor) or, if you would like to save the information, a text file.
• Names of columns
The tables stored in the FITS file contain columns, some of them interesting, some of them not. You may choose to limit which columns are dumped by typing the names of the columns, separating different columns with a space. For .fps files, you can choose to show only the frequency and power columns by typing:
FREQUENCY POWER
Of course, you can always leave this blank to show all columns.
• Lists of rows
You may also choose to limit the number of rows shown by giving an interval of two numbers separated by a hyphen, such as: 25-100
This will show only rows 25 to 100. Leaving the upper bound of the interval blank will tell fdump to output all the rows above a certain number. For example:
100- will show all the rows after the hundredth, and:
-500 will show the first 500 rows.
From this, you can see why - will reveal all rows.
• The information will be sent to the screen (if you selected STDOUT) or the file you chose. At the beginning, you will see some header information such as the date the file was created and the program (probably powspec) that created it. Scroll down further and you’ll see the data columns.
efsearch and efold
The two programs, efsearch and efold, introduce the concept of light curve folding. Assume you have a light curve with a definite periodicity determined by some property of the source and random fluctuations due to background radiation, interference, etc. If you divide an observation of the source into sections of length equal to some period and then take the average of all the different sections, the fluctuations should cancel each other out (because they are random and their values over a sufficiently long amount of time should negate each other). What you would be left with is a smooth(er) plot of the source’s period.
efsearch takes input and tries to find the period that will produce the best folded result by calculating the χ² (chi-squared) of each period it tests. The χ² is a statistical measure of how close a given data set comes to the expected values. You can always look up information about how exactly efsearch calculates the χ², but for now, it is enough to know that the higher the number is, the better the fit.
From the results of efsearch, the program efold will actually fold a given light curve and produce a plot of a single period of the source.
• efsearch requires from you an appropriate “guess” at the actual period of the source. How can we come up with this guess? We can always eyeball the light curve that we’re looking at or we can look at the most prominent spike in our power spectrum. Recall the power spectrum we just created in the previous section. fdump its contents and search for the highest power and its corresponding frequency. A quick look gives us this line:
FREQUENCY (Hz) POWER
4 9.765625000000001E-05 3.0029425E+02
Now that we have the frequency, it is a simple matter to calculate the period. Taking the reciprocal we come up with a period of 10240s. That will be our guess.
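Explicitly, the period is just the reciprocal of the frequency:
P = 1/f = 1 / (9.765625 × 10^-5 Hz) = 10240 s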
• At the prompt type:
efsearch
• Ser. 1 filename +options (or @file of filenames +options)
Enter a .flc file.
• Name of the window file ('-' for default window)
Use the default window.
• Epoch
The epoch is the start time, where you want the program to begin searching for a good period. The format is in days, and by default it is set to the beginning of the observation. Normally, this is a good place to start unless there is something in the observation you want to avoid (such as eclipses or satellite drop outs). Since we aren’t dealing with any of these now, take the default:
INDEF
• Period
Input your best guess for the period in seconds.
10240
• Phasebins/Period {value or neg. power of 2}
Instead of asking you for a bin size, efsearch wants to know how many bins each period should be split into. For this observation, we know that a 10 second bin time is the absolute minimum. Since our guess for the period is roughly 10,000 seconds, a good number of bins to choose would be:
1000 In general, enter (period ÷ bin time).
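For this observation, that works out to 10240 s ÷ 10 s = 1024, which we round to an even 1000.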
• Number of Newbins/Interval
Input the maximum value. This number will roughly be an integer multiple of the number of bins per period you specified in the step before.
• Resolution for period search {value or neg. power of 2}
The resolution is the spacing between consecutive periods that the program will check. For example, if you select a resolution of 10 seconds with a period of 100 seconds, efsearch will check the periods of …80, 90, 100, 110, 120… etc.
Experiment with the sizes for the resolution. Low values have the tendency to return a bunch of periods that are all good choices. Higher values will probably miss some of the more accurate results. Also, keep in mind the length of the observation. The length of this one is only on the order of 10^4 seconds, the same order of magnitude as our guess period, so only two or three periods can fit into a single light curve. Sometimes, if you run into the case where the period is close to the length of the observation, it may be a good idea to concatenate several light curves (see above) to produce a longer data series.
A good resolution for this period is:
100 In general, use a value that is about 1% - 10% of the period.
• Number of periods to search
The number of periods, in increments and decrements of the resolution, to search in each direction from the guessed period. Try different values here to see how they are represented on the plot.
• Name of output file
Do you want to plot your results?
Enter PGPLOT device
We’ve seen these before. Handle them as you did with lcurve and powspec.
• The plot shown will be of χ² values against possible periods. The best period along with the resolution will be printed at the top of the graph. Watch out, however, for harmonic periods, integer multiples of the best period. For example, in this graph, depending on the resolution and the number of periods you chose, you may see two peaks, one around 10,000 seconds and one at 20,000 seconds. As an exercise for yourself, think of why these two peaks should appear.
Now that you have an even better estimate for the period from efsearch, you can now use efold to fold the light curve and see a close-up view of the period. This may reveal details in the light curve that repeat each period but are too weak to be noticed in the original plot.
• At the prompt type:
efold
• Number of time series for this task
Ser. 1 filename +options (or @file of filenames +options)
Name of the window file ('-' for default window)
The first few options to set for efold are identical to what you’ve seen in lcurve. This time, the input file is a .flc file, as it was in efsearch.
• Epoch
Period
Phasebins/Period {value or neg. power of 2}
Number of Newbins/Interval
As with efsearch. The period this time will be used to fold the light curve.
• Number of Intervals/Frame
As with lcurve, set it to 1 .
• Name of output file
Do you want to plot your results?
Enter PGPLOT device
As with lcurve, powspec, and efsearch.
energy spectra
The last type of plot we’ll deal with is the energy spectrum, a plot of the intensity of the radiation received against photon energy (or wavelength). An energy spectrum contains vital data on the photons that we have collected from our source – and therefore the photons produced in our source. You will see that by analyzing and fitting this spectrum to theoretical curves you can find the temperature of the source, its luminosity, and information about its chemical composition as well as its distance.
The program xspec allows us to make these kinds of plots with the .pha and .rsp files we downloaded from HEASARC.
• For this exercise, go to HEASARC and find EXOSAT observations on the source GK Per. Under the EXOSAT ME Spectra and Lightcurves observations, find the one with a quality flag of 4. After clicking on Preview and Retrieve, select the boxes for Spectra and Response Matrices to download. Untar and organize the files. They should be s05865.pha and s05865.rsp.
• Type xspec to call up the package.
• The first thing to do is enter the data into the program. Type
data s05865.pha .
• Command the program to ignore glaringly bad data points that may be problematic to a good fit by typing ignore bad .
• Next, you should tell the program which plotting device to use to plot your spectrum. Type cpd /xw .
• To plot your spectrum type plot data .
If you type plot ? you will get a list of all the things besides your data that you can plot.
It is a good idea, after every data-manipulating command that you enter, to plot data again so that you can see what you have done.
• Notice that around channel 60 the signal-to-noise ratio becomes small, so it is a good idea to remove some more potentially problematic data points. Use the ignore command to filter out the undesired channels.
Type ignore 60-** , which tells xspec to ignore channels 60 and above. The same result can be achieved with the command ignore 60-100 .
• Now we’ll convert the x-axis of the spectrum into units of energy (keV), the scale on which we will fit the models. Type setplot energy and then plot data .
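Before moving on to the fitting, note that xspec will also read these same commands from a plain text file, which makes it painless to rebuild the session if you make a mistake. A minimal command file containing nothing but the steps above might look like the sketch below; save it under any name you like (we will pretend it is gkper_setup.xcm) and run it from the xspec prompt with @gkper_setup.xcm .

   # gkper_setup.xcm: the data-preparation steps from this section.
   # Load the EXOSAT ME spectrum of GK Per:
   data s05865.pha
   # Drop the channels flagged as bad, plus the noisy channels from 60 up:
   ignore bad
   ignore 60-**
   # Plot to an X-window PGPLOT device, with the x-axis in keV:
   cpd /xw
   setplot energy
   plot data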
Fitting the data to spectral models
We will attempt to fit our data to three spectral models:
1. Powerlaw spectrum.
This model represents a spectrum that arises from a source whose radiating electrons are moving within a magnetic field and therefore experience a force perpendicular to their motion. This force accelerates the electrons and causes them to radiate in a characteristic way. The spectrum produced by this effect has the mathematical form of a power law, where the intensity of the radiation is proportional to the energy raised to a power:
I(E) = A E^{-α}
where α is the spectral index and A is a normalization constant.
2. Blackbody spectrum.
A blackbody spectrum is one that is only dependent on the temperature of the radiating source. This model assumes that the radiating photons get their energy solely from the temperature of the object.
I(E) = (2E^3 / h^2 c^2) (e^{E/kT} - 1)^{-1}
where h is Planck's constant, c is the speed of light, k is Boltzmann's constant, and T is the temperature of the source.
3. Thermal Bremsstrahlung.
This radiation spectrum is caused by the thermal motions of electrons in gases hot enough to be ionized. The charges of the ions exert electromagnetic forces on the electrons that 'bend' their motion and cause them to radiate. The distribution of the energy emitted by this 'bending' depends on the densities of both the electrons and the ions as well as on the temperature of the gas, and it has the form:
A(E) = C G(E,T) Z^2 n_e n_i (kT)^{-1/2} e^{-E/kT}
where C is a constant, G(E,T) is a function that varies with temperature and energy, Z is the charge of the positive ions in the gas, n_e and n_i are the electron and positive-ion densities, respectively, and T is the temperature.
The Powerlaw Spectrum model.
Type model phabs(powerlaw) , or for the abbreviated command (xspec accepts truncated commands as long as they are unambiguous) type
mo pha(po) .
You will receive a list of the parameters that this particular model requires. These start out at default values; the fit command will adjust them to match our data later.
We will accept the default values by pressing <Enter> three times.
The output that follows includes a χ² and a reduced χ² (the original χ² divided by the number of degrees of freedom). If the reduced χ² is near one, the fit should be good. The null hypothesis probability is the probability of getting a value of χ² as large as or larger than the one observed if the model is correct. If this probability is small, then the model is not a good fit.
• plot data
Your fit should be very bad. This is because the model is constructed using the initial default values, which do not correspond to your data. Notice that the null hypothesis probability is 0 and that the reduced χ² is in the tens of thousands. Both of these facts indicate a poor fit.
• You can help the fit by renormalizing: this scales the model by multiplying the entire curve by a single factor determined from your data. Just type renorm and then plot data to see the result. Notice the dramatic reduction in χ² (though it is still not close to 1).
• We will now fit our data. The fit command will attempt to match the parameters of the model to the data. The algorithm goes through ten iterations at a time, attempting to find a minimum χ², and prints out the results of the fit.
Type fit and after ten iterations, when prompted to continue fitting type y .
• When the fitting is done plot data to see the results.
The model does converge toward the data, though not exactly. To see where the model departs from the data, type plot data residuals . At the bottom of the screen the model is represented by the straight line, and the data oscillate above and below it. Only toward the higher energies do the residuals shrink and stop oscillating.
The results in our output show a reduced χ² of around 8. The null hypothesis probability is zero. These results are consistent with the plot, since most of the data points do not intersect the fitted model. This indicates that our model does not describe this source very well.
In the table (your results may not exactly match but should be close):
---------------------------------------------------------------------------
---------------------------------------------------------------------------
mo = phabs[1]( powerlaw[2] )
Model Fit Model   Component  Parameter   Unit     Value
par   par comp
  1    1    1     phabs      nH          10^22    23.21        +/-  0.6166
  2    2    2     powerlaw   PhoIndex             1.106        +/-  0.3817E-01
  3    3    2     powerlaw   norm                 3.2135E-02   +/-  0.2802E-02
---------------------------------------------------------------------------
---------------------------------------------------------------------------
You should make a note of the parameter values that achieved the best fit. This output claims that the column density (nH) for this source is about 2.32 x 10^23 cm^-2 and that the photon index (PhoIndex) is 1.106.
• Type flux . You will receive an output showing the flux of the star in photons/cm^2/s and in ergs/cm^2/s. Flux is what we’ve referred to as the apparent brightness: once you know the flux and the distance to the source, you can determine its luminosity. Make a note of the flux to see how it changes with the chosen model.
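To summarize, the whole powerlaw analysis of this subsection boils down to the short sequence below. It is a recap rather than a canned script (the # lines are just our annotations): the model command will still prompt for its three parameters, which you accept by pressing <Enter>, and fit will still ask whether to continue.

   # Define an absorbed powerlaw; accept the three defaults when prompted.
   mo pha(po)
   # Scale the model normalization to the data, then fit.
   renorm
   fit
   # Compare model and data, then see where the model misses.
   plot data
   plot data residuals
   # Record the flux for comparison with the other models.
   flux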
The Blackbody Spectrum model.
Type model phabs(bbody) (or mo pha(bb) ) and again accept the default parameters by pressing <Enter> three times.
• plot data
Again this is not a good fit, and χ² is around 50 million!
• renorm and plot data to scale down the model.
• Now fit the model to the data with fit . Then
plot data residuals to see the deviation from the curve.
The reduced χ² seems to be about the same, implying that the fit has not improved. It seems to be the peak of the curve that will not conform to either model. What is useful from this model is that we can extract a temperature (in units of keV) from the output. With a kT of 3.081 keV (or close to that), the temperature is around 3.6 x 10^7 Kelvin
(with k = 8.6 x 10^-8 keV K^-1).
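If you want to check that conversion yourself, it is a one-line calculation; for example, from the shell (the kT of 3.081 keV is the value from our fit, and yours may differ slightly):

   # T = kT / k, with kT in keV and k = 8.6 x 10^-8 keV per Kelvin
   awk 'BEGIN { kT = 3.081; k = 8.6e-8; printf "T = %.2e K\n", kT/k }'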
The column density from the blackbody analysis is smaller, ≈ 1.34 x 10^23 cm^-2, but it is of the same order of magnitude.
• Measure the flux here. Type flux . Was there a change?
The Thermal Bremsstrahlung radiation model.
• Type mo pha(br) and accept the initial parameters as before.
• plot data
• renorm and plot data again.
• fit until the fitting is complete. Then
• plot data residuals < Enter >.
The results give a similar reduced χ² ≈ 8.5 and a column density of 2.55 x 10^23 cm^-2, which is consistent in order of magnitude with the other two models. From this we can conclude that the column density for GK Per is on the order of 10^23 cm^-2.
• Type flux to measure the flux one last time. It seems that the flux measurement is consistent across the three models. You can go ahead and calculate the luminosity (if you know how far away the source is).
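As an illustration of that last step, the luminosity follows from L = 4 π d^2 F once you adopt a distance d. The numbers below are placeholders only: substitute the ergs/cm^2/s flux that your own flux command reported, and whatever distance you adopt for the source (the 500 pc used here is purely an example value, not a measured distance).

   # L = 4 * pi * d^2 * F, with everything converted to cgs units.
   awk 'BEGIN {
       pc   = 3.086e18        # centimetres per parsec
       flux = 1.0e-10         # placeholder flux in erg/cm^2/s
       d    = 500 * pc        # placeholder distance of 500 pc
       printf "L = %.2e erg/s\n", 4 * 3.14159265 * d * d * flux
   }'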
Review Activity for EXOSAT Data
Repeat everything you’ve done for both AM Her and GK Per for the object Cyg X-1. Use sequence number 894 under EXOSAT ME Spectra and Lightcurves.
Select the Band 1, Band 2, Band 3, Spectra, and Response Matrices files to download. Get hardcopies of all of your plots.
-----------------------
Please Note: This section, and the following section (efsearch & efold), can be skipped when doing the tutorial. You can proceed directly to the energy spectra section.
For EXOSAT data, the chronological order is the same as the numerical order of the .lc files.