Revisiting IC 417 on the ES 102mm

IC415.jpg

I decided to revisit IC 417 but this time with a wider field of view. The image was previously taken with my AT6RC, and it focused primarily on IC 417. But this time I was able to frame it such that I also got the open cluster NGC 1907, as well as the smaller nebula below IC 417, NGC 1931. I’m really happy with how it turned out.

Imaging details

Imaging telescope or lens: Explore Scientific ED102 FCD-100 CF

Imaging camera: ZWO ASI1600MM-Cool

Mount: Celestron CGX

Guiding telescope or lens: Stellarvue F050G

Guiding camera: ZWO ASI290MM Mini

Focal reducer: Stellarvue SFFR102-2

Software: KStars/EKOS, Astro Pixel Processor, PixInsight 1.8 Ripley

Accessories: MoonLite high-res stepper motor and Mini-V2 controller, MoonLite CF 2" Focuser

Resolution: 4432x3235

Dates: Jan. 4, 2019, Jan. 5, 2019

Frames:
Astrodon Tru-Balance H-a 5nm: 125x180" (gain: 200.00) -15C bin 1x1
Astrodon Tru-Balance OIII 5nm: 112x180" (gain: 200.00) -15C bin 1x1
Astrodon Tru-Balance SII 5nm: 80x180" (gain: 200.00) -15C bin 1x1

Integration: 15.8 hours

Darks: ~50

Flats: ~50

Bias: ~50

Avg. Moon age: 28.56 days

Avg. Moon phase: 1.31%

Bortle Dark-Sky Scale: 6.00

Astrometry.net job: 2454319

RA center: 82.351 degrees

DEC center: 34.736 degrees

Pixel scale: 1.380 arcsec/pixel

Orientation: 92.987 degrees

Field radius: 1.051 degrees
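As a sanity check, the plate-solve numbers above are self-consistent. Here's a quick sketch; the 3.8 µm pixel size of the ASI1600 and the ~571 mm reduced focal length are my assumptions for illustration, not values from the solve:

```python
import math

def pixel_scale(pixel_um, focal_mm):
    # arcsec/pixel = 206.265 * pixel size (um) / focal length (mm)
    return 206.265 * pixel_um / focal_mm

def field_radius_deg(width_px, height_px, scale_arcsec):
    # half the sensor diagonal, converted from arcseconds to degrees
    return math.hypot(width_px, height_px) / 2 * scale_arcsec / 3600.0

# Assumed: ASI1600 pixels are 3.8 um; the SFFR102-2 reducer brings the
# ED102 to roughly 571 mm of focal length.
print(round(pixel_scale(3.8, 571.2), 2))              # ~1.37, close to the solved 1.380
print(round(field_radius_deg(4432, 3235, 1.380), 2))  # ~1.05, matching the solved radius
```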

Locations: Home Observatory, Pearland, Texas, United States

Data source: Backyard

Narrowband Imaging IC410 and IC417

IC410 imaged on the Explore Scientific 102mm FCD100 APO refractor.


Over the last two weeks, I’ve had three imaging nights. KStars/EKOS 3.0 was released, fixing a ton of long-standing issues with the scheduler. On top of that nice software update, I got a Celestron CGX for Christmas! With those two things combined, I set my sights on the only northern region available from my back yard and imaged IC 417 on my AT6RC and IC 410 on my Explore Scientific 102mm FCD100 scope.

Here’s a recent photo of the setup.

IMG_0340.jpg

So far it’s worked great. Average RMS has been between 0.6 and 0.8, while my AVX hovered between 0.8 and 2.0. I think I can tune the CGX guiding to get those numbers even lower, but I haven’t attempted any adjustments yet; these are the numbers I’ve been getting without changing any of the default guide settings.

IC 417 imaged on the AT6RC from Astro-tech.


KStars/EKOS 3.0 released with new features

EKOS-7.jpg

The team behind KStars and EKOS has been busy wrapping up a new version of its imaging software just in time for the holidays. There are a lot of new features in this one.

The first major feature is the XPlanet solar system viewer developed by Robert Lancaster. It’s a significant upgrade over the built-in viewer.

Robert also created a new interface for the FITS viewer, which can now show you all the data of your images in a new side panel featuring the FITS header info, histogram, statistics, and recent images.

EKOS-9.jpg

Additionally, Eric Dejouhanet dedicated time to a huge scheduler rewrite. The scheduler previously allowed scenarios where operations could conflict; the rewrite fixes this and adds numerous improvements:

  • Dark sky, which schedules a job to the next astronomical dusk/dawn interval.

  • Minimal altitude, which schedules a job up to 24 hours away to the next date and time its target is high enough in the sky.

  • Moon separation, combined with altitude constraint, which allows a job to schedule if its target is far enough from the Moon.

  • Fixed startup date and time, which schedules a job at a specific date and time.

  • Culmination offset, which schedules a job to start up to 24 hours away to the next date and time its target is at culmination, adjusted by an offset.

  • Amount of repetitions, eventually infinite, which allows a job imaging procedure to repeat multiple times or indefinitely.

  • Fixed completion date and time, which terminates a job at a specific date and time.

EKOS-6.jpg

A few other enhancements include a new scripting and DBus system that lets third-party applications take advantage of and control EKOS features, which will open up the system for more options down the road.

Other improvements and new features can be found on the website of Jasem, the lead developer.

Here’s a few more screens of the rest of the updated interface panels.

13 Panel Mosaic of the Moon

Moon-13-panel.jpg

Yesterday I had a few hours of clear sky, so I got out my AVX along with my Celestron C5 and ZWO ASI224MC camera, determined to get a few shots of the moon. I initially tried a 2.5x Powermate, but the seeing just wasn’t there, so I ended up shooting everything at the prime focal length of 1250mm.

I used Planetary Imager on the Mac to capture everything, then merged it all together in Photoshop using its Photomerge feature.

Below you can see the individual frames that make up each part of the mosaic.


HD 14771 and Galactic Friends

NGC_891-v3.jpg

I managed to produce a neat image over the last two nights. I centered the star HD 14771 to place NGC 891 in the lower right and a galaxy cluster in the upper left. After two nights of imaging, I had around 13 hours of data, which I put together today. It turns out that cluster accounts for only a few of the galaxies in this image; in all, there are 79.

Captured in EKOS/KStars

Integrated with slight processing in Astro Pixel Processor

Completed processing in PixInsight and Photoshop

Overlay done in Observatory


Equipment used and shown in photo:
Imaging telescope or lens: Astro-Tech AT6RC
Imaging camera: ZWO ASI1600MM-Cool
Mount: Celestron Advanced VX
Guiding telescope or lens: Orion 60mm Guide Scope
Guiding camera: ZWO ASI224MC
Focal reducer: Astro-Physics CCDT67
Software: Astro Pixel Processor
Filters: Astrodon Tru-Balance Blue E-Series Gen 2 31mm, Astrodon Tru-Balance Green E-Series Gen 2 31mm, Astrodon Tru-Balance Red E-Series Gen 2 31mm, Astrodon Tru-Balance Luminance E-Series Gen 2 31mm
Accessory: MoonLite CSL 2.5" Focuser with High Res Stepper Motor

In this overlay, done in Observatory on the Mac, you can see all 79 galaxies highlighted.


Here’s the scope that took the image. The AT6RC with CCDT67 reducer and ZWO ASI1600MM-C camera.


iObserve gets a new release, now with Mojave Dark Mode support

Screen Shot 2018-11-20 at 10.06.53 PM.png

A little over a year ago, iObserve saw its last update. The developer, Cedric Follmi, had put the Mac iObserve application on hold to devote time to an online-only web version over at arcsecond.io. But after a year or so of development effort on the website, he put up a poll asking users what path they would like to see going forward: continue the website, update the Mac app for Mojave compatibility, or build an even better Mac app longer term. People voted, and now there’s a new Mac application.

What’s new in iObserve 1.7.0?

  • Added full support for macOS 10.14 Mojave with a complete update of the app internals (especially about network requests and dates).

  • Dropped support for all macOS versions before High Sierra (10.13).

  • Mojave Dark Mode

  • Suppressed the large title bar to adopt a more modern and compact look.

  • Removed the ability to submit new observatories by email; Arcsecond.io is the new home for observatories.

  • Fixed the failing downloads of the sky preview image (available when clicking the icon to the right of the object name in the right-hand pane).

  • Fixed an issue that prevented the app from completing the import of a Small Body.

  • Fixed an issue that prevented the user from selecting a Small Body in the list when multiple ones are found for a given name.

  • Fixed the failing downloads of 2MASS finding charts.

  • Fixed various stability issues.

Get the latest version directly from the Mac App Store.

Screen Shot 2018-11-20 at 10.06.33 PM.png

How to take easy flats using an inexpensive light source.

IMG_0309.jpg

Here’s my setup at 5:30 this morning. Taking good flats is key, and I had been using the dawn sky to shoot them for some time. EKOS has a feature that will shoot flats at any desired ADU value; I’ve found that a median ADU of 22,000 is perfect for my setup. I arrived at that value through trial and error, taking flats at different ADU values and then calibrating with them to see the results. Anything above 24,000 overcorrected and anything below 20,000 undercorrected, so I’m right in the middle now.

I recently discovered this really awesome and inexpensive light source for flats. It’s worked like a charm.

A3 Light Box by AGPtek - currently $47.99

First off, A3 is large enough to cover the front of most large scopes. At 11.69” x 16.53”, it’s a flat, evenly lit LED panel with three built-in brightness settings. It can be powered by the AC adapter it comes with, or over USB from your laptop.

In the photo above I have it plugged into the laptop, and am taking my flats through EKOS. This makes capturing flats quick and easy.

Within EKOS, I build a camera sequence for all my filters: 50 images each, auto-exposure set to an ADU value of 22,000. Then I run the sequence. Within seconds it measures the light from the frame and knocks out 50 flats, then switches filters, measures again, and bangs out another 50. In about two to five minutes I can capture all my flats in one go.
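EKOS handles the exposure search internally; the idea behind hitting a target ADU can be sketched like this (the 300 ADU bias floor is a made-up number for illustration, and this is not EKOS’s actual code):

```python
def next_exposure(exposure_s, measured_adu, target_adu=22000, bias_adu=300):
    """Flat-panel signal scales roughly linearly with exposure time, so
    scale the exposure by the ratio of desired to measured signal above
    the bias floor."""
    return exposure_s * (target_adu - bias_adu) / max(measured_adu - bias_adu, 1)

# A 1 s test flat measuring a median of 35,000 ADU is too bright,
# so the next try is shortened toward the 22,000 ADU target:
print(round(next_exposure(1.0, 35000), 2))  # 0.63
```

A couple of iterations of this converge on an exposure that lands right at the target median.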

Below are the two sequences I captured for the evening (Double Cluster, and M33). While short at under 2 hours each, you can see that they are clean and well calibrated thanks to the easy flats system I’ve been using.

double cluster.jpg
m33.jpg

Pacman Nebula captured with EKOS, and processed in PixInsight.

NGC281.jpg

I managed to get a few clear nights last week. Just two, in fact. So, knowing I had only two nights to image, I focused on getting another bi-color nebula image. My earlier Pacman Nebula attempt was at f/9 on my RC with a one-shot color camera, and it didn’t quite capture the detail I achieved in this image.

This one was imaged on my Explore Scientific 102mm FCD-100 scope. I captured about 8 hours per filter (Astrodon 5nm Ha and OIII) with the ZWO ASI1600MM-C camera, using the EKOS imaging platform on the Mac.

I did the initial image integration and calibration with Astro Pixel Processor, then took each filter’s final image (Ha and OIII) into PixInsight for final processing. There, I processed each image to maximize the brightness and contrast of the nebula without destroying the stars. I tried a PixelMath combination into RGB with the goal of keeping the nebula as natural looking as possible with good star colors. I find the following combination works well: R = Ha, G = OIII×0.6 + Ha×0.4, and B = OIII. I think the final image turned out great.
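In NumPy terms, the R = Ha, G = OIII×0.6 + Ha×0.4, B = OIII mapping looks like this (a sketch of the same idea, not PixInsight’s PixelMath engine):

```python
import numpy as np

def bicolor_combine(ha, oiii):
    """Map two narrowband channels to RGB: R = Ha, G = 0.6*OIII + 0.4*Ha, B = OIII."""
    ha = np.clip(ha, 0.0, 1.0)
    oiii = np.clip(oiii, 0.0, 1.0)
    return np.stack([ha, 0.6 * oiii + 0.4 * ha, oiii], axis=-1)

ha = np.array([[0.8]])    # a bright Ha pixel
oiii = np.array([[0.3]])  # with weaker OIII
print(bicolor_combine(ha, oiii)[0, 0])  # [0.8 0.5 0.3]
```

Blending some Ha into green keeps the reds from looking flat while the OIII-dominated blue/green preserves the teal regions.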

UPDATE: I have now added some SII data for a new palette based on the Hubble coloring. This new image totals 24 hours of data.

NGC281.jpg

Tutorial for Astro Pixel Processor to calibrate and process a bi-color astronomy image of the Veil Nebula

The final result of processing the Veil Nebula in Astro Pixel Processor on the Mac.


To get started, you'll need to have taken a full set of light images to process. In addition you will need darks, flats, and bias for calibration of those light images. The calibration process is going to remove any artifacts caused by dead pixels in your camera, and correct for lens dust and uneven illumination caused by your image train. Starizona has a great page detailing why you would do image calibration.

If you don't have Astro Pixel Processor, you can download an unlimited 30-day trial at the website. If you like the process, and find the program easy to use, it's fairly affordable in comparison to some of the other tools out there.

Loading your images into Astro Pixel Processor

APP-01.jpg

When you first open the program, you'll be asked to choose a working directory. I typically make a folder on my desktop called Processing, and put all my images neatly organized into folders within. I label them Light, Darks, Bias, and Flats. Inside each folder there are more folders for each filter. For this particular image, I have HA and OIII images of the Veil Nebula, so there is a folder for each in the light folder, and a folder for each in the Flats folder. Bias and Dark can be shot as a single set (the filter doesn't matter since the frames are dark) and used to calibrate both HA and OIII.

APP-02.jpg

From here, I go to the Load tab in the left panel of APP and check the Multi-channel/filter processing setting, since I’m processing two channels/filters at once. You do both at once because you want the alignment of all stars registered across all images at the same time.

APP-03.jpg

If I had shot the same filters over multiple nights, I could select Multi-Session processing, which lets you load day 1, 2, 3, 4, etc. of each filter and use a different set of flats for each day of the same filter. This is especially useful if you re-image an object over multiple nights throughout a year: you can keep adding data to improve your image. But in this case, I shot all my images for each filter on a single night.

NOTE: If you have a One Shot Color (OSC) camera, and need to make some modifications to the images as they are processed, you would go to the RAW/FITS tab and you can select debayer options there before you load your lights in.

On the Load tab, press the Light button, navigate to your first lights folder, and select the images for the first filter. In this case, I’m selecting my HA images. You can select the first image in the list, scroll to the last, hold Shift, and select it; this multi-selects all images in the window, and you can then press Open to import them.

When you add them, it will ask which filter these light frames are; be sure to select the correct one. In this case they are Hydrogen-alpha images. After I select that, you’ll see a bunch of images associated with the Light button in the left pane. Press the Light button again to add the second filter’s images: go to your OIII light directory, select all OIII frames, open them, and choose Oxygen III for the filter.

Now your light frames are loaded. Do the same for flats, being careful to assign HA flats to the HA filter and OIII flats to the OIII filter; this ensures the right frames get calibrated with the right flats. Then add your dark and bias frames. For both of these, when asked which filter, pick the top option to apply them to all filters.

Calibration options

Now we’re going to set one calibration option: having the program create a bad pixel map. Go to the second tab, Calibration, scroll down, and find the checkbox "Create Bad Pixel Map". This creates a map of all bad pixels on your camera sensor and corrects for them when processing the final image.

APP-06.jpg

Integration options

This is a very straightforward process, just to give you an idea of how the program works. We’re only going to set a few things in this tab.

APP-07.jpg

We’re going to integrate per channel under the Multi-Channel/Filter options setting, since we’re processing multiple filters of light data. We’re also going to stack 100% of the images, because I’ve already gone through them and removed any frames where clouds or airplanes came into view. You could lower the integration percentage if you want the program to automatically drop the worst frames. I’m also going to set "weights" to quality; this looks at all the images in the set and integrates lower-quality images with a lower weight than higher-quality ones.

APP-08.jpg

I’m going to set "Outlier rejection" to "Winsor clip" and leave the rest of the settings at their defaults. This averages out satellites and other stray objects that get into the frames.
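APP’s implementation is more sophisticated, but the core idea of Winsorized rejection can be sketched as: clamp per-pixel outliers toward the stack’s consensus before averaging, instead of throwing whole frames away. A minimal sketch:

```python
import numpy as np

def winsorized_stack(frames, sigma=3.0):
    """Average a stack of frames after clamping per-pixel outliers.

    Values beyond `sigma` robust standard deviations (estimated from the
    median absolute deviation) are pulled back to the boundary, so a
    satellite trail in one frame barely affects the average."""
    stack = np.asarray(frames, dtype=float)   # shape (n_frames, H, W)
    med = np.median(stack, axis=0)
    mad = np.median(np.abs(stack - med), axis=0)
    robust_std = 1.4826 * mad + 1e-6          # small floor avoids zero width
    lo, hi = med - sigma * robust_std, med + sigma * robust_std
    return np.clip(stack, lo, hi).mean(axis=0)

# Nine clean frames at level 10, one frame with a bright satellite pixel:
frames = [np.full((2, 2), 10.0) for _ in range(9)] + [np.full((2, 2), 500.0)]
print(round(winsorized_stack(frames)[0, 0], 3))  # 10.0 (a plain mean would give 59.0)
```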

Finally, we now have all settings ready to go. It's time to start the integration process.

APP-09.jpg

Press the integration button, and APP will run through your entire set of images, creating master bias, dark, and flat frames plus the bad pixel map. It then applies them to calibrate all your light frames, aligns the frames via the registration process, and finally integrates them into two images, one for HA and one for OIII.

APP-10.jpg

Once complete, you should have a folder that looks something like this:

These image thumbnails were generated with a finder plugin that came with Observatory on the Mac.


Processing your calibrated images

From here, we're going to load integrated light frames in order to process them into a final image. If you look at the bottom of your files window in APP, the last two files on there should be your Integrated HA and OIII images.

APP-12.jpg

The first thing we're going to do is remove any light pollution that came from either the moon or any nearby lights (or city glow). Even narrowband images can be affected by light pollution, but it will not be quite as bad as RGB images.

Open the Tools tab (tab 9) and double-click the first integration image to load it into the viewer. You can now press Remove Light Pollution on the Tools tab to open it for editing.

APP-14.jpg

With your image loaded into the light pollution removal tool, use your cursor to draw boxes over any area that is sky only. Be careful not to draw boxes over areas containing nebulosity or image data; it’s OK to include stars. Once you have a good set of boxes covering most of the area, press the Calculate button, which removes the light pollution based on your current boxes.

APP-16.jpg

Once you see the effect on screen, it might reveal some hidden nebulosity that was masked by the light pollution. You can now add a few more boxes to finish refining the removal, then check by pressing Calculate again. If you’re happy with the results, press OK & Save. Rename your file here so you remember which version it is; each time you process an image, APP will have you save it, and while you can keep saving over the previous file, I find it best to rename after each save. I use HA-LPR in this case. Keep the format as FITS.

Now you can process your other channel the same way.
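Under the hood, tools like this fit a smooth sky model to the boxes you drew and subtract it. A minimal first-order sketch (APP’s actual model is more flexible than a plane):

```python
import numpy as np

def remove_gradient(img, boxes):
    """Fit a plane z = a*x + b*y + c to the median level of sky-only boxes
    (x0, y0, x1, y1) and subtract it, restoring the median sky level."""
    xs, ys, zs = [], [], []
    for x0, y0, x1, y1 in boxes:
        xs.append((x0 + x1) / 2.0)
        ys.append((y0 + y1) / 2.0)
        zs.append(np.median(img[y0:y1, x0:x1]))
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    coef, *_ = np.linalg.lstsq(A, np.array(zs), rcond=None)
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    plane = coef[0] * xx + coef[1] * yy + coef[2]
    return img - plane + np.median(plane)

# Synthetic frame: flat sky at 100 ADU plus a left-to-right gradient
yy, xx = np.mgrid[0:64, 0:64]
img = 100.0 + 0.5 * xx
corners = [(0, 0, 16, 16), (48, 0, 64, 16), (0, 48, 16, 64), (48, 48, 64, 64)]
print(remove_gradient(img, corners).std() < 1e-3)  # True: the gradient is gone
```

This is also why boxes over nebulosity are harmful: the fit would treat real signal as sky and subtract it.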

APP-17.jpg

Don’t worry about the edges of your frame. Due to the slightly different alignment of each frame (assuming you used dithering during image capture), the registration process leaves a few stray pixels on each side. It’s not necessary to remove light pollution there, as you’ll just crop it out in the next step.

Cropping your frames is a little tricky. You can load them one at a time using the batch modify tool and manually draw a box around each, but that’s imprecise, and you might accidentally crop each frame differently. To crop both at the same time with the same box, do NOT load an image before choosing "Batch Modify". Press Batch Modify, tell it not to load an image, and it will ask which files you want to modify. Select the two frames with light pollution removed. It will load the first frame; draw a box around it and press the Crop OK button. This crops both frames at the location you’ve indicated.

You'll now see two frames at the bottom of your list that are both cropped.

Combine RGB

This is the fun part of the process. We're now going to take your two individual frames and combine them into a single color image. 

APP-18.jpg

Now choose Combine RGB in the tools area of APP (left tab 9). It opens a new, empty area. From here we’ll add our images with the Add button: pick the cropped HA image and choose the HA channel, then pick OIII and choose the OIII channel. Since a full color image consists of three channels (red, green, and blue), we need to add the OIII frame a second time. In the left column, you should now have three channels listed, each with a few settings underneath. From here, we can assign which color each frame should be.

Take the slider under Hydrogen Alpha labeled R (for red) and slide it to 100%. Then set one of the OIII channels to B (for blue) at 100%, and the last OIII channel to G (for green) at 100%. Now press the Calculate button at the top of the column. This processes the three channels into an RGB image, and you should see the result in the main window.

Now press the create button, save it as your RGB integration (any name you choose), and keep the format as FITS.

Background calibration

APP-20.jpg

We're now going to use the background calibration tool on our new RGB image. This is going to make sure that the black in the background is a true neutral color. If we had just a little bit too much red, or blue, this will knock it out and make sure the black background is true black.

Load your image with the Background Calibration tool (tab 9). Draw boxes only around background areas of black sky and stars; don’t include any nebulosity, because we don’t want to neutralize those pretty colors. Press Calculate to see the results. If it looks good, press OK & Save, name your file, and pick FITS again for the format.
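The operation itself is simple to picture: measure the per-channel background level inside your boxes, then shift each channel so those levels agree. A sketch of the idea (not APP’s actual algorithm):

```python
import numpy as np

def neutralize_background(rgb, boxes):
    """Shift each channel so the sampled background boxes share one neutral level."""
    samples = np.concatenate(
        [rgb[y0:y1, x0:x1].reshape(-1, 3) for x0, y0, x1, y1 in boxes]
    )
    bg = np.median(samples, axis=0)   # per-channel background level
    return rgb + (bg.mean() - bg)     # offset each channel toward neutral gray

# An image with a reddish background cast:
img = np.zeros((8, 8, 3)) + [0.10, 0.06, 0.08]
print(np.round(neutralize_background(img, [(0, 0, 4, 4)])[0, 0], 2))  # [0.08 0.08 0.08]
```

Boxes containing nebulosity would skew the per-channel medians, which is exactly why the tool asks you to avoid it.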

Star calibration

APP-23.jpg

Because we're making a false color image with two filters, our star colors are going to be exaggerated a bit. I like to use the star calibration to bring them more in line with the typical star color temperature they should be. 

The Calibrate Stars tool is also in the tools menu (tab 9). Select the image where you calibrated the background and load it into the tool. Draw your boxes around large sets of stars, then press the Calculate button to process the image. You’ll notice your bright red stars drop to a more normal color; the stars are now in a proper color temperature range. You’ll also notice we lost a little color in the nebulosity, which we’ll bring back in the next step. Save your image, again naming it and picking FITS for the format.

Final processing

In this last step, I’m going to process the final color in another app I’m more familiar with. These steps can be done in APP using the tools shown to the right of the image, but I can achieve the results faster in an app I know better. In this case that’s Photoshop, though the same tools are available in a number of other Mac apps: GIMP (which is free), Acorn, Pixelmator, and others.

First things first: we need to get the last image out of APP and into a regular format for use in a standard image editor. In the upper right corner of APP, you’ll see a Save button. Load your star-color-calibrated frame, then press Save.

APP-21.jpg

Keep the stretch option checked, and it will export the image as you see it in the viewer. If you uncheck it, you’ll have to stretch the image in your other application instead; I find the default stretch here adequate. When saving, make sure to pick a format your image application can read. I picked 8-bit TIFF, but you can also pick JPG if you want a smaller file at the expense of a little quality.
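If you do end up stretching outside APP, the standard astro approach is a midtone transfer function: a curve that pins the black and white points while lifting a chosen midtone to mid-gray. A sketch of that idea:

```python
import numpy as np

def mtf(x, m):
    """Midtone transfer function: fixes 0 and 1, maps the midtone m to 0.5."""
    return (m - 1) * x / ((2 * m - 1) * x - m)

def autostretch(img, target=0.25):
    """Stretch a linear image so its median lands at `target`."""
    img = np.clip(img, 0.0, 1.0)
    med = np.median(img)
    # Solve mtf(med, m) == target for the midtone parameter m
    m = med * (target - 1) / ((2 * target - 1) * med - target)
    return mtf(img, m)

linear = np.array([0.0, 0.01, 0.02, 0.03, 1.0])   # faint, linear data
stretched = autostretch(linear)
print(round(float(np.median(stretched)), 3))  # 0.25
```

The faint data around the median gets most of the contrast, while saturated stars stay pinned at 1.0.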

I now load the image into Photoshop and apply some Curves to darken the blacks and brighten the lights.

APP-22.jpg

I also add some saturation here to make the colors more vibrant. With those two things done, I'm ready to save my final image and complete the process.

This tutorial shows the basic steps of processing with APP. The program is capable of much more, and I’ve only scratched the surface here; to achieve the best results, experiment with all the tools and see what you can do.

The final processed Veil Nebula image

veil_nebula.jpg

Astronomy and astrophotography planning with AstroPlanner on the Mac

Overview of AstroPlanner

AstroPlanner is a complete system for tracking observations and planning out nightly viewing or imaging sessions with your equipment. It also offers computer scope control from within the application.

Upon launching the software, you’ll need to populate it with your information. You’ll provide your observing locations; these can include your current location as well as offsite locations you visit for observing. AstroPlanner can access a USB GPS device to give you pinpoint accuracy for your site location. This lets you plan remote visits before you travel, so you can be prepared with the equipment you need for the objects you plan to view or image.

The filter resource. Add your filters on this tab, and AstroPlanner will show you the visible wavelengths you can view or image with.


In addition to your location, you can add each telescope you own, any eyepieces you have, optical aids like Barlows or reducers, camera or viewing filters, observers (yourself or a buddy who might observe with you), and any cameras you might use for imaging.

camera.png

Once you’ve added all your equipment, you can start adding objects to the observing list. There are four primary tabs for objects: the Objects list; the Observations tab, where you record observations; the Field of View tab, which shows how your image will look with the selected equipment; and the Sky tab, which shows the night sky chart, letting you see where the selected object lies as well as what else is visible.

The Objects view in AstroPlanner

astroplanner-main.png

This is the main view within AstroPlanner. From here you add objects using the plus symbol in the lower left corner of the screen; a search function helps you find an object and add it to the list. You can also browse what is currently visible in the sky, filtering by object type (open cluster, galaxy, nebula, planetary nebula, etc.). Across the top of the screen is a readout of the current date and time, sidereal time, Julian date, GMT, and GMST. On the row below, you can select the telescope you intend to use. Next to that are the sun and twilight times and the current moon phase, which is helpful for judging how much impact the moon’s brightness will have on imaging. Beside those are your site location and a clock you can set to show the object at different times.

On the next row you see the object’s ephemeris for the night and month. This shows the object’s elevation during the darkest part of the night, between sundown and sunrise, and its visibility over the month. Next are altitude and azimuth indicators measured from due north, giving you an idea of where to point your telescope; in the image above, they indicate pointing east and slightly above the horizon. Lastly, there is a tiny indicator of where the object sits in the night sky.
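That altitude/azimuth indicator is just a coordinate transform from the object’s RA/Dec, the site latitude, and the sidereal time. A sketch of the math (the latitude and LST values below are illustrative assumptions; the RA/Dec is the IC 417 field solved earlier in this post):

```python
import math

def alt_az(ra_deg, dec_deg, lat_deg, lst_deg):
    """Equatorial (RA/Dec) to horizontal (alt, az from north) coordinates.

    `lst_deg` is the local sidereal time expressed in degrees."""
    H = math.radians((lst_deg - ra_deg) % 360.0)       # hour angle
    dec, lat = math.radians(dec_deg), math.radians(lat_deg)
    alt = math.asin(math.sin(dec) * math.sin(lat) +
                    math.cos(dec) * math.cos(lat) * math.cos(H))
    az = math.atan2(-math.cos(dec) * math.sin(H),
                    math.sin(dec) * math.cos(lat) -
                    math.cos(dec) * math.sin(lat) * math.cos(H))
    return math.degrees(alt), math.degrees(az) % 360.0

# At culmination (LST == RA) from an assumed ~29.5 N latitude, a +34.7
# declination target sits just north of the zenith:
alt, az = alt_az(82.351, 34.736, 29.55, 82.351)
print(round(alt, 1), round(az, 1))  # 84.8 0.0
```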

At the bottom of the screen are your object list and the local sky chart, showing the constellation where your object is. You can switch the chart to show images from several astronomical databases, such as raw Hubble Space Telescope images.

The Observations view in AstroPlanner

Observations 1.png

This tab highlights observations for the currently selected object. From here you can put in seeing and transparency conditions, note your field of view, and add any observations you made of the object during this particular time and date.

Observations 2.png

Additionally, you can add attachments to your observations. In this case, I added an image I took with my telescope of NGC7000. I left an observation note listing out the focal length and equipment I used for this session.

Field of view in AstroPlanner

Field of View.png

This tab lets you select all of your equipment for the viewing session. In this instance, I picked the AT6RC scope with a CCDT67 reducer and the TeleVue Delos 4.5 eyepiece. With the current object M33 selected and a Hubble Space Telescope image loaded, I can see what it would look like through that particular set of equipment. You can choose additional display options in the lower right corner to overlay known stars, object names, and so on.

The Sky view in AstroPlanner

Sky view.png

In this final object view screen, the Sky tab, you can see a sky chart of where your object is in the night sky. You can turn on and off planets, stars, galaxies, etc using the display options to the right to fine tune the view and make it easier for you to spot your object in the night sky.

I hope this gives you a good indication of the use and benefit of having a detailed planning tool. AstroPlanner is available here and is priced at $45, which doesn't seem like that much for all the features that it offers.

Using EKOS to capture the Wizard Nebula (NGC7380) on the Mac

main_window2.png

This will be a general walkthrough of a typical capture session with my AstroTech AT6RC setup.

This should be a good walkthrough for someone unfamiliar with the system who wants to capture their own images. I’m not covering equipment setup in this post, but I might cover it in a future one.

A few caveats about my particular setup: I break it down and set it up each night, so I need a new polar alignment before each session. Also, my AVX mount doesn’t fully support the park function in EKOS, so I can’t auto-park at the end of a night; others may have a mount that supports this feature.

Connect your equipment

The first step in my process is to set up all my equipment, connect it to the Mac laptop, and start with an All-Star Polar Alignment (I can’t see Polaris, so I use this method built into Celestron mounts). After that procedure is complete, I load up KStars, then press the EKOS button on the top bar to launch the EKOS capture system. I then press the Connect button to connect to my equipment, which I configured in EKOS before the night’s session.

Focus Module

focus_window.png

The mount is probably still aimed at the last alignment star from your polar alignment, and that’s typically good enough for focusing. I select the Focus Module and press the capture button, which grabs a single frame and displays it in the preview window. Since I have a motorized MoonLite focuser, I can select a star with the cursor (a green box appears around it on screen), then press the Auto Focus button. The autofocus routine steps the focuser in and out, measuring the effect on FWHM (Full Width at Half Maximum), which tracks the width of the star profile, and iterates until the stars are as small as possible. Now we’re in focus and can move on to the next step. (Side note: if you don’t have an automated focusing system, you can use the camera module’s live preview feature and a Bahtinov mask instead.)
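EKOS’s real routine fits curves to its FWHM measurements; the basic hill-climbing idea can be sketched with a toy focuser, where the parabola standing in for real FWHM measurements is invented for illustration:

```python
def autofocus(measure_fwhm, position, step=50, min_step=5):
    """Crude hill-climbing autofocus sketch: step the focuser in one
    direction while FWHM improves, reverse and halve the step when it
    worsens, stopping once the step is too small to matter."""
    best = measure_fwhm(position)
    direction = 1
    while step >= min_step:
        candidate = position + direction * step
        fwhm = measure_fwhm(candidate)
        if fwhm < best:
            position, best = candidate, fwhm
        else:
            direction = -direction   # overshot the minimum: reverse...
            step //= 2               # ...and refine the step size
    return position

# Toy optics: FWHM is a parabola with its minimum at position 1234
f = lambda p: 2.0 + ((p - 1234) / 100.0) ** 2
print(autofocus(f, 1000))  # lands within a few steps of the true focus at 1234
```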

Mount Control Window

mount_window.png

The next part of the process is to open the Mount Control module and select "Mount Control" in the upper right of the window. This opens a small control pad with arrows and a target search for moving your mount. I press the search icon and type in the name of a simple, easy-to-identify target for plate solving — usually an open star cluster. I selected NGC129, then pressed the GOTO button to slew the mount to that target.

Alignment Module

alignment_window.png

Now that I have slewed to NGC129, I press the Alignment Module tab to go through a plate-solving process that improves the GOTO model in both the mount and EKOS. You want to do this for two reasons: it increases slewing accuracy, and it confirms that once you pick your target and slew to it, you're actually on the target you picked. It also helps with the meridian flip, ensuring that after the mount flips past the meridian, your target is picked up in exactly the spot where it left off before the flip.

Usually the first thing I do is select Sync under Solver Action, then press Capture & Solve. All I'm doing here is plate solving the current position to tell the mount exactly where it's aimed. I had told it to aim at NGC129, but this first solve shows the mount is way off. Not knowing for sure whether this was an adequate target, I pick a new one using the mount control and aim at M39, an open cluster. I once again set it to Sync and press Capture & Solve. Now I'm fairly close to the target, but not quite in the green. I press GOTO one more time now that the mount knows where it is, then Capture & Solve with Sync once more to see whether the last slew got closer. Finally, we're in the green and good to go to our final imaging target for the night. I pick the Wizard Nebula, NGC7380, and press GOTO. Once there, I perform a Capture & Solve with Slew to Target. This runs multiple capture-and-solve passes, moving the mount each time, until the target is lined up perfectly, and then it stops. Time to turn on guiding.
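Conceptually, the capture/solve/sync/goto cycle is just measuring the pointing error and folding it back into the mount's model until the residual is within tolerance. Here's a toy Python model of that loop — the `Mount` class and its single fixed offset are illustrative assumptions, not how a real pointing model behaves:

```python
class Mount:
    """Toy mount whose reported position differs from the sky by an offset."""
    def __init__(self, offset):
        self.offset = offset      # unknown pointing-model error (degrees)
        self.reported = 0.0       # where the mount believes it is aimed

    def goto(self, target):
        self.reported = target    # slews per its own (flawed) model

    def true_position(self):      # what plate solving a capture would return
        return self.reported + self.offset

    def sync(self, solved):       # fold the measured error into the model
        self.offset -= solved - self.reported

def align(mount, target, tol=0.01, max_tries=5):
    """Re-slew and sync until the solved position is within tolerance."""
    mount.goto(target)
    for _ in range(max_tries):
        if abs(mount.true_position() - target) <= tol:
            break
        mount.sync(mount.true_position())  # "capture & solve", then Sync
        mount.goto(target)                 # GOTO again with corrected model
    return mount.true_position()
```

In this idealized model a single sync fixes everything; in practice the error isn't a simple constant, which is why a couple of rounds (and a final solve-and-slew on the real target) are usually needed.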

The Guide Module

guide_window.png

With our target picked and the GOTO plate solved onto the target, we're ready to start guiding. This process is fairly straightforward. Dithering is turned on by default (you can check it via the options button in the lower right corner of the window). Press capture, and you'll see a single image from your guide cam. Select a star with your mouse and it's highlighted with a green box. Press Guide, and the guiding calibration begins. This process is automatic, and you can watch the steps it performs in the text window at the bottom of the screen. Once calibration completes, guiding starts automatically. Now it's time to program our image sequence and start capturing.

The Sequence Module

sequence_window.png

This is the final step in my process for an evening capture session. For the Wizard Nebula, I had planned to capture bi-color over two nights. Tonight is the first night, so I only plan to capture 7-8 hours of Ha (Hydrogen-alpha filter) — basically as long as I can before the sun comes up. Tomorrow night I'll capture OIII (Oxygen III filter) for another 7-8 hours using the same routine. Since I have a cooled camera, the first thing I do here is set its temperature to -15°C and press the set button; the temperature quickly begins to drop. If I check the box next to the temperature, the sequence won't start until the temperature has been reached. Next I plug in my exposure time — I've set it to 180s, or 3-minute images — and a count of 240, which is more than enough to cover me until sunup. I make sure the type is set to "Light" for light frames (as opposed to dark, bias, or flat). I set the filter to H-a, then under file settings I name the files with a prefix, in this case NGC7380, and I check off Filter, Duration, and TS (Time Stamp) so those are appended to the file names I'm capturing.
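The prefix plus the checked-off Filter, Duration, and TS fields effectively build each filename. A small Python helper sketching that pattern — the exact format string is my guess at the convention, not EKOS's precise output:

```python
from datetime import datetime

def frame_name(prefix, filt, duration_s, ts=None):
    """Build a light-frame filename from the prefix plus the checked-off
    fields: Filter, Duration, and TS (time stamp). Format is illustrative."""
    ts = ts or datetime.now()
    return f"{prefix}_{filt}_{duration_s}_secs_{ts:%Y-%m-%dT%H-%M-%S}.fits"
```

With the settings above you'd get names along the lines of `NGC7380_H-a_180_secs_2019-01-04T21-00-00.fits`, which makes sorting frames by filter trivial later in preprocessing.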

Now that I've set all the parameters for my sequence, I add them to the sequence queue by pressing the "+" button at the top. This adds the entry to the queue on the right. If I lived in a dark area and wanted to capture more than Ha during the evening, I could change my parameters and add sequences for OIII, SII, or LRGB, making sure I only put enough time into each that the whole sequence can finish by the end of the night. But since I'm in a light-polluted area, I need as much time as I can get on each filter, so I typically spend one evening per filter and get decent imaging results.
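Splitting a night across filters is simple arithmetic. A quick hypothetical helper (the name and interface are mine, not an EKOS setting):

```python
def frames_per_filter(total_hours, filters, exposure_s):
    """Split an imaging window evenly across filters, in whole frames."""
    per_filter_s = total_hours * 3600 / len(filters)
    return {f: int(per_filter_s // exposure_s) for f in filters}
```

A 7-hour window split across Ha, OIII, and SII at 180s subs works out to about 46 frames (roughly 2.3 hours) per filter — which is exactly why, under light pollution, dedicating a whole night to each filter is attractive.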

We're done now with setting the sequence, and we're ready to run it for the evening. You'll press the play button at the bottom of the sequence window, and your camera will start capturing images until the sequence is complete. You can now tab over to the main window and watch the images roll in for the evening, or head to bed like I do, ready to wake up by sunrise and take down all the equipment before it gets too hot outside. (I live in the south where it's quite warm during the day).

main_window.png

From here you can monitor the images that are being captured for the sequence you've plugged into the sequence editor.

Below is the final processed image from two nights of imaging. I processed it with Astro Pixel Processor, PixInsight, and Photoshop on my iMac Pro workstation. Full equipment details can be found at Astrobin.

the wizard.jpg

Getting Started with Astrophotography on the Mac

Deep sky object, the Crescent Nebula, was imaged over 30 hours on my Explore Scientific FCD100 setup. Using a more advanced program called EKOS.

Recommendations for your start in imaging on the Mac

There are a few things to cover as a starting point. First, you have to decide whether you want to take photos of the planets and Moon, or of nebulae, star clusters, and galaxies — basically, the choice between planetary and deep sky objects.

Planetary imaging on the Mac

Planetary imaging is fairly straightforward. Large-aperture scopes, 6" and above, are great for this, and you don't need an equatorial mount — any alt/az mount will work. A high-speed webcam or astro camera and a Mac laptop are the only additional entry-level hardware requirements. Since the planets are relatively small, the larger the scope, the closer/larger they will appear, and the more detail you can get out of your images.
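The aperture/detail relationship has a simple rule of thumb behind it: the Dawes limit, which estimates the finest resolvable detail for a given aperture.

```python
def dawes_limit(aperture_mm):
    """Dawes limit: finest resolvable detail in arcseconds, ~116 / aperture (mm)."""
    return 116.0 / aperture_mm
```

A 5" (127mm) scope resolves down to roughly 0.9", while a 60mm scope bottoms out near 1.9" — hence the preference for larger apertures when chasing planetary detail.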

Recommended starting software for planetary imaging:

  • Planetary Imager - for taking pictures or videos: free

  • SiriL - for stacking planetary images: free

  • PixInsight - for processing your planetary images to get the most detail out of them: €230

Unfortunately, planetary processing software is a gap on the Mac right now. You need wavelet processing to get the most detail out of your images, and currently PixInsight is the only real option. There are two other apps that might run on older hardware and operating systems (Lynkeos and Keith's Image Stacker), but they're no longer developed and crash often on modern hardware. They are, however, free.

For more advanced options, you might switch out Planetary Imager for FireCapture.

moon.jpg
5SVgZxEaVcsW_1824x0_wmhqkGbg.jpg

Deep sky object imaging on the Mac

DSO imaging requires a little more effort. Because this type of imaging relies on long exposures, where accurately tracking your object across the sky is a requirement, you'll need a German equatorial mount (GEM). Deep sky objects vary greatly in size, and a large number of them appear bigger than the Moon in the night sky, so a large scope isn't a requirement to get started. In fact, it's preferable to start with a smaller scope, like an 80mm refractor. The reason is that the larger your scope, the more accurate your tracking needs to be, and the better your mount needs to be to handle the weight; the difficulty goes up exponentially with larger telescopes. So start small. All of the telescopes I use are relatively small (under 6" in aperture), and all fit on my entry-level GEM, the Celestron Advanced VX.
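One way to quantify the difficulty jump is pixel scale, which follows the standard formula 206.265 × pixel size (µm) ÷ focal length (mm). The smaller the number, the more any guiding error shows in your subs:

```python
def pixel_scale(pixel_um, focal_mm):
    """Image scale in arcsec/pixel: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_um / focal_mm
```

With a 3.8µm-pixel camera, a short 571mm refractor works out to about 1.37"/px, while a 1250mm scope lands near 0.63"/px — the same one-arcsecond guiding wobble smears across more than twice as many pixels, demanding a correspondingly better mount.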

Additional requirements are going to be a guiding camera and guide scope. This is essentially a small telescope mounted on top of your main scope, with a guide camera. This camera's job is to watch the star movement, and send corrections to your GEM mount when the mount isn't moving accurately. For entry level equipment, this is a necessity, as these mounts are far from accurate for long exposure imaging.

You'll also need a main imaging camera, and your options vary widely here. You can use a DSLR (maybe you already own one) or a dedicated astrophotography camera, which can be color or mono. A mono camera is black-and-white; combined with color filters, it can achieve a higher-fidelity color image than a regular color camera can, but with more effort and expense.

Recommended starting software for deep sky imaging:

For more advanced options, you might switch out Astro Imager for EKOS, and Astro Pixel Processor for PixInsight or StarTools.

Cocoon Nebula.jpg

An Overview of Double Stars on the iPhone

IMG_1133.PNG
IMG_1139.PNG

Here's another well-done, simple, and focused astronomy application recently added to the App Store. Double Stars, as its name implies, is a quick way to look up and keep track of your double star viewing. What makes this app special is its Sky Tour of the night sky: Double Stars points out the best double stars visible from your location and sets you up with challenges to see if you can split the night's doubles (magnify the stars enough that they appear as the two stars they really are).

The Sky Tour has a nice location-assistant feature that uses your phone's gyroscope and GPS to locate your position, then lets you rotate around and up to the correct altitude to easily find the star pair.

IMG_1135.PNG
IMG_1138.PNG

This handy application can be a lot of fun at a stargazing party, or right in your back yard. Mark your favorite double stars, and note whether you were able to split them with your equipment. It can be downloaded through the iPhone App Store.

IMG_1134.PNG
IMG_1136.PNG

An Overview of MeteorActive on the iPhone

IMG_1128.PNG
IMG_1129.PNG

MeteorActive is a great way to stay on top of all the meteor showers headed your way. The iPhone application is very straightforward in its premise: it details major and minor events, listing them in date order so you can see how many meteors to expect per hour. I had no idea there were this many going on. Detail views of each meteor shower show the best viewing times during the evening and the hour at which each shower peaks. Every astronomer should have this app handy. It's available on the iPhone App Store.

IMG_1130.PNG
IMG_1131.PNG

An overview of Observer Pro on the iPhone

IMG_1115.PNG
IMG_1114.PNG

Observer Pro is an invaluable application for choosing targets in the night sky. It contains objects from all the large catalogs — Messier, Caldwell, Herschel, NGC, and IC — and you can also browse objects by type, including galaxies, galactic nebulae, planetary nebulae, open clusters, and globular clusters. The views above show the listing view and object view. In the listing view you can see each object's visibility over the 24-hour period, depicted by an orange/green bar; the best viewing time for that object is shown in green. Bright blue depicts day, dark blue depicts moonlight, and black is night with no moon.

You'll notice some gaps in the orange/green bars shown above; that's because I have my local horizon measured in the application. The app uses mixed reality, viewing through your phone's camera, to trace the horizon of your sky (in this case my back yard, which has several large trees that block my view).

IMG_1123.PNG

It's best to position yourself where your telescope would be, then use your phone as a viewfinder and draw your local horizon, tracing any objects that might be in the way. The result should look something like the above. Once done and saved, you can turn on local horizon and have it show when objects go behind obstacles on your horizon.

IMG_1118.PNG
IMG_1119.PNG

These views depict where the object is during the night sky, and show a finder view to help you see what the sky and object location look like with no obstructions. Other views depict detail of each chart available to each object, as well as detailed information on each object.

Detail view showing the rise and set of the sun and moon, as well as the object rise and set time, and depicted in green is the best available time for imaging with no moon.

This view shows detailed information on the object, including it's magnitude, location, and what type of object it is.

To top it all off, you can export your local horizon out of the application and convert it to a horizon file for use in Sky Safari. I have my local horizon loaded into Sky Safari 6 Pro on both my iPhone and my Mac desktop computer.

Here's my local horizon file loaded into Sky Safari 6 Pro. To export your horizon file, you must download a converter program (from Joshua Bury, the creator of Observer Pro) and a script to run the converter from Processing.org; it takes your horizon file and outputs a PNG that can be imported into Sky Safari.

I highly recommend Observer Pro to anyone who owns an iPhone. It's an invaluable resource for quickly scanning the object lists to see what's visible in the night sky, and it has a handy night mode for use in the field.

IMG_1125.PNG
IMG_1113.PNG

Celestron's C5 Spotting Scope: Maximize Your Grab-And-Go

IMG_9427.jpg

Make no mistake, this is a BIG little grab-and-go — the biggest I could fit on the iOptron Cube Pro mount. The AT60ED wasn't cutting it for planetary views. The mount has a weight limit of 8lbs, so I opted for the largest scope with the longest focal length I could fit on this class of mount. That happened to be Celestron's C5 Spotting Scope. Yes, Celestron sells this as a spotting scope, but at its heart it's really a 5" Schmidt-Cassegrain telescope with StarBright XLT anti-reflective coatings. Its focal length is 1250mm at f/10, perfect for planet and Moon viewing. It has almost three times the focal length of the AT60ED and weighs only 6lbs — two pounds under the mount's weight limit, which gives me room to add a few light accessories. It pairs well with eyepieces from 32mm down to about 6mm.

IMG_9429.jpg
IMG_9431.jpg

Imaging the Elephant Trunk Nebula - IC1396

IC 1396.jpg

I imaged this nebula over two nights for a total of 14 hours: 7 hours of H-a and 7 hours of OIII, each captured on a separate night. Thanks to EKOS's clean organization during imaging, all the data was sorted into proper folders for me. From there, I was able to load everything into Astro Pixel Processor at once, setting up each channel and its specific calibration frames, then leave it to process on its own. It took about 2 hours to work through all 500 images and calibrate them.

After that, I was left with two calibrated master frames, one for H-a and one for OIII. I brought them into PixInsight to crop and align them, then stretch them to appropriate levels.

I then used PixelMath inside PixInsight to combine the two monochrome images into the bi-color palette shown above. Finally, I moved the file into Photoshop to enhance the color and contrast slightly, as I prefer Photoshop's controls for this task. Below are some other PixelMath combinations showing how you could combine the two images into other color palettes. You can see all the equipment details over on my Astrobin page.
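For readers curious what a bi-color combine does mathematically, here's a minimal NumPy sketch of one common HOO-style mapping — Ha to red, OIII to green and blue, with an optional blend of Ha into green. The blend weight is purely illustrative; the PixelMath expressions below show other variants.

```python
import numpy as np

def bicolor(ha, oiii, green_blend=0.0):
    """Map two mono narrowband frames (2-D float arrays, 0..1) to RGB.
    green_blend mixes some Ha into the green channel for a warmer palette."""
    g = green_blend * ha + (1 - green_blend) * oiii
    return np.dstack([ha, g, oiii])  # stack as an (H, W, 3) RGB image
```

Swapping the channel assignments or the blend weights is all a "different palette" amounts to at this stage; the aesthetic work happens in the stretch and color tweaks afterward.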

Screen Shot 2018-07-08 at 3.45.07 PM.png

A Night At The George Observatory

IMG_0822.jpg

A few weeks ago, I was able to attend a special event at the George Observatory in Brazos Bend State Park, just south of Houston, Texas. We arrived at sundown and were given treats, coffee, and paper star maps. There were about 15 people in the group, and we were going to spend some time in the main observatory on the 36" Gueymard Research Telescope, one of the largest telescopes in the US available for public viewing. After a brief introduction, we were led up to the observatory, where it was lights off. They began the viewing session by picking out a few objects, starting with Jupiter. We got a great view of the bands and moons; however, the eyepiece in place gave fairly low magnification, and the view didn't look much more impressive than through my 6" AT6RC.

The second target was M3, and the eyepiece was switched to something with more magnification — I overheard it was a Takahashi. This time, I was impressed. What had only appeared as a blur in my telescopes now appeared brilliant, and I could make out hundreds of individual stars in the cluster. They also have 11" and 6" refractors hanging off the side, and the views through the 11" refractor were equally impressive.

We moved on next to two of the Leo Triplet galaxies (M65 & M66). Both fit in the field of view. They were still fairly dark despite the light-gathering power of the scope, and much brighter objects awaited in the night sky for our next viewing.

It was interesting to see the arcane controls on such a large scope. Dials read out coordinates in hours, minutes, and seconds, and coordinates were plotted from object to object. Once at a known location, they could plug in how far to rotate from the current object to reach the next one, using an iPad to look up object coordinates.

In all, it was a fun night, and they stayed as late as we wanted to punch in coordinates for new objects. The observatory is open most Friday and Saturday nights to the public. There's an admission fee, but plenty of scopes to check out. In addition to the three primary scopes in the main observatory, they also have a 14" Celestron SCT, and 18" Newtonian reflector in the West and East domes respectively.

An Overview of EKOS Astrophotography Suite on the Mac

This is the main EKOS window. On the left are tabs that represent different sections of the application.

EKOS is the capture suite that comes as part of the KStars observatory software package. It's a free, fully automated suite for capturing on Mac, Linux, and PC — not too dissimilar to Sequence Generator Pro on the PC. And while the capture suite comes with KStars, you're not limited to using KStars: EKOS will also let you send commands to your mount from SkySafari on the Mac.

I'll break down its use and capabilities screen by screen.

Main Window

In the main window shown above, you see tabs representing each part of the application: the Scheduler, Mount Control, Capture Module, Alignment Module, Focus Module, and Guide Module. From the main window you can see the most recently captured image, the seconds remaining on the current exposure, which image number you're on in the sequence, and the percent complete of the entire sequence, with hours, minutes, and seconds remaining. Additionally, to the right of your image, you see your target and tracking status, focus status, and guiding status.

Scheduler

This is the Scheduler window, where you can pick your targets, and assign capturing sequences to them.

From the Scheduler, you can pick your targets and assign them capture sequences (which are set up in the Capture Module). There are also some overall parameters you can set here for starting and ending a session. If you have a permanent observatory, you can do things like open and close it with startup and shutdown sequences, or set parameters for when to run your schedule based on twilight, weather, or the phase of the moon. The Scheduler lets you set up multiple imaging sessions, mosaics, and more. As twilight approaches, it will start up and begin imaging based on priorities you set, or on each object's visibility in the night sky. Imaging sessions can be set for a single night, or carried over multiple nights if they can't be completed in one.

Mount Control

Screen Shot 2018-04-22 at 9.30.25 PM.png

Mount Control is fairly straightforward. This window shows the current aperture and focal length of your selected equipment, and you can save multiple equipment configurations here for the various telescope and guide scope combinations you might have. Current tracking information is also shown. If you select Mount Control in the upper right of the screen, a floating window pops up with arrow buttons, speed settings, and GOTO functions for manually controlling the mount. You can search for a target and manually go to an object in the sky to start an imaging session without setting one up in the Scheduler.

Capture Module

Screen Shot 2018-04-22 at 9.30.22 PM.png

From here you control all aspects of your imaging camera, including setting up imaging sequences. For instance, I might have 7 hours of darkness to image before the sun rises. I can divide that time between filters and save a sequence — say 120 captures at 60s each at -20°C per filter — which I can later load and reuse any time I want to run that session in a 7-hour window. Or I could decide I want 20 hours total on an object and set the parameters for each filter to fill a 20-hour session. Or maybe I want one sequence for LRGB and one for narrowband. You can also set up flat, dark, and bias sequences. Flats have an awesome automatic mode: you set a predetermined ADU value, and it exposes each filter automatically to the same ADU, capturing all your flats in a single automatic session. It also supports hardware like the Flip-Flat, so flat sessions can run immediately after a night's imaging. Additionally, you can set guiding and focus limits for imaging sessions and control when your meridian flip occurs.
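The automatic flats mode boils down to scaling exposure toward a target ADU, assuming the panel's brightness is roughly linear in exposure time. A hypothetical sketch of that idea (the `capture` callable stands in for taking a flat and measuring its mean ADU):

```python
def flat_exposure(target_adu, capture, t=1.0, tol=500, max_iter=20):
    """Scale exposure time until the measured mean ADU hits the target,
    assuming brightness is approximately linear in exposure time."""
    for _ in range(max_iter):
        adu = capture(t)
        if abs(adu - target_adu) <= tol:
            return t                  # close enough: use this exposure
        t *= target_adu / adu         # rescale toward the target ADU
    return t
```

Because each filter passes a different amount of light, the routine converges to a different exposure per filter, which is exactly what lets one automatic session produce well-exposed flats for the whole filter wheel.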

Focus Module

Screen Shot 2018-04-22 at 9.25.17 PM.png

Here you can control all focus functions if you have a computer-controlled focuser (I highly recommend getting one). Focusing can be set up to run automatically: it captures a single image, auto-selects a star, then runs a sequence of captures while moving the focuser in and out. Each time, it plots the HFR on a curve, trying to find the best point of focus. Depending on seeing conditions, it can nail focus in 3-4 iterations, or sometimes 20. All focusing parameters, including threshold and tolerance settings, are controlled in this window.
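The curve-plotting step can be approximated as fitting a parabola through (focuser position, HFR) samples and taking its vertex as the best-focus estimate. A minimal sketch of that idea — EKOS's real algorithms are more involved than this:

```python
import numpy as np

def best_focus(positions, hfrs):
    """Fit a parabola to (focuser position, HFR) samples;
    the vertex is the estimated point of best focus."""
    a, b, _c = np.polyfit(positions, hfrs, 2)  # quadratic least-squares fit
    return -b / (2 * a)                         # vertex of a*x^2 + b*x + c
```

With clean samples, a handful of points on either side of focus pins down the vertex; poor seeing scatters the HFR measurements, which is why the module sometimes needs many more iterations.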

Alignment Module

Screen Shot 2018-04-22 at 9.30.16 PM.png

From this window you can polar align (assuming you can see Polaris), and also plate solve to center an object or improve GOTO accuracy. Since I can't see Polaris from my location, I have to use my mount's built-in All-Star Polar Alignment process; then I come to this window to capture and solve a target to improve the GOTO accuracy. There are several nice features accessible here. You can load a FITS file from a previous imaging session, and it will plate solve the image and move your telescope to that precise point to continue the session. Or you can select targets from the floating mount control window, then capture and solve, or capture and slew, to bring the mount as close to the target's center as possible. EKOS automatically uses this function during an imaging session to initially align on a target and then realign after the meridian flip.

Guide Module

Screen Shot 2018-04-22 at 9.30.11 PM.png

The guide module handles all guiding through your guide scope and camera. Press capture in the upper left, then hit Guide: a star is automatically selected, calibration starts, and once calibrated, guiding begins. Options can also be set for dithering and guide rate. For people who prefer PHD2, EKOS integrates seamlessly with it, and even shows PHD2's guiding graphs within the app and on your overview tab. I've personally had no issues using EKOS guiding, and it has the additional benefit of being able to reacquire a guide star after clouds interrupt your imaging session and continue once the sky is clear again.

Overall thoughts

As someone who images regularly and doesn't have a permanent setup (like an observatory), I like how much of my night's imaging session the application can automate. There is little else on the Mac that is this full-featured. The Cloudmakers suite comes in a close second for me, and is initially easier to set up and use. TheSkyX is also a full-featured suite, though I haven't used it. The setup process with EKOS isn't too difficult once you understand how the modules interact with each other and what all the options do. I hope this brief overview gives you enough of an idea to set up and use the software on your own. EKOS has a healthy number of contributors, sees updates on a roughly monthly basis, and has good support through its user forums.

The final image of M101 taken during this session that I captured the above screens from. This was actually 17 hours done over three imaging sessions.