
D5600 Shutter Frame Rate vs Live View *.avi Rate


Andante_2

Question

Hello,

This question refers to a D5600 body and BYN v2.1.0 using Planetary Mode and 5x Live View.

When recording using 5x Live View, BYN and my computer (2018 Apple Air, dual boot w/Windows 10, Thunderbolt port) are downloading frames at a rate of ~100 fps, actually 91.8 fps or some such. This is indeed what is recorded within the *.avi file, EXCLUSIVE OF THE SHUTTER SETTING. For example, a 600 frame collection at 1/10 s integration time should require 60s of clock time plus a bit of additional time for downloading each subframe window. However, the actual collection is complete in ~ 600 frames / 100 fps = 6s! I can only conclude that Live View is being sampled at ~100 fps regardless of the shutter time. This means the same actual frame (which is integrating over 1/10th s) is being displayed by Live View and captured into *.jpg files (which are subsequently compiled into an *.avi movie) multiple times per actual camera integration (i.e., shutter "open") interval.
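In numbers (these are just the values I quoted above, written out as a quick Python check):

# Back-of-the-envelope check using the numbers quoted above (not measurements
# of anyone else's setup).
frames = 600            # frames requested for the AVI
shutter_s = 1 / 10      # nominal shutter (integration) time per frame
liveview_fps = 91.8     # observed LiveView download rate

expected_s = frames * shutter_s                  # ~60 s if every frame were a fresh exposure
observed_s = frames / liveview_fps               # ~6.5 s at the observed download rate
copies_per_exposure = shutter_s * liveview_fps   # ~9 near-identical frames per 1/10 s

print(f"expected {expected_s:.0f} s, observed {observed_s:.1f} s, "
      f"~{copies_per_exposure:.0f} duplicates per exposure")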

Some background follows.... The iso level appears to be properly set within BYN because the planetary image brightens or darkens as appropriate within the Live View window of BYN. The same is true for the shutter time setting: increase the shutter time and the Live View image brightens, as it should. Take my previous example: if BYN is sampling the Live View stream at, say, 100 fps (or directly reading it out at that rate), I am essentially collecting 10 of the same images for each 1/10th s exposure. I obviously want to know how to collect a single Live View frame per camera integration time.

For what it is worth, and this may not be relevant: the download rate of ~100 fps is the same whether I turn the manual movie mode "on" or "off" in the BYN settings block. The difference is that, with manual "off," the camera seems to set its own iso and shutter speed, and the images are often overexposed. So ... I turn manual "on."

Thank you in advance,

Don

You are correct, the shutter setting does not control the duration of each frame or the frame rate. It may only affect the brightness of each frame.

I would also say that the Nikon SDK does not download or create AVI video files. It downloads individual LiveView frames and BYN controls the assembly of those individual JPG frames into a video file by means of DCRAW, a video tool that is installed with BYN.


Thanks for the quick response,

OK, so the download rate of LiveView frames is "controlled" only by the diameter of the data firehose. In my case, the data rate is ~92 fps at 640 x 424 pixels per frame. This means I am grossly oversampling my desired integration interval of ~1/5 to 1/50 s (depending upon the planet). Given that I can download only about 2000 frames in one burst, and that concatenating smaller *.avi files into a single longer one requires 3rd-party S/W I don't have, I am limited to ~2000 frames per capture. Oversampling reduces the number of "different" images: at, say, a 1/10th s integration time and a 1/100th s frame capture time, I collect roughly 10 copies of each exposure, so my 2000 frames shrink to an equivalent stack of only 200 independent images (I've sketched this arithmetic, along with the ISO trade-off in option (1), after the list below). Here are some possible fixes, and I'd like your opinion:

(1) jack up the iso, perhaps to 3200 or even 6400. Assuming 13-bit significant data at iso=200, I would lose a factor of 2^5 = 32 dynamic range at iso=6400 BUT preserve 13 - 5 = 8 bits of full-well dynamic range, which is the bit width of a *.jpg pixel anyway. However, I am unlikely to fill the well, so I may end up with only something like six bits, or a dynamic range of a factor of 64 ... not good,

(2) get an older, slower downlink somehow,

(3) perhaps (and I need your advice here), read out the entire frame instead of using the 5x option. Hopefully this would slow the downlink frame rate by as much as a factor of 25 (depending upon the binning and thus the resolution loss)? Trouble is, I'd lose resolution, right?

(4) Simply take a bunch of short exposure *.NEF images and combine them in Registax6. This cycles the mirror and kinda beats up the camera. I hate this option.
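Here is the arithmetic behind the oversampling estimate and option (1), written out explicitly. The 13 significant bits at iso=200 is my assumption for this sensor, not a published figure:

import math

# Oversampling estimate (values assumed from my description above)
burst_frames = 2000
shutter_s = 1 / 10                 # desired integration time
liveview_fps = 100                 # approximate LiveView download rate
dupes = shutter_s * liveview_fps           # ~10 near-identical frames per exposure
independent = burst_frames / dupes         # ~200 genuinely different images
print(f"~{dupes:.0f} duplicates per exposure -> ~{independent:.0f} independent frames")

# Option (1): dynamic-range cost of raising the iso
bits_at_base = 13                  # assumed significant bits at iso 200
iso_base, iso_high = 200, 6400
stops_lost = math.log2(iso_high / iso_base)    # 5 stops
bits_left = bits_at_base - stops_lost          # 8 bits, the width of a *.jpg channel
print(f"iso {iso_high}: lose {stops_lost:.0f} stops, ~{bits_left:.0f} bits remain")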

Your thoughts,

Don


The atmospheric distortion due to seeing varies rapidly and the purpose for taking lots of short images for planetary photography is to allow you to get lucky and capture a few images when the seeing is extraordinarily good. You use those good images and discard the rest. The images do not need to be too bright. You just need the planet to be bright enough that you can stretch it into a pleasing picture. Making it too bright by increasing the ISO or increasing the exposure setting limits how much you will be able to stretch the image once it has been stacked.

You might try increasing the Live View Throttle setting value to slow down the captured frame rate. However, if you slow down the fire hose, I do not believe that it will result in longer exposures. You would just have fewer of them over the same time period.

I do not believe that Registax6 can deal with RAW images. It needs JPGs or an AVI.

 


Thanks for responding,

Planetary Imaging Pre-Processor (freeware) can combine *.NEF (i.e., *.RAW) files into an AVI movie. Still, I heartily dislike doing this because it cycles the camera mirror way, WAY too much.

Changing the iso and "shutter speed" does affect the brightness of the image on the Live View feed. So ... better to have a lower iso balanced with a higher "shutter speed," even if the "shutter speed" is notional: the important thing is to get the brightness of the planet correct.

Right, I want to extend my "movie" over a longer time to better capture the changing seeing (which also oscillates over a few seconds between slightly poorer and better). I agree, basically, there is no way to change the "shutter speed" using the Live View modality, only to space out the frames that are collected.

Thx much,

Don


Hello,

Given our better seeing in the Aug-Oct timeframe here in NorCal, I am once again experimenting with planetary imaging. We had established (see the responses above) that BYN simply records the data stream put out by the Nikon, which oversamples the D5600's LiveView image by roughly an order of magnitude.

To verify this is indeed the case, I tried a different configuration altogether. I elected to control my black & white ASI120mm Mini guide camera with SharpCap and to image the planets that way. For reference, the ASI camera pixel size is 3.75 um, a good match to the D5600 sensor value of 3.89 um. However, the latter of course has a Bayer array, which worsens the per-color angular resolution by roughly a factor of 2.

I imaged both Jupiter and Saturn, and the differences were quite stunning: the guide camera images were much sharper and of higher contrast than those from the Nikon D5600. I would note that the altitude angles of both planets (relevant for differential refraction) were not all that different from last year, and that I used the same 8" Edge HD + 2x barlow combination for both cameras.
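For what it is worth, here is the image-scale arithmetic for that optical train. This is only a rough sketch: it assumes the Edge HD 8's nominal 2032 mm focal length and an exact 2x barlow, and ignores the extra magnification that barlow spacing can add:

# Approximate image scale for both cameras on the same (assumed) optical train.
ARCSEC_SCALE = 206.265             # arcsec/pixel = 206.265 * pixel_um / focal_length_mm

focal_length_mm = 2032 * 2         # 8" Edge HD with a 2x barlow (assumed nominal values)
for name, pixel_um in [("ASI120mm Mini", 3.75), ("Nikon D5600", 3.89)]:
    scale = ARCSEC_SCALE * pixel_um / focal_length_mm
    print(f"{name}: {scale:.2f} arcsec/pixel")

On those assumptions the two cameras land within a few percent of each other, which is why I call the pixel sizes a good match; the Bayer array and the LiveView downsampling are what separate them in practice.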

OK, so BYN only records the data firehose put out by the D5600, and I have an oversampling problem with Live View. In the absence of finding a way to tame Live View, I have three options:

(1) continue to live with glorious black & white images from the guide camera,

(2) buy a dedicated color astrocam (I hate this option), or

(3) is there anyone out there who's been getting great planet images from their D5600 (or similar) Nikon + BYN? If so, please tell me what your camera/Live View settings were.

Thanks,

Don

14 hours ago, Andante_2 said:

For reference, the ASI camera pixel size is 3.75 um, a good match to the D5600 sensor value of 3.89 um. However, the latter of course has a Bayer array, which worsens the color-by-color angular resolution by x2. 

OK, ... I have an oversampling problem with Live View.

You've identified the issue with the not-so-Apples-to-Apples comparison between a Mono Guider/Planetary Camera and a Color DSLR - Resolution.  A much better comparison would have been between your D5600 and the ASI120MC Color Planetary Camera (which would have the same Pixel Resolution).  If you have the resources and the inclination, pursue a good Color Planetary Camera to complement your DSLR.  Otherwise, use the D5600 for Planetary Capture - doing what you can to improve Frame Rate (see above; also confirm a high-quality USB2 cable with nothing competing on the Capture PC) - and capture somewhat longer AVI sessions.


One other issue with your Nikon DSLR is that 5X zoom in LiveView is downsampled to fit on the LCD display. It is this downsampled image that is provided to BYN by the camera and the Nikon SDK.

This would not be the case with any Canon camera with LiveView, where 5X zoom LiveView is at the full resolution of the sensor array, so 1 pixel in the LiveView image is exactly 1 pixel from the sensor.


Hello,

Yes, there are two major problems in using the D5600 for planet imaging: (1) the unavoidable oversampling of the LiveView stream by BYN, due to Nikon's choice not to allow changing the shutter speed manually during LiveView, and (2) the downsampling of pixel resolution in the 5x LiveView. After much experimentation, I've found that barely acceptable images can be obtained, but only under conditions of excellent or near-excellent seeing. Images from my mono guide camera (ASI120mm Mini) have much higher contrast and angular resolution.

One thing left to try: use WinJUPOS S/W to combine the B&W and color images to see if I can't have both color and higher resolution.

Conclusion: the Nikon D5600 (and, by similarity, the D5300 and D5500 models as well) is a poor choice for planet imaging. This is not the fault of BYN in any way.

Happy observing always,

 

Don

 


Don,

Please realize that it isn't so "black & white" (pun intended) that a Color Bayer Matrix Sensor delivers only 1:4 the pixel resolution of an equivalently sized mono sensor.

The Debayer software algorithms use the content of adjacent Bayer cells to "reconstruct" a good bit of the resolution lost by grouping 2x2 pixels together.  This results in a realistic Pixel Resolution of about 1:1.4.  (Source: long-ago discussions amongst the AP Imaging Elite on Cloudy Nights - Google failed me in finding the threads.)

Also, you are misusing the term Oversampling in "the unavoidable oversampling of the LiveView stream by BYN due to Nikon's choice to not allow changing the shutter speed manually during LiveView".  This would be Over-Exposure or Under-Exposure - depending on whether the Frame Rate is Too Fast or Too Slow, respectively.

Oversampling is when the Sensor Pixel Resolution exceeds either the Rayleigh Limit of the Optics or the Seeing Limit of the Atmosphere - resulting in multiple pixels recording the same minimally resolvable detail.  And the solution to Oversampling is usually Pixel Binning (recording a 2x2 set of Pixels as a single "Super-Pixel").

Being concerned that the Sensor Pixel Resolution is insufficient for your Optics means Undersampling.
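In case it is useful, here is what 2x2 Binning looks like in terms of the raw pixel data - a tiny NumPy sketch with a made-up array size:

import numpy as np

# 2x2 binning: every 2x2 block of sensor pixels is combined into one
# "super-pixel", halving the resolution in each direction.
frame = np.arange(8 * 8, dtype=float).reshape(8, 8)   # stand-in for a sensor frame

binned = frame.reshape(4, 2, 4, 2).sum(axis=(1, 3))   # shape (4, 4)
print(frame.shape, "->", binned.shape)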


Hello s3igell,

Thank you for your reply.

Consider for a moment just the "red" pixel array. In this case, we have an array the same spatial size as the detector but with only 1/4 the sampling point density. I don't see how any sort of interpolation "recovers" the angular resolution of the full dense array. I hear you that there was once a CloudyNights thread on this but I need to see the math.
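Just to be concrete about what I mean, here is a toy NumPy sketch (my own example, not the CloudyNights math) of the sparse red grid and the usual bilinear fill-in:

import numpy as np
from scipy.signal import convolve2d

# A smooth "red" test scene, sampled at full resolution for comparison.
y, x = np.mgrid[0:16, 0:16]
scene_red = np.sin(x / 5.0) + np.cos(y / 7.0)

# In an RGGB mosaic, only pixels at (even row, even column) actually record red.
sparse = np.zeros_like(scene_red)
sparse[0::2, 0::2] = scene_red[0::2, 0::2]

# Bilinear reconstruction: each missing red value becomes the weighted average
# of its nearest recorded red neighbours (the standard 3x3 kernel for the R/B planes).
kernel = np.array([[0.25, 0.5, 0.25],
                   [0.5,  1.0, 0.5 ],
                   [0.25, 0.5, 0.25]])
interp = convolve2d(sparse, kernel, mode="same")

err = np.abs(interp - scene_red)[1:-1, 1:-1].mean()   # ignore the border pixels
print(f"mean error on a smooth pattern: {err:.3f}")
# Smooth detail interpolates well; detail finer than the 2-pixel red spacing
# (near the Nyquist limit of the sparse red grid) cannot be recovered this way.

So I can believe interpolation recovers a good bit of the smooth structure; it is the finest detail, right at the sampling limit, that I remain unconvinced about.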

"Oversampling" is a signal processing term. Now that I understand the communications confusion, I will use a more descriptive but longer phrase; something like, "the high rate of sampling of the LiveView stream vs the slower (user-unalterable) shutter speed means that the LiveView stream sends multiple repeats of each image captured by the camera." Hmm, "oversampling" in the signal processing sense doesn't sound so bad after all.

Happy observing always,

Don

 


Don,

s3igell said that De-Bayering recovers "a good bit" of the resolution lost by using the color matrix. Is it as good as shooting with a monochrome camera and LRGB filter wheel? Probably not, but with optimum conditions, skill, and practice one can create some spectacular images with a one-shot color camera.

For a good technical discussion of the term "oversampling" in image sampling I would direct you to "The Handbook of Astronomical Image Processing" by Richard Berry and James Burnell.

The LiveView stream does NOT contain multiple repeats of each image. Each downloaded image was collected and read from the sensor at a slightly different point in time.

The idea of LiveView is to simulate what you see when looking through the viewfinder by taking short duration images and displaying them on the camera's LCD display. If this is done fast enough it appears like natural motion in real time. Each of these frames is only a few milliseconds of exposure and each one is unique. The Nikon Software Development Kit (SDK) library provides the ability to download these to the PC and provides them to BYN as fast as it can. The rate at which new frames become available to BYN is a function of the processing time taken by the camera, the speed of the USB channel to transfer them to the PC, and the speed of the PC and BYN to display them and save them to the PC's storage drive.

In this operating mode, the shutter stays open and the concept of shutter speed is meaningless. The camera still allows you to choose a shutter speed which it uses to control the brightness of the image without changing the frame rate. Basically, for LiveView, the shutter speed selection just becomes like the position of a brightness knob.

When applying LiveView to planetary imaging, the idea of taking lots of short exposures is to try to cheat the blurring that is caused by atmospheric turbulence. It is called "lucky imaging" because seeing changes rapidly, and if you are able to take images fast enough and for a long enough period, a small percentage of frames will be better than the others. Software that is used for processing these images grades them for their sharpness and allows you to select only the best frames to combine into a finished image.
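To make the "grade and keep the best" step concrete, here is a minimal sketch of that kind of selection, using the variance of the Laplacian as a simple sharpness score on an AVI. It uses OpenCV; the file name and the 10% cutoff are placeholders, not values from this thread, and real stacking software such as Registax does considerably more (alignment, multi-point analysis, and so on):

import cv2
import numpy as np

# Score every frame of an AVI by a simple sharpness metric and keep only the
# best fraction - the essence of lucky imaging. "capture.avi" and the 10%
# keep-fraction are placeholders.
cap = cv2.VideoCapture("capture.avi")
scores, frames = [], []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())   # higher = sharper
    frames.append(gray)
cap.release()

keep = max(1, int(0.10 * len(frames)))            # keep the sharpest 10%
best = np.argsort(scores)[::-1][:keep]
print(f"kept {keep} of {len(frames)} frames")
# The kept frames would then be aligned and stacked; a plain mean is shown
# here only for brevity.
stack = np.mean([frames[i] for i in best], axis=0)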

I hope this helps to clear up any confusion.

 
