tag:blogger.com,1999:blog-89498102682378560912024-03-05T13:23:19.321-05:00Enigma To EurekaAnonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.comBlogger34125tag:blogger.com,1999:blog-8949810268237856091.post-78945635585227839872018-10-01T07:57:00.001-04:002018-10-05T14:46:40.326-04:00Powering PANOPTES<div align="left">
<div dir="ltr">
</div>
</div>
<div align="left">
<div dir="ltr" id="docs-internal-guid-229510b3-7fff-8392-8c5f-7ed2d293e2ed" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;">The baseline PANOPTES design used in PAN001 includes three custom circuit boards: two protoboards and a custom PCB. The latter is rather expensive due to the use of 5 solid state relays that cost at least $20 a piece... Plus there is a footprint error in the PCB layout for each of those relays that requires repair by the person soldering the board.</span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span></div>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><img height="832" src="https://lh4.googleusercontent.com/m2gSbk7aY12hbl7_WhpIIv0fVMmrc-1oapTYHxKOvpdi6GYrpqj943SdpPv_Qgsh3tb2nTwebHTCsvfToNPlR6z7mcDkv1rBmH8hVs0_Cl_IBiyrBllXuzIKCsIMbqOxGeEyINoC" style="-webkit-transform: rotate(0.00rad); border: none; transform: rotate(0.00rad);" width="624" /></span></div>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; font-style: italic; vertical-align: baseline; white-space: pre-wrap;">For each type of part we want to add to the layout of a PCB, we need to know the exact position of each pin of that part, what that pin does (e.g. a power input or specific signal) and the area covered by the body and leads/pins of the part. This info, called the footprint, is used when going from the schematic to a layout of parts and traces on a board, and helps avoid having two parts positioned such that they collide when you go to populate the board.</span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;">Luc, an EE at Gemini South in Chile and the designer of the PANOPTES PCBs, has been designing a new PCB, the "Interface Board", two of which will be able to replace the three boards in the scope I built. We had 5 prototype quality boards made early this summer, after which several of us fully or partially populated and tested the boards. 
Aru and Montu, interns at Caltech, along with their mentor Nem, did the heavy lifting of fully building and debugging issues with two copies of the PCB for PAN012, the PANOPTES telescope they built this summer.</span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;">Among the things we learned were that we should have paid for a slightly higher quality of board fabrication: there were some through holes that didn't connect to the appropriate traces. We also discovered that we had the wrong footprint for some parts; the first such issue was with the barrel jacks we use for connecting DC power cables to the board: the pins were in the right place, but the outline of the part was too small. This meant that the body of the barrel jack was sometimes overlapping an adjacent hole for a resistor.</span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;">Based on the feedback that builders provided, Luc has updated the incorrect footprints, and greatly improved the layout of traces on the board. It was fascinating for me, not an EE, to review the updated board, with Luc guiding me through the process via a video call. We remained concerned that the AC current measuring circuit wasn't quite right so haven't yet ordered new boards. 
Instead, Nem and Luc have been testing alternate ways to condition (clean up) the signal from the AC current sensor so that we can feed it to an analog input pin of the Arduino.</span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><img height="271" src="https://lh5.googleusercontent.com/BDn2ywvzQPBRohi5JGuz2YGehPwLZNF7YVaZJXP3bPalWuRvUzgug132RdYWgkPN0RCR8zAhh1sHtho2WtzWboNaCriX53_VTI1Dw5HDBo0MGbO6d3ku1DXn-_bNhKYCZOvk6Hui" style="-webkit-transform: rotate(0.00rad); border: none; transform: rotate(0.00rad);" width="346" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;"><br class="kix-line-break" /></span><span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;">We're also starting to evaluate an alternate uninterruptible power supply that has two very useful features. It reports (by closing a relay) when the AC input fails (i.e. when there is a mains power failure). It also reports when the battery voltage is getting low</span><span style="font-family: "arial"; font-size: 14.6667px; white-space: pre-wrap;"> (also by closing a relay)</span><span style="font-family: "arial"; font-size: 11pt; white-space: pre-wrap;">; this happens as the battery ages, such that after a year or three the battery isn't useful as a backup. This would eliminate the need to have our own AC sensor and signal conditioning, but it reduces the number of UPS devices that we can use (i.e. fewer offer these features). Luc has cleverly come up with a way to use the same jack on the board to connect to the relay lines on this new UPS as well as to connect to the split-core transformer that we are using to detect AC by clamping around a wire carrying an AC current.</span></div>
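The conditioning problem described above (turning the AC waveform from a split-core current transformer into something an Arduino analog pin can read) is commonly handled by biasing the signal to mid-rail and recovering the RMS value in software. The sketch below is only an illustration of that general approach, written in Python rather than Arduino C, with made-up names and values; it is not the circuit or code the team settled on:

```python
import math

def rms_from_adc(samples, vref=5.0, adc_max=1023):
    """Estimate the RMS voltage of an AC signal that was biased to
    mid-rail (vref / 2) before being sampled by a 10-bit ADC."""
    volts = [s * vref / adc_max for s in samples]
    bias = sum(volts) / len(volts)       # recover the DC offset
    ac = [v - bias for v in volts]       # and remove it
    return math.sqrt(sum(v * v for v in ac) / len(ac))

# Simulate one cycle of a 1 V-amplitude sine riding on a 2.5 V bias,
# as it might appear at the analog pin after conditioning.
N = 200
samples = [round((2.5 + math.sin(2 * math.pi * i / N)) / 5.0 * 1023)
           for i in range(N)]
print(rms_from_adc(samples))  # ~0.707, i.e. amplitude / sqrt(2)
```

The RMS value then maps back to amps through the transformer's turns ratio and burden resistor, details that depend on the specific sensor chosen.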
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="font-family: "arial"; font-size: 11pt; vertical-align: baseline; white-space: pre-wrap;">I'm looking forward to seeing this new board in the flesh in the coming weeks. </span></div>
</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com1tag:blogger.com,1999:blog-8949810268237856091.post-58217522450993944852018-08-04T15:51:00.000-04:002018-08-04T15:51:30.105-04:00PANOPTES at RTSRE and iNATS, 2018I recently spent a week in Hilo, Hawaii attending two back to back conferences on behalf of Project PANOPTES:<br />
<ul>
<li>Robotic Telescopes, Student Research and Education (<a href="https://rtsre.net/">RTSRE</a>)</li>
<li>International Astronomy Teaching Summit (<a href="http://www.caperteam.com/astro101summit">iNATS</a>)</li>
</ul>
<div>
Olivier Guyon, Jen Tong and I delivered talks about PANOPTES. Olivier spoke about the historical background of the project, the challenges of using DSLRs for photometry, and the approach used to do the science. I followed with a walk through the process of building a robotic telescope as designed by Olivier and the rest of the PANOPTES team, after which Jen spoke about the application of cloud computing to Project PANOPTES. We wrapped up with a brief Q&A for all 5 PANOPTES team members in attendance (Josh Walawender and Kathy Guyon were supporting us from the audience). Most of the questions were familiar to us, such as:</div>
<div>
<ul>
<li>Why two cameras? Better <a href="https://en.wikipedia.org/wiki/Etendue">étendue</a> for the money. <i>Soon we'll have <a href="https://github.com/panoptes/panoptes.github.io/pull/17">an article</a> on the <a href="https://projectpanoptes.org/">projectpanoptes.org</a> site explaining that.</i></li>
<li>How much does it cost? Under 5,000 USD, including purchasing a few specialized tools that most folks won't have on hand.</li>
<li>Will it work in a city? Honestly, we're not sure yet. We'd love to have someone try, and if it doesn't work the scope could then be relocated to a more rural location.</li>
<li>Can it really work with a DSLR and an 85mm lens? Yes, at least on Mauna Loa. We're still evaluating the data from my scope located in Norton, MA. If that turns out to be too light polluted, I'll look to move it further from the suburban light pollution.</li>
</ul>
</div>
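The étendue argument above can be made concrete with a rough calculation: treating étendue as aperture area times the solid angle the sensor subtends at the lens, a second identical camera-plus-lens pair simply doubles the product, at far less cost than a single optic with twice the throughput. The numbers below (an 85 mm f/1.4 lens and a 22.3 × 14.9 mm APS-C sensor) are illustrative assumptions on my part, not official project figures:

```python
import math

def etendue(aperture_d_mm, focal_mm, sensor_w_mm, sensor_h_mm):
    """Rough étendue (mm^2 * sr): aperture area times the solid angle
    the sensor subtends at the lens, A * (S / f^2)."""
    area = math.pi * (aperture_d_mm / 2) ** 2
    omega = (sensor_w_mm * sensor_h_mm) / focal_mm ** 2
    return area * omega

# One 85 mm f/1.4 lens (aperture diameter = 85 / 1.4 mm) on an APS-C sensor:
one_camera = etendue(85 / 1.4, 85, 22.3, 14.9)
# A second identical camera+lens pair doubles the light-gathering product:
print(one_camera, 2 * one_camera)
```

A commercial telescope of similar cost typically buys more aperture but a much smaller field, so its S/f² term shrinks faster than its aperture area grows.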
<div>
See <a href="https://projectpanoptes.org/faq">the FAQ</a> for many more such questions. Another category of questions was about how to use this for education:</div>
<div>
<ul>
<li>Is the existing data available? Yes, just let us know what you're looking for and we'll provide a pointer. We hope one day to have a decent web interface for self-service, but we're not there yet. If you'd like to help with building that, please speak up!</li>
<li>Can students build the scope? Yes, definitely! The current design was the result of a long effort to make the scope low-cost, weatherproof and easy to build. With the exception of the pier, I built my scope using the type of tools many homeowners have: a hand-held electric drill, a rotary tool (Dremel), a hacksaw, a vice, wire strippers and a soldering iron. These are all tools that a high school FIRST Robotics team would use. And we already have one high school student doing a build as a personal project, rather than as a school activity.</li>
</ul>
<div>
We were lucky enough to have our speaking slot on Monday afternoon, so we had lots of opportunities to talk in the hallways, at meals, etc. about PANOPTES, and also how it relates to the broader topic of astronomy education. We had several parties express serious interest in building a PANOPTES scope, one of whom was interested in trying out a ZWO camera in place of the Canon SL2 that we currently recommend. I'll be working to build those relationships and support them if they build one of the scopes.</div>
<div>
<br /></div>
</div>
<div>
A fun side trip was added at the last minute: a tour of the Mauna Loa observatory, altitude 11,000 feet! This is the home of PAN001, the first instance of the design that I built. Here's a picture of Jen and me alongside PAN001, with Mauna Kea in the background:<br />
<br /></div>
<div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiP41bsQcjojVQGwrclIDl18tk5NLqZk5i5iVwaXizzWC-De_2acLs8WcG-eU7Bp54I66VfWYkRlaQlJYlnqxDRCgZm91XGLzCH2M6y5xHClufLtk1VT2Mt3d1txCFZtBASeJ4xrnISLtDw/s1600/IMG_0990.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiP41bsQcjojVQGwrclIDl18tk5NLqZk5i5iVwaXizzWC-De_2acLs8WcG-eU7Bp54I66VfWYkRlaQlJYlnqxDRCgZm91XGLzCH2M6y5xHClufLtk1VT2Mt3d1txCFZtBASeJ4xrnISLtDw/s400/IMG_0990.jpg" width="400" /></a></div>
<br /></div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-23544912885965373882018-07-08T20:41:00.001-04:002018-07-08T20:41:37.826-04:00Preparing for RTSRE 2018I'm heading to Hilo on the Big Island soon to attend the <a href="https://rtsre.net/">Robotic Telescopes, Student Research and Education Conference</a> (RTSRE), followed by the International Astronomy Teaching Summit (iNATS). I'll be speaking at RTSRE about Project PANOPTES, in a joint hour long session with two other members of the project, with a panel discussion about PANOPTES as our wrap up.<br />
<br />
The focus of my talk is the process of building a PANOPTES robotic telescope, and mentoring other builders (which includes helping to build the PANOPTES community). The main challenge is to keep my portion of our hour to just 15 minutes. Fortunately I have an opportunity to practice my talk at this week's July ATMoB meeting, during which I've been allotted 15 minutes too. Thanks, Glenn!<br />
<br />
Something I've been doing when preparing recent talks is to keep the number of words per slide <b>really</b> low (sometimes zero) and instead emphasize photos and diagrams. I don't want the audience spending their time reading, especially if I'm planning to say the same thing that is on the slides. For example:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcbxK81vx34TMOtzkEMH_8ab5qa9ZhX6232U16yZY91HTjFYBqAkQ82vk4kzfGBEEGvrSPKwA-ogvhqzNV4V6BcKLWUmDTWHbu-dZ7Mwf9X40wclyXisibdPrIKsHVyh00QdpDzthepfy4/s1600/Building+PANOPTES+%25281%2529.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img alt="Custom Telescope Pier" border="0" data-original-height="540" data-original-width="960" height="225" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcbxK81vx34TMOtzkEMH_8ab5qa9ZhX6232U16yZY91HTjFYBqAkQ82vk4kzfGBEEGvrSPKwA-ogvhqzNV4V6BcKLWUmDTWHbu-dZ7Mwf9X40wclyXisibdPrIKsHVyh00QdpDzthepfy4/s400/Building+PANOPTES+%25281%2529.jpg" title="" width="400" /></a></div>
<br />
One thing I like about this is that I can get through a lot more slides if I'm not waiting for the audience to read each slide... or worse, reading the slide aloud myself.<br />
<br />
Wish me luck!Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-57014080016229596572018-06-13T16:25:00.000-04:002018-06-13T16:25:41.722-04:00A Brief PANOPTES UpdateThose of us contributing to Project PANOPTES have been very busy with the project, but this blog and the project's Facebook page may give you the impression that the project is dormant. Social media isn't my thing, much as I wish I were better at it.<br />
<br />
So, what has been going on?<br />
<br />
On the science side, there is an increasing focus on getting the data analysis automated and producing new light curves. The last time one was published was years ago, based on an early prototype telescope and algorithm. That program worked one star at a time, under human direction, which clearly doesn't scale well to thousands or tens of thousands of stars in an image. We've had multiple image processing meetings starting in early May, and Wilfred is focusing on this area as part of his PhD work at Macquarie University in Sydney. A challenge we encounter is that it is difficult to get everybody together (virtually, via Google Hangouts) at the same time due to the time zones we're in: US Eastern, Pacific and Hawaii, plus Australian Eastern. In addition, at least four members of the team are involved in night observing professionally, so they may be sleeping when we meet, which is usually centered around noon in Hawaii. Ah, the glories of international collaborations. 😉<br />
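The scaling problem (measuring thousands of stars per image rather than one at a time under human direction) is usually attacked by vectorizing the photometry. The toy sketch below is not the project's actual pipeline, and all names are mine; it just illustrates summing a circular aperture around every star position at once with NumPy broadcasting:

```python
import numpy as np

def aperture_fluxes(image, xs, ys, r=3.0):
    """Sum the pixels within radius r of each star position (x, y),
    for all stars at once via NumPy broadcasting."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Shape (n_stars, h, w): squared distance of every pixel to every star.
    d2 = ((xx[None] - np.asarray(xs)[:, None, None]) ** 2
          + (yy[None] - np.asarray(ys)[:, None, None]) ** 2)
    return (image[None] * (d2 <= r * r)).sum(axis=(1, 2))

# Tiny synthetic frame: two "stars" of known total flux on a dark sky.
img = np.zeros((32, 32))
img[10, 10] = 100.0   # star 1 at (x=10, y=10)
img[20, 25] = 50.0    # star 2 at (x=25, y=20)
print(aperture_fluxes(img, xs=[10, 25], ys=[10, 20]))  # -> [100.  50.]
```

For tens of thousands of stars on a full DSLR frame, one full-frame mask per star is memory-hungry; real pipelines work on small cutouts around each star and also subtract a local background estimate.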
<br />
On the community and communication front, we've successfully launched a Discourse forum at <a href="https://forum.projectpanoptes.org/">https://forum.projectpanoptes.org/</a>. For several years the primary means of communication within the team has been Slack, but we needed publicly visible discussions, which makes it easier for more folks to see what is going on, and for search engines to find and index the content. I've been very pleased to see that we've started to have questions answered by non-core team members, a vital step to growing the community.<br />
<br />
One critique the team has received is that the instructions for building the telescope are scattered and incomplete (I know this all too well, having made my way to the ends of the written instructions, and then had to figure a bunch out, or pester busy folks to fill in the gaps I couldn't fathom). We've launched an effort to bring all those scattered instructions together into a <a href="https://docs.google.com/document/d/1lq0tqZupyAKxqMXlUmHHjXfBubg8RHyh7BRLj839f1Q/edit">single document</a>; we're most of the way through merging and cleaning up the existing contents, and have identified many gaps to be filled, but filling those will be a challenge that may take us some months to achieve. Volunteers are welcome!<br />
<br />
Regarding face-to-face outreach, here are some of the events with a PANOPTES presence:<br />
<br />
<b>March</b>: Wilfred presented <a href="https://www.socallinuxexpo.org/scale/16x/speakers/wilfred-gee">PANOPTES at SCALE</a> in Pasadena, CA.<br />
<br />
<b>April</b>: Josh, Joe, Sean and I attended <a href="http://www.rocklandastronomy.com/NEAF">NEAF</a> in Suffern, NY, where we manned a PANOPTES booth, and where Josh twice delivered a talk about PANOPTES.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUy8xZNgnkW06GP649ZgB4klg6DZLlh-dKOFqtlKscCaNQEzBK76cM1T3gVfuHEPDfhM-YNeSIVKg4NZKS0YfoTIin695VKDUpFXDqq1yAKNhSzrkYS-HO6yW7sDVRsOx1b6lK6c21_kU-/s1600/Image+uploaded+from+iOS.jpg" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUy8xZNgnkW06GP649ZgB4klg6DZLlh-dKOFqtlKscCaNQEzBK76cM1T3gVfuHEPDfhM-YNeSIVKg4NZKS0YfoTIin695VKDUpFXDqq1yAKNhSzrkYS-HO6yW7sDVRsOx1b6lK6c21_kU-/s320/Image+uploaded+from+iOS.jpg" /></a><br />
<br />
<b>May</b>: Jen, Wilfred and I staffed a PANOPTES exhibit at <a href="https://events.google.com/io/">Google I/O</a>, designed by Jen, featuring a transit demo that drew in many visitors, as did a partially completed PANOPTES robotic telescope.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-F-i3ODXEDexZLfFOn9j5krQp8nWw9ASd9fYHLERvehyphenhyphenUy9gNHkwaKtsjt-5h3JT7Q-FWukipptpFkdK8kY5DdsOf5Z6TbZsXmrHcl6r00qHkBdZ1OixxJRVYH3Dq_2kZ6vTGlJui3Y6n/s1600/IMG_20180508_134941.jpg" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-F-i3ODXEDexZLfFOn9j5krQp8nWw9ASd9fYHLERvehyphenhyphenUy9gNHkwaKtsjt-5h3JT7Q-FWukipptpFkdK8kY5DdsOf5Z6TbZsXmrHcl6r00qHkBdZ1OixxJRVYH3Dq_2kZ6vTGlJui3Y6n/s320/IMG_20180508_134941.jpg" /></a></td></tr>
<tr><td class="tr-caption" style="font-size: 12.8px;">Transit demo, with webcam in foreground at right, star and planet at bottom center,<br />
and screen showing light curve at the top.</td></tr>
</tbody></table>
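A transit demo like the one pictured boils down to a light curve that dips while the planet crosses the star. A toy box-shaped model (purely illustrative; this is not the demo's actual code, and the parameter values are invented) can be written as:

```python
def box_transit(t, period=3.0, t0=1.5, duration=0.2, depth=0.01):
    """Relative brightness of a star over time: flux drops by `depth`
    while the planet is in front of the stellar disk."""
    phase = (t - t0 + period / 2) % period - period / 2
    return 1.0 - depth if abs(phase) < duration / 2 else 1.0

# Sample three "days" at 0.01-day steps; the dip appears around t = 1.5.
curve = [box_transit(i * 0.01) for i in range(300)]
print(min(curve), max(curve))  # dips to ~0.99 in transit, 1.0 otherwise
```

A one-percent depth is roughly what a Jupiter-sized planet produces on a Sun-like star, which is why detecting it with DSLRs is such a careful-calibration problem.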
<b>June</b>: Olivier manned the PANOPTES exhibit at <a href="https://explore.jpl.nasa.gov/">Explore JPL</a>, featuring the JPL-made copy of the PANOPTES design. Quite a challenge to speak to thousands of attendees solo! And this week there are two events with a PANOPTES presence: <a href="https://spie.org/Documents/ConferencesExhibitions/AS18-final-L.pdf">SPIE in Austin</a> has 4 of the original team members (Olivier, Nem, Luc and Wilfred) attending, with Wilfred giving a presentation, participating in a poster session, and holding a meet 'n greet; and the first two builders after the original team, Doug and Christina, will be at <a href="http://www.socastrosci.org/Symposium.html">SAS</a> in Ontario, CA, where they'll have a PANOPTES poster on display.<br />
<br />
Coming up in <b>July</b>, several of us will attend and speak at <a href="https://rtsre.net/">RTSRE</a>/<a href="http://www.caperteam.com/2018hilo/">iNATS</a> in Hilo, Hawaii (active volcano territory!). Our plans are for Olivier to kick things off with an intro to Project PANOPTES; I'll follow up with an experience report about building a telescope; then we'll have a talk about processing the collected data in the cloud; and finally we'll wrap up with an open panel discussion about PANOPTES.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-80173906104044556522018-03-10T08:04:00.001-05:002018-03-10T08:04:01.133-05:00Envisioning a New Space Age at the MIT Media Lab<p dir="ltr">Today I'll be attending a <a href="https://www.media.mit.edu/events/beyond-the-cradle-2018/">conference</a> at MIT that explores the idea of a future in which the maker ethos extends to space, with hackable satellites, etc. I'm particularly interested in the idea of DIY experiments. With SpaceX and others drastically reducing launch costs, perhaps <a href="http://www.atmob.org">ATMoB</a> members can join forces to create a CubeSat with a camera and radio transmitter.</p>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-34688547530841561732018-03-08T15:51:00.000-05:002018-03-08T15:51:13.217-05:00Cost Reductions for Project PANOPTESI've been so focused on getting my scope (PAN006) installed, debugged and operating automatically that I've been remiss in providing written feedback on the build instructions. I've now taken care of doing so, and am engaged with developing a new parts list for the project, with some updates to the design (primarily simplifications). I'm really pleased that I've been able to drive down the price of the scope. It was nominally $5000, but the reality was that amount didn't include several hundred dollars of electronic parts and small hardware items. I'm not done with the list, but it looks like the cost will be under $4800 for the complete scope when I'm done ... as long as I don't add more than $250 more to the list. Among the options we're still considering are:<br />
<ul>
<li>Using a smaller backup battery; PAN006 ran for 2.5 hours on the current 12V 12AH battery in the Nor'easter storm on March 2nd, so a 5AH battery might be sufficient</li>
<li>Reducing the size of the custom pier's top plate from 8" x 8" to 6" x 6", which requires using smaller extrusions, but helps by making a collision between the mount and the pier less likely, which helps with...</li>
<li>Increasing the size of the camera box: the previous box was just big enough for the Canon SL1, the smallest DSLR, and that camera is discontinued. Replacement cameras are all bigger.</li>
<li>Reducing the size of the control electronics box (maybe). The Pelican case is pretty expensive (around $175), and we might be able to save $50 with a smaller case, though space in the case might be a bit tight.</li>
</ul>
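The battery question in the list above lends itself to back-of-envelope arithmetic: 2.5 hours on a 12 Ah battery implies an average draw of about 4.8 A, at which a 5 Ah battery would ride out roughly an hour, possibly enough for typical short outages. A minimal sketch of that calculation (the idealization is mine; real lead-acid runtimes are shorter once depth-of-discharge limits and aging are considered):

```python
def runtime_hours(capacity_ah, load_a):
    """Idealized runtime: amp-hour capacity divided by average draw.
    Ignores depth-of-discharge limits and battery aging, both of which
    shorten real lead-acid runtimes."""
    return capacity_ah / load_a

# Observed: a 12 Ah battery lasted about 2.5 h, implying the average draw:
load = 12 / 2.5                    # 4.8 A
print(runtime_hours(5, load))      # a 5 Ah battery: roughly 1 h
```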
<div>
All of those savings are pretty important because the Canon SL2, the obvious replacement for the SL1, is more expensive. We'd budgeted $399 for the SL1, but the SL2 is listed as $549 at major online retailers. So, we've got a $150 bump coming to the price.</div>
<div>
<br /></div>
<div>
Fortunately, there are plenty of lightly used SL1 cameras for sale in the used market, but those are not available to all builders: the schools working on the 1,2,3 STARS virtual robotics exchange can't buy from eBay, so need a "real" storefront (online or physical) to sell them the cameras they need. I was pleased to find KEH Camera yesterday, which has reasonable prices on SL1's in good condition; the only problem is that the 'grade' of the camera is based on cosmetics, not the shutter count. Sigh.</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-80530753669943796692018-03-08T15:48:00.001-05:002018-03-08T15:48:12.431-05:00US-Italy Virtual Robotics ExchangeU.S. Department of State Bureau of Educational and Cultural Affairs -- there's a mouthful -- (ECA) helps to coordinate educational exchanges. In 2016 ECA created a new initiative called The Collaboratory, which is now helping to coordinate a virtual robotics exchange between 3 high schools in the U.S. and one in Pistoia, Italy, an effort tentatively called 1,2,3 STAR by some of the students. Each school will build a robotic telescope as designed by Project PANOPTES, and I'll be serving as a mentor/technical advisor on the effort. You can read more about the exchange on the <a href="https://it.usembassy.gov/lamerican-corner-youlab-pistoia-ha-presentato-al-pubblico-il-progetto-internazionale-collaboratory-123-star/"><span id="goog_1292242535"></span>US Embassy to Italy's website<span id="goog_1292242536"></span></a> and <a href="https://www.facebook.com/32706176871/posts/10155485541881872">facebook page</a>.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-22656184518888359962018-01-17T12:45:00.003-05:002018-01-17T12:58:12.606-05:00What is in a PANOPTES robotic telescope?As I was working to build PAN006, my version of the PANOPTES robotic telescope design, I realized I had a bunch of questions about how all of the parts work together. To that end, I've written a page with what I've learned, <a href="http://www.projectpanoptes.org/panoptes_baseline_unit_description">PANOPTES Baseline System Description</a>, which has now been added to the <a href="http://www.projectpanoptes.org/">Project PANOPTES website</a>. I hope it helps others.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPDReUjj9ua8kpLbbZOJ4Wo8YGDp2lGLQnggc4i9YSj2bkA2dF_TNlxGzomWCvq5bQgKCjc7lTimXSBTTlLrbAcANDqCtXwkKrezDadL_W3mUGEO9k5JP9qM7uDqLElI9ntsukE7DwfaYP/s1600/pan001_with_labelled_components.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="577" data-original-width="746" height="492" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPDReUjj9ua8kpLbbZOJ4Wo8YGDp2lGLQnggc4i9YSj2bkA2dF_TNlxGzomWCvq5bQgKCjc7lTimXSBTTlLrbAcANDqCtXwkKrezDadL_W3mUGEO9k5JP9qM7uDqLElI9ntsukE7DwfaYP/s640/pan001_with_labelled_components.png" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-65300822477222011622017-11-15T17:05:00.000-05:002017-11-16T14:18:55.128-05:00Remote login is not the same as local loginIt's now been just over a month since I installed my PANOPTES exoplanet survey telescope in dome 7 of the Mars Center for Science and Technology at Wheaton College. I had hoped it would be operational by now, but sadly we're not there yet. This is <strike>my tale of woe</strike> what I've learned along the way. My apologies for the length of this post; I needed to get it off my chest! There is a brief summary at the end if you want to cut to the chase.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLO6PT1vFB9ux-SYRSt6GCLjjJMb-l1EdIgzlRHZ4Fa2mUVASUVlg44snYjN9sHtEZf_kQP2TWqUxtRp5I6-nd4Yaq9i5oA0mZBtZcrsq6znptixunFSgSO9ZHMadLj3ckA7HysyBYtRs/s1600/G0019449.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1356" data-original-width="1338" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLO6PT1vFB9ux-SYRSt6GCLjjJMb-l1EdIgzlRHZ4Fa2mUVASUVlg44snYjN9sHtEZf_kQP2TWqUxtRp5I6-nd4Yaq9i5oA0mZBtZcrsq6znptixunFSgSO9ZHMadLj3ckA7HysyBYtRs/s320/G0019449.JPG" width="315" /></a></div>
<br />
For context, I signed on in April, 2017 as a "beta test builder", meaning that I was testing the instructions for building a PANOPTES baseline unit, an automated telescope, including building 3 electronics boards, fabricating mounting plates for a dual camera telescope, water-proofing an equatorial mount, creating a pier to which the mount is secured, etc.<br />
<br />
The design includes a small computer (Intel NUC) running Ubuntu Linux as its operating system; the telescope is managed by software (POCS, PANOPTES Observatory Control System) running on the NUC. The instructions written to date don't go into much detail on how to set up the NUC or run the software, so I've been blazing a trail, especially w.r.t. Linux configuration, which is the focus of this post.<br />
<h3>
Planning Ahead</h3>
<div>
I knew that once the scope was installed at Wheaton, it would be behind a firewall, preventing me from directly accessing it via SSH. In time I would be able to get VPN access, but until then it would be vital to have some way to access the NUC, so I set up Chrome Remote Desktop (CRD) on the NUC. This would enable me to log in to the NUC from home, so long as the NUC and my laptop each had access to the internet. I tested it at home, where the NUC was connected to my home Wi-Fi system.</div>
<h3>
You can't get there from here</h3>
When I planned to go to Stellafane with my PANOPTES scope, I thought that I might want to log in to its computer from my laptop, but knew that there wasn't much Wi-Fi at Stellafane, so I modified the wired Ethernet settings of both my laptop and the NUC so that they had fixed addresses 10.0.0.1 and 10.0.0.2, with the intent of being able to run an Ethernet cable directly between the two.<br />
<br />
<i>(Note: I've since learned that Ubuntu, and Linux in general, supports networks without routers or gateways via link-local addressing, where each computer picks its own address and can discover others on the network.)</i><br />
<br />
By the time I went to install the scope at Wheaton on October 7th, I'd forgotten about the fixed address. When the NUC was connected to the college's wired network, it didn't appear on the network. This also meant that it didn't have access to the internet, so CRD wouldn't work either.<br />
<br />
We (Sean & Joe, Wheaton sophomores, and I) tried a number of things to figure out why it wasn't working, including monitoring the traffic on the local Ethernet segment to see if we could identify the address it was using. I eventually remembered the fixed address, and needed to find a way to log in to the NUC to remove it. Unfortunately, I'd left the laptop that has an Ethernet port at home, so I set up my phone with a Wi-Fi hotspot with the same SSID and password as my home Wi-Fi, thus providing a path to the Internet. CRD was finally working.<br />
<h3>
You can't change that from here</h3>
<div>
I logged into the NUC, and opened the Network Settings, with the intent of removing the fixed address. To my surprise, the dialog buttons for editing the IPv4 settings were disabled. I thought perhaps I'd used some command-line tool or config file to "hard-code" the address of the Ethernet connection, but I could find no evidence of that. We poked and prodded for quite a while before I gave up, asking Joe to disconnect the NUC from all of its cables and take it out of the control box, with the plan of setting it up at home and debugging the problem "at leisure"... if you could call it that.</div>
<h3>
You can't connect this to that</h3>
<div>
As I was driving the 45 miles from Wheaton to home, it suddenly occurred to me that I'd asked Joe to disconnect one too many cables: the NUC has a mini-HDMI jack, while my monitors use full-size HDMI connectors, so I had an adapter cable plugged into the NUC. This would allow for dragging a monitor, keyboard and mouse out to the dome if absolutely necessary. So, I ordered a pair of adapter cables from Amazon (more time-efficient than driving 90 miles).</div>
<br />
Once home I set up the NUC in the loft above my garage, where it connected to the Google Wi-Fi point in the same room. Once I had some free time, I sat in the family room and started up CRD on my laptop, and selected the NUC as the computer to connect to. The wait cursor spun for a while, then I got an error message saying that it couldn't connect because of some problem with the network I was using. WTF. I'd tested this previously; why wasn't it working now? I was so frustrated. I figured that I'd have to wait for those adapters to arrive in a few days.<br />
<br />
When I was at work that week, I took the problem of removing the fixed address to the folks at my local Tech Stop (Google's internal help desk, among other responsibilities), to see if they had any ideas. I was asked if I could log in to it remotely; well, that seemed unlikely, but to my surprise, I could log in to the NUC using CRD. Wow, how was that working? Well, let's hold that thought and deal with the fixed address...<br />
<br />
It still wasn't at all clear why the Network Settings buttons for changing things were greyed out, but Frank at Tech Stop had the idea of launching the dialog from the command-line, where we could use <span style="font-family: "courier new" , "courier" , monospace;">sudo</span> to ensure it was running with elevated permissions. Thankfully, this worked, and he was able to remove the problematic link, leaving the NUC with a wired internet connection that would use DHCP to get an address. Phew. We still didn't know why the buttons had been disabled, but at least the NUC would be able to connect to the Wheaton network.<br />
<h3>
You can't connect to that from here</h3>
Around that same time I bought a pair of Chromecast devices (I had the first generation, and wanted to upgrade). Like the rest of the computing devices in the house, they get internet access via our three Google Wi-Fi points (what they call the Wi-Fi access points/extenders). There is a single primary point in the basement, providing Wi-Fi and wired Ethernet to the rest of the house, and two other points, one upstairs in a bedroom, the other in the loft near the NUC. I set up the first Chromecast in the family room, which went smoothly (though I do wish they would install software updates in the middle of the night, not immediately after network setup, just when I want to try using my new toy).<br />
<br />
My experience with the second Chromecast wasn't so smooth. It showed a code on the screen, and the Google Home app had no trouble finding the Chromecast to be set up, showed me the correct code (the same one displayed on the TV), but then the process stalled. Somehow, the app wasn't able to talk over the home network to the Chromecast. WTF! The other setup was so smooth. I tried many times, but the app just couldn't connect over Wi-Fi to the device right in front of me, with the Google Wi-Fi right next to the TV. I hate these sorts of random failures, so I left it for a while.<br />
<br />
Eventually, the fact that I could connect to the NUC from work but not from my family room led me to wonder about network topology and connectivity. Could it be that the Google Wi-Fi points weren't providing complete connectivity within the house?<br />
<br />
I decided to move the working Chromecast to the loft. It worked there. I moved the non-working Chromecast to the family room, and again tried to set it up. This worked. And continued working when I swapped the location of the two Chromecasts.<br />
<br />
I may have isolated the problem. I tried wandering around the house, disabling and re-enabling Wi-Fi on the device with me, so that it would connect to the nearby Google Wi-Fi point with a strong signal. I found that if the initiating device is connected to the primary access point (as my phone was when I tried to set up the Chromecast), and the receiving device is connected to a mesh point (i.e. a Wi-Fi extender), then there isn't a path between the two devices. This may not even be a general situation; it may only apply to trying to convert a device-&gt;Google-&gt;device connection into a device-&gt;device connection (something to do with the STUN protocol, at a guess).<br />
<h3>
You Can't Do That From There</h3>
Once the NUC was reinstalled at Wheaton on October 14th, it joined the campus network and CRD was working. I was really relieved.<br />
<div>
<br /></div>
<div>
Now we needed to prepare for polar alignment, for which we're going to need the cameras working and in focus. So, let's take a picture using the command-line. The first thing I did was to simply try to enumerate the cameras (<span style="font-family: "courier new" , "courier" , monospace;">gphoto2 --auto-detect</span>). Nada. The cameras were missing. OK, a trip out to the dome to make sure they had power and that the USB cables were plugged in. Everything looked fine.</div>
<br />
<span style="font-family: "courier new" , "courier" , monospace;">lsusb</span> showed that the cameras were in fact visible to the NUC, but somehow we couldn't interact with them. This had worked fine at home; why not now? Various prodding turned up no sign of the cause. After a while, we called it a night, and I went home to <strike>stew</strike> study the problem.<br />
<br />
Again using CRD, I enabled debug logging in gphoto2, which showed it considering all sorts of USB devices as possible cameras, including the Arduinos, but not the actual cameras. I could see them with lsusb and get their properties, but they were invisible to gphoto2. But why?<br />
<br />
At some point I checked the permissions on the USB devices for the cameras (e.g. <span style="font-family: "courier new" , "courier" , monospace;">/dev/bus/usb/002/007</span>), and discovered that they were owned by group <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span>, while all the other USB devices were owned by <span style="font-family: "courier new" , "courier" , monospace;">root</span>. Why would that be? Why would they not all be the same? Why had I been able to take pictures previously but not now?<br />
<h3>
You need permission to do that from there</h3>
<div>
I started using BSD Unix in the mid-1980s, at a time when user permissions were pretty simple. A user was represented by a line in the <span style="font-family: "courier new" , "courier" , monospace;">passwd</span> file, and adding a user to a group simply meant adding the user name to the list of members of that group, part of a single line of text in the <span style="font-family: "courier new" , "courier" , monospace;">group</span> file. Things have evolved considerably since then. While working on this <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span> question, I learned that Linux has a feature called PAM, Pluggable Authentication Modules, whereby system administrators can configure the behavior of APIs used by <span style="font-family: "courier new" , "courier" , monospace;">login</span>, <span style="font-family: "courier new" , "courier" , monospace;">su</span> and <span style="font-family: "courier new" , "courier" , monospace;">sudo</span>, among others. This includes modifying the set of groups of which a logged-in user is a member (see <span style="font-family: "courier new" , "courier" , monospace;">pam_group</span> for more info).</div>
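To make that concrete, here is the general shape of a rule in /etc/security/group.conf, the pam_group module's config file. This entry is hypothetical (I made it up as an illustration; it is not copied from the NUC), but the field order is services; ttys; users; times; groups:

```
# Hypothetical /etc/security/group.conf entry: grant the user 'panoptes',
# when logging in on a local virtual terminal at any time of day
# (Al0000-2400), session membership in the plugdev group.
login; tty*; panoptes; Al0000-2400; plugdev
```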
<div>
<br /></div>
<div>
It appears that on single-user Linux installations (e.g. for a workstation or laptop, as opposed to a server), the default PAM configuration grants membership in the <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span> group (and some other groups) to the locally logged-in user (really, the session). The idea is that the physically present user may temporarily connect devices such as a camera, and should of course be able to access them, while by default preventing a remotely logged-in user from messing with such devices.</div>
<div>
<br /></div>
<div>
I later realized this same thinking applied to the problem with the Network Settings dialog: when I logged in locally, my session was added to an admin group, thus enabling me to create the connection object with the fixed address. But later, when logged in with CRD, my session wasn't a member of this group, and thus by default it wasn't allowed to edit those settings; that's why we had to launch the network configuration program with elevated permissions, via <span style="font-family: "courier new" , "courier" , monospace;">sudo nm-connection-editor</span>.</div>
<h3>
Go on, give yourself permission</h3>
<div>
While we installed the desktop version of Ubuntu, the PANOPTES situation is different: the NUC is operated like a server, and the same applies when using CRD to log in to your personal machine from a remote location. So, what to do? I needed to permanently add my account to the <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span> group, but how? A search soon turned up a page with lots of examples of how to modify a user's account using the <span style="font-family: "courier new" , "courier" , monospace;">usermod</span> command, including modifying the groups of a user:</div>
<blockquote class="tr_bq">
<span style="font-family: "courier new" , "courier" , monospace;">sudo usermod -G &lt;groups&gt; &lt;user&gt; # DON'T COPY THIS</span></blockquote>
<div>
So I made the change. Since one needs to log out and back in again for this change to take effect, and the NUC is single user, I just rebooted the system. After waiting a minute I tried to log back in again with CRD. No response. Whoops! What happened? That required a bit of study, study I clearly should have done before I ran that command.</div>
<div>
<br /></div>
<div>
It turns out that <span style="font-family: "courier new" , "courier" , monospace;">usermod -G</span> <b>replaces</b> the list of groups that the user is a member of. <span style="font-family: "courier new" , "courier" , monospace;">usermod</span> takes <span style="font-family: "courier new" , "courier" , monospace;">-a</span> as an option to indicate that the command should append to the list. Argh. That command was too powerful. But why couldn't I login?</div>
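For the record, here's a sketch of the safe form; the group and user names are the ones from this post, and the privileged lines are comments since they require root:

```shell
# Safe: -a (append) together with -G keeps the user's existing
# secondary groups and adds the new one:
#   sudo usermod -aG plugdev panoptes
# Dangerous: without -a, the whole secondary group list is REPLACED:
#   sudo usermod -G plugdev panoptes
# Either way, verify afterwards (the change takes effect at next login);
# this prints the current user's group names:
id -nG
```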
<div>
<br /></div>
<div>
Since I didn't yet have VPN access, I asked Sean and Joe to help out. Sean logged in via ssh and reported back on the set of groups that the user <span style="font-family: "courier new" , "courier" , monospace;">panoptes</span> was in. As expected, it was only in the groups <span style="font-family: "courier new" , "courier" , monospace;">panoptes</span> and <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span> <i>(note: by default, when a user is created, a group of the same name is created and becomes the user's primary group.)</i> And we no longer had the ability to run <span style="font-family: "courier new" , "courier" , monospace;">sudo</span>, so couldn't fix that user.</div>
<div>
<br /></div>
<div>
Fortunately we had another user, <span style="font-family: "courier new" , "courier" , monospace;">james</span>, the first created on the system. But Sean couldn't log in as user james. It took days for me to remember that I'd changed the initial weak password to something a bit stronger; thank goodness I'd recorded it in LastPass.</div>
<div>
<br /></div>
<div>
<i>(Note: by default, the first user created on Ubuntu has greater privileges, i.e. is a member of additional groups, on the assumption that this is the "owner" of the machine.)</i></div>
<div>
<br /></div>
<div>
Sean was able to add user <span style="font-family: "courier new" , "courier" , monospace;">panoptes</span> to the groups that user <span style="font-family: "courier new" , "courier" , monospace;">james</span> was a member of, after which user <span style="font-family: "courier new" , "courier" , monospace;">panoptes</span> was able to <span style="font-family: "courier new" , "courier" , monospace;">sudo</span>. Phew. But after a reboot CRD was still not working. Sigh.</div>
<h3>
You should dig a tunnel from here to there</h3>
<div>
With Chrome Remote Desktop not working, I finally asked Prof. Maitra at Wheaton about VPN access so that I could use ssh to login from home. To the surprise of both of us, the procedure was quick and easy. I soon had the VPN software running on a Chromebook and was able to login to the NUC.</div>
<div>
<br /></div>
<div>
Poking around, it wasn't immediately obvious why CRD wasn't responding when I tried to connect. I could see that the service had started, but whatever needed to happen to enable user <span style="font-family: "courier new" , "courier" , monospace;">panoptes</span> wasn't happening. Fortunately CRD isn't entirely opaque, but instead uses shell scripts to configure the service. Reading those scripts I discovered that it was looking for users that were members of the group <span style="font-family: "courier new" , "courier" , monospace;">chrome-remote-desktop</span>. So, another group to which I needed to add the user. After doing that, and restarting the service, I was finally able to use CRD to log in to the NUC.</div>
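The check those scripts do boils down to reading the members of that group. A minimal sketch of pulling the member list out of an /etc/group-style record (the sample line below is my invention; the group ID is made up):

```shell
# An /etc/group record has the form name:password:gid:member-list
line='chrome-remote-desktop:x:115:panoptes,james'
# The member list is everything after the last colon:
members=${line##*:}
echo "$members"   # -> panoptes,james
```

On a live system, `getent group chrome-remote-desktop` gives the same record directly.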
<h3>
Summary</h3>
<ul>
<li>A local login session <i>may</i> have different privileges than a remote session, such as being added to the group <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span>. Whether it does depends on the PAM configuration.</li>
<li>When a USB device is connected, the virtual device representing it may have its configuration, including ownership, changed by rules in /etc/udev. For example, installing gphoto2 adds rules that set the group for camera devices to be <span style="font-family: "courier new" , "courier" , monospace;">plugdev</span>.</li>
<li><span style="font-family: "courier new" , "courier" , monospace;">sudo usermod -G</span> is dangerous! It removes the user from its existing secondary groups. Use option <span style="font-family: "courier new" , "courier" , monospace;">-a</span> to instead add the user to additional secondary groups.</li>
<li>Make sure you have more than one privileged user on your machine, and keep track of the passwords, else you may find yourself locked out. <i>(It is possible to boot into a root shell, from where you can do anything, but that requires you be present, and have a keyboard and monitor hooked up. Not so convenient for embedded or remote systems.)</i></li>
<li>Chrome Remote Desktop works great, except when you kill it.</li>
</ul>
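To make the udev point concrete, here's the general shape of such a rule. This example is illustrative (I haven't copied it from gphoto2's actual rules file); 04a9 is Canon's USB vendor ID:

```
# An illustrative rule in the style of those in /lib/udev/rules.d/:
# device nodes for USB devices from this vendor get group plugdev,
# readable and writable by owner and group.
SUBSYSTEM=="usb", ATTR{idVendor}=="04a9", GROUP="plugdev", MODE="0660"
```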
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com2tag:blogger.com,1999:blog-8949810268237856091.post-65445183522954515372017-10-02T15:22:00.001-04:002017-10-02T16:28:04.412-04:009-DOF Sensor Instead of Absolute Encoders?<div dir="ltr">
The PANOPTES baseline design uses an iOptron mount, an iEQ30 or an iEQ45. These do not have absolute encoders, so they can't always determine the position of the two axes. Normally they'll remember where they were when last used, but if the user has loosened the clutch to adjust things, the scope has collided with the mount, there was a failure to save state during power down, or some such, then the mount won't have accurate info, and needs to be manually homed: aimed at the celestial pole. This runs contrary to the goal of a robotic telescope: it should require as little human assistance as we can reasonably achieve.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
We've experimented with including a 3-axis accelerometer in the camera box, a sensor that allows us to determine which way is down when the camera is not moving. This is pretty useful, but it can't distinguish, for example, between pointing down on the east side of the mount and on the west side of the mount. Basically, for each position on one side of the mount, there is a symmetric location on the other side where the sensor output (i.e. which way is down) is the same.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I recently came across an interesting little breakout board for an <a href="https://www.invensense.com/products/motion-tracking/9-axis/mpu-9250/">MPU-9250</a>, a device with three 3-axis sensors: an accelerometer, a gyroscope and a magnetometer (i.e. a compass).</div>
<div dir="ltr">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://cdn.sparkfun.com/assets/8/b/b/4/5/9DOF-3axes.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="273" data-original-width="779" height="112" src="https://cdn.sparkfun.com/assets/8/b/b/4/5/9DOF-3axes.png" width="320" /></a></div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
While I can't think of a way to use a gyroscope in this application, the magnetometer may allow us to distinguish different orientations of the camera box w.r.t. the magnetic field lines of the earth. Of course, that assumes that the local conditions, including the electronics and all the nearby metal in the mount, cameras and lenses, don't interfere. But interference is pretty likely.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Fortunately, once the mount is aligned to the celestial pole, there are only two degrees of freedom: declination and right ascension. So not all combinations of down and north are valid: for any one direction for down, only two compass values make sense, and they are in opposite directions. So, even if the compass resolution is pretty poor, it may still be possible to distinguish between the two sides of the mount.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
A complication with using the <a href="https://banggood.com/MPU9250BMP280-10DOF-GY-91-Acceleration-Gyroscope-Compass-Nine-Shaft-Sensor-Module-For-Arduino-p-1100982.html">breakout board</a> is that the sensor is a 3.3V device, while the Arduino Micro that we use runs on 5V. The exceedingly minimal documentation for the board says it has a voltage regulator for the power line, but says nothing about level shifting the signal lines. So, I may destroy the thing just hooking it up to the Micro. Sigh.</div>
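If the signal lines do turn out to need shifting, the usual cheap fix for lines going from the Micro to the sensor is a resistor divider on each one. A back-of-envelope check, with resistor values that are my illustration rather than anything from the board:

```shell
# 5 V from the Micro through R1 = 1 kΩ, with R2 = 2 kΩ to ground; the
# sensor input taps the junction, so Vout = Vin * R2 / (R1 + R2).
awk 'BEGIN { v = 5.0; r1 = 1000; r2 = 2000; printf "Vout = %.2f V\n", v * r2 / (r1 + r2) }'
```

That lands comfortably inside a 3.3 V input's tolerance; the sensor's 3.3 V outputs are a separate question for the Micro's input thresholds.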
<div dir="ltr">
<br /></div>
<div dir="ltr">
If anyone has pointers to previous work on this, please do share! Thanks.</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-50932129779646049042017-08-21T12:00:00.000-04:002017-08-28T13:07:49.096-04:00PANOPTES at Stellafane and Beyond<br class="Apple-interchange-newline" />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgPKgI-qv8BDW03w5xVMj9rtcPt5QrFtq3KyQnEB-SRfsDVZxCFfuLP9F0aaCgLmyLFMB7ygPdEgyRPRyh6I0l5jA9HMn4DiSvdjVwYE412YSspEp9PgRrKAuBZC5nbCfYHus61rDJRLOV8/s1600/DSC05972.JPG" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgPKgI-qv8BDW03w5xVMj9rtcPt5QrFtq3KyQnEB-SRfsDVZxCFfuLP9F0aaCgLmyLFMB7ygPdEgyRPRyh6I0l5jA9HMn4DiSvdjVwYE412YSspEp9PgRrKAuBZC5nbCfYHus61rDJRLOV8/s320/DSC05972.JPG" width="320" /></a>
My son and I went to Stellafane 2017 in July, where I had entered my PANOPTES scope in the Mechanical Competition; the other competition at Stellafane is for telescope optics, but the PANOPTES design uses off-the-shelf cameras and lenses... thank goodness.<br />
<br />
While I was setting up on the competition field, chatting with my neighbors, I learned that I wasn't the only one who had set Stellafane as a personal deadline. Apparently it is pretty common for telescopes to be (almost) finished the day before the competition; like many, among my reasons for entering the competition was to provide a deadline and, of course, to give me the chance to share my passion with like-minded folks. It certainly helped me stay focused during the 3 months of the build leading up to Stellafane (I signed on to Project PANOPTES in April after hearing about it at NEAIC and NEAF from Josh Walawender).
<br />
<div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
<br />
<span style="font-family: inherit;">The roots of the Stellafane Convention are in the building of telescopes, especially the mounts, and some of the winners in the past have made truly beautiful scopes, with wonderful wooden frames or shiny brass clock drives. My "mechanical" entry was really about the electronics and the entire assembly, so pretty far off from the norm I suspect, and certainly there was nothing else like it this year. So, you can understand my surprise and delight when I learned that my entry had <a href="https://goo.gl/photos/cDhBnXPYvZeU6YTSA">won an award</a> in the "Special" category of the Mechanical Competition.</span><br />
<div dir="ltr">
<span style="font-family: inherit;"><br /></span></div>
<div dir="ltr">
<span style="font-family: inherit;">On the day of the competition, I spent 4 hours showing off the scope, sharing my enthusiasm for Project PANOPTES's exoplanet survey with lots of people, and was exposed to how hard it can be to explain the goal of a project or the design of an instrument. By far the number one question was "Why does your scope have two cameras?", followed by "Does it see in 3-D?". I'll work on answering that in detail in another post, but the short answer is economics: it is simply a cheaper way to get more light onto camera sensors than a single camera with a larger diameter lens would be, by at least hundreds of dollars, if not a thousand.</span></div>
<div dir="ltr">
<span style="font-family: inherit;"><br /></span></div>
<div dir="ltr">
<span style="font-family: inherit;">Among those who spent some time chatting with me about PANOPTES was a faculty advisor for the Astronomy Club at Mount Wachusett Community College. The college has a grant from NSF's S-STEM program for promoting an educated work force, which aims to go beyond just financial aid, providing personal and group support such as mentors and enrichment programs. Among the latter, they invite in outside speakers to talk to the group about their educational and career experiences. I'm now scheduled to speak to the group on September 18th; I'll share my somewhat rocky road through school and greater success at work (I'm better at learning by doing, or at least when sufficiently engaged), and the fun I've had building a PANOPTES scope. I've also been invited to give a brown bag talk at MIT about PANOPTES and the scope I've built for it.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Another pleasure that I garnered from working on PANOPTES is the new relationships with folks in New England and farther afield, many of whom have volunteered their experience and some have spent time helping me advance the build. I'm especially grateful to Noel Qiao, a neighbor who helped greatly with, among other tasks, preparing the Pelican case for hosting the electronics; and John Blomquist who has recently been instructing me in machining at the <a href="http://atmob.org/">ATMoB</a> clubhouse, helping me with <a href="https://goo.gl/photos/1SZCUkzDEcLrJJ5g8">fabricating the aluminum pier</a> to which the mount is secured.</span><br />
<span style="font-family: inherit;"><br /></span>
</div>
<div dir="ltr">
<div dir="ltr" style="color: black; font-style: normal; font-weight: normal; letter-spacing: normal; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">
<div style="margin: 0px;">
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiq7yQHFSGSZQZLfQa_rdK4mfr6Z-6Yy67bbuT0JnGsmyyjzMm9C2i-jT-S87dXocQphOPGgCqmJgLWT0vye6OSNb6OUQuZP_zG2ychDkeiIRpXTq_MG2Avu2nEBSaFJEV6NZXB4YQlsBFq/s1600/IMG_20170826_164631.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="1301" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiq7yQHFSGSZQZLfQa_rdK4mfr6Z-6Yy67bbuT0JnGsmyyjzMm9C2i-jT-S87dXocQphOPGgCqmJgLWT0vye6OSNb6OUQuZP_zG2ychDkeiIRpXTq_MG2Avu2nEBSaFJEV6NZXB4YQlsBFq/s320/IMG_20170826_164631.jpg" width="260" /></a></div>
<span style="font-family: inherit;"><br /></span></div>
</div>
</div>
<div dir="ltr">
<span style="font-family: inherit;">Soon after those talks, where I'll have the PANOPTES scope for show-and-tell, I'm planning on installing it at <a href="https://wheatoncollege.edu/academics/departments/astronomy/">Wheaton College</a>, in Norton, MA, where the astronomy professor, Dr. Maitra, has generously offered the scope a temporary home in one of their domes. This is especially nice as it allows me to delay the weatherproofing of the mount, an operation that is pretty much one-way... once weatherproofed, you can't get at the motors, polar scope, etc.</span></div>
<div dir="ltr">
<span style="font-family: inherit;"><br /></span></div>
<div dir="ltr">
<span style="font-family: inherit;">More to come soon.</span></div>
<div dir="ltr">
<br /></div>
</div>
</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-47163034750759044472017-07-18T16:27:00.000-04:002017-07-18T16:27:38.684-04:00PANOPTES ready for StellafaneTo help me make progress on building my <a href="http://www.projectpanoptes.org/">Project PANOPTES</a> robotic telescope, I enlisted an external motivator: I entered it into the <a href="http://stellafane.org/convention/2017/2017-scope-comp.html#mechanical">Mechanical Competition</a> at <a href="http://stellafane.org/convention/2017/">Stellafane</a>, which is this coming Saturday. With some help from a neighbor, Noel Qiao, the telescope is ready to show. I'm relieved and happy!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikKQrc6n9NvV0ow5FLHDNHgcV0fZoSoj7K8Q_dUxM4uh5xIw8UDca0nmSYdu0R67euNKG6h6X7fWam_TxMQkmDqW2tRAX5pYn-Y_piqovKZZo6XgBpShTq2bv00N_5XE8vuw7o4VoWzuu0/s1600/IMG_20170718_142730.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="1573" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikKQrc6n9NvV0ow5FLHDNHgcV0fZoSoj7K8Q_dUxM4uh5xIw8UDca0nmSYdu0R67euNKG6h6X7fWam_TxMQkmDqW2tRAX5pYn-Y_piqovKZZo6XgBpShTq2bv00N_5XE8vuw7o4VoWzuu0/s320/IMG_20170718_142730.jpg" width="314" /></a></div>
<br />
Here is an <a href="https://photos.app.goo.gl/XnZDz7WscuUoJC1H3">album with more photos</a>. If you're planning to be at Stellafane, stop by and take a look in person.<br />
<br />
I also signed up for Stellafane's "<a href="http://stellafane.org/convention/2017/2017-informal-talks.html">Friday Informal Talks</a>," so now need to come up with a brief presentation about Project PANOPTES, exoplanets, etc.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-24414612584571599902017-07-06T13:46:00.000-04:002017-07-06T13:46:36.387-04:00Read the Instructions... Thoroughly!I reported recently on completing the assembly of the <a href="https://enigma2eureka.blogspot.com/2017/06/panoptes-power-board-assembled.html">power board</a>. The designers of the PANOPTES system and the beta-test builders (I'm in that latter group) had discussed the fact that the instructions were structured in a way that made for some awkward steps: some tall parts are added before adjacent short parts, etc. I resolved to come up with an alternate assembly process that would be focused on efficient assembly. I went through the existing instructions, the PCB layout and parts list, figuring out which parts needed to be inserted from the bottom of the board, which from the top, and which could go on either side. I also sorted them by height and by side of the board. Altogether this seemed to result in a pretty easy assembly.<br />
<br />
I celebrated Independence Day by continuing work on the build, focusing on finishing up the electronics. I finished creating cables for the sensors, and was finally ready for testing the boards. First up, the Power Board. I plugged a fan into the fan output, and provided 12V power to the board. The fan came on. Yeah! The LEDs showing the fan was on did not come on. Oops. I started probing the board with a multimeter, and discovered that the LED came on if I grounded some spot near it, which shouldn't have happened. So I decided I'd better re-read the instructions.<br />
<br />
By the time I got to assembling this third and final PANOPTES board, I was used to the preamble at the start of each instruction document: title, image of the schematic or layout, list of tools needed, etc. I had skipped right past those and proceeded with my study of the detailed instructions. In fact, I skipped right past the warning that said to read another document FIRST if you were working with V0 of the board... which I was.<br />
<br />
That document was a repair document. It turns out that the PCB layout has an error. Well, 5 errors. The footprint for the Solid State Relay had two of the pins reversed, with the effect that each of the 5 SSRs was not being used as a switch, and 12V was flowing to the outputs at all times. Not good. But there was a simple fix: raise the SSRs up, with a pair of wires underneath providing a cross-over. Except they were already soldered in place.<br />
<br />
I decided to try the other option listed in the repair doc: cutting traces on either side of the two swapped pins on each of the 5 SSRs, for 10 cuts, and then soldering wires to pins to get 12V in and out of each SSR via the opposite pin from the PCB layout. This was a bit tedious. So much effort, my effort, could have been saved by more careful reading, by me. Sigh.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6dYnn11s_eqOeCOuIU-Ovdzn_Uu0I1DrGCW95uKg3g7GVY8UsES8JhZiSTDG1KtHiLYTJh3EzW7ivPu6v1tEfjfuR_NCNr5IcwpVRSqlcibnZyrHYrnvZlsNV5qHkmo_9monXjBhd1fxz/s1600/IMG_20170705_170138.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6dYnn11s_eqOeCOuIU-Ovdzn_Uu0I1DrGCW95uKg3g7GVY8UsES8JhZiSTDG1KtHiLYTJh3EzW7ivPu6v1tEfjfuR_NCNr5IcwpVRSqlcibnZyrHYrnvZlsNV5qHkmo_9monXjBhd1fxz/s320/IMG_20170705_170138.jpg" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiETq8WJ-sjZ58Dq2eSvtrYMmWACsM9qqokY_MoGpShzJiwNsz8CwLR-cKwUx4pU0FVIXW3zfNUzGyqw2D6b-BiUlGcumZuIvSBbpFnDz_Fa90Hk4qT2lK2nCxoNRGyBF-1bdhM8lpnEj09/s1600/IMG_20170705_170149.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiETq8WJ-sjZ58Dq2eSvtrYMmWACsM9qqokY_MoGpShzJiwNsz8CwLR-cKwUx4pU0FVIXW3zfNUzGyqw2D6b-BiUlGcumZuIvSBbpFnDz_Fa90Hk4qT2lK2nCxoNRGyBF-1bdhM8lpnEj09/s320/IMG_20170705_170149.jpg" width="320" /></a></div>
<br />
Fortunately the repairs worked. The lines are now switched based on inputs from the ribbon cable connector... I also screwed up the ribbon cable: after carefully studying the connectors, I still put one on the cable the wrong way around. Fortunately I ordered 3 connectors, just in case I screwed something up, and fortunately I didn't make the cable super short, though I now wish I'd allowed a bit more room. Now I need to test the other boards, and later I need to calibrate the current sensors on the power board (those red daughter boards in the picture above).Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-74154864373940570362017-06-25T14:32:00.001-04:002017-06-25T18:22:37.827-04:00PANOPTES Power Board Assembled I'm excited to report that I've finished the assembly of the Power Board for my PANOPTES telescope.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><img border="0" data-original-height="1024" data-original-width="1600" height="203" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGQhnh3er3ZKp8f-n6iSAlc0EgzguWiEMfUakqkL18vZeUhMGGOXfBYcP8ZhQRKgWYd-Qx8bMJ87CTQiPIz8hD2N7Zgl6nsjUAO2zaC73CZkHWWI-UJM0b-UOMJR2NZdmxbq5unAsMMqQu/s320/IMG_20170625_120009.jpg" style="margin-left: auto; margin-right: auto;" width="320" /></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Top View</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: left;">
The board takes in 12 volts DC, and distributes it to the rest of the system. In particular it has 5 switched lines for the Cameras, PC, Mount, Cooling Fan & Weather Station. It has an unswitched line for the Telemetry Board, which has an Arduino that controls the Power Board at the direction of POCS (PANOPTES Observatory Control System), running on the PC.</div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8Q6C6yX9E37t4IV0-g0D10bMgvCQsd4pul296QwdgGW9utyIYJwgk6pLqlvboc5yGltH-3yGiKFhiyAJQqTiSD4a0r9-lrTnP9pFumhSAxAmwFh4va-_OmTl2MMzqGaF8XeaHZKlEdDJf/s1600/IMG_20170625_120026.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8Q6C6yX9E37t4IV0-g0D10bMgvCQsd4pul296QwdgGW9utyIYJwgk6pLqlvboc5yGltH-3yGiKFhiyAJQqTiSD4a0r9-lrTnP9pFumhSAxAmwFh4va-_OmTl2MMzqGaF8XeaHZKlEdDJf/s320/IMG_20170625_120026.jpg" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Bottom View<br /></td></tr>
</tbody></table>
Next up is testing that it works, but before that a short rest... and probably some of my home and day jobs! 😉<br />
<br />Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-16127385657733350342017-06-19T11:55:00.002-04:002017-06-19T11:55:53.032-04:00Camera and Telemetry Boards Assembled<div dir="ltr">
I was delayed in assembling the camera board because I ordered the wrong power jack: the parts list specified a PCB mount jack from SparkFun, but I was placing a large order from DigiKey so it made sense to find an equivalent part there. Unfortunately there are many, many similar jacks available. I ended up ordering one (well, probably 20) with eyelet connectors rather than through-hole pins for PCB mounting. Sigh. I realized the problem as soon as I tried using the jack, then ordered the parts from SparkFun. By the time they arrived -- I didn't pay for express shipping, which would have almost doubled the price -- I'd finished the rest of the board and could just add them in.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I then moved on to the telemetry board; it connects via USB to the controlling PC and via a ribbon cable to the power board, and provides the control software (<a href="https://github.com/panoptes/POCS">POCS</a>) on the PC with the ability to switch power on-and-off for various components, and to measure current, temperature and humidity.</div>
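As a rough sketch of that control path — the command framing below is hypothetical, not the actual POCS/Arduino protocol — the PC side might build a relay-switching message like this:

```python
import json

# Hypothetical command framing for the telemetry Arduino; the real
# POCS serial protocol differs. Each command names one of the switched
# power lines and a desired state, sent as a newline-terminated JSON line.
RELAYS = {"cameras", "pc", "mount", "fan", "weather"}

def make_relay_command(relay: str, on: bool) -> bytes:
    """Build one serial command to switch a power line on or off."""
    if relay not in RELAYS:
        raise ValueError(f"unknown relay: {relay}")
    return (json.dumps({"relay": relay, "power": int(on)}) + "\n").encode()

# On the real system this would be written to the board's USB serial
# port, e.g. with pyserial:
#   serial.Serial("/dev/ttyACM0", 9600).write(make_relay_command("mount", True))
```

The Arduino sketch on the other end of the cable would then parse each line and drive the corresponding relay pin.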
<div dir="ltr">
<br /></div>
<div dir="ltr">
The original board layout called for a very awkward assembly step: soldering wires to the little stubs of pins poking through the board from the ribbon cable IDC header. The design choice was necessitated by the spacing of the pins on the header (2 rows of 0.1" spaced pins) and the choice of a breadboard-like circuit board as the base of the design. A normal breadboard has no spot for such a grid of pins, unless it is OK to short out adjacent pins (either in a row or a column). The design called for using an <a href="https://www.adafruit.com/product/723">Adafruit Perma-Proto Mint Tin Size Breadboard PCB</a>. It has a group of holes at each end of the board that are not connected to their neighbors, into which we could place the header.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Not only was assembling this challenging to execute, but I was also worried that the connection would be fragile. I was very relieved when I found that Technological Arts in Canada sells a <a href="https://www.technologicalarts.ca/shop/store/category/56/adapters/ribbon-cable.html?Treeid=79">breadboard adapter for the IDC header</a>. Based on that, I sketched out a new layout for the same schematic... well, I did change the power supply to the DS18B20 temperature sensors, which I tested on a breadboard before assembly.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Here are the results of both builds, with the telemetry board on top, the camera board below:</div>
<div dir="ltr">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhc2_XOFAfFCcqIYJoswB-JvORvJSkBTLtxKUAHj8ojc5Jdcf0juwoz1Mwbx0z894Mr9UqkW591o6J9-zSmw3OOflLQkjhgNb2rJXOBDfigY50E2bfFcmapGq9iNlr-qhsAm7jT_S4NrR_W/s1600/IMG_20170619_111156.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1016" data-original-width="1600" height="203" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhc2_XOFAfFCcqIYJoswB-JvORvJSkBTLtxKUAHj8ojc5Jdcf0juwoz1Mwbx0z894Mr9UqkW591o6J9-zSmw3OOflLQkjhgNb2rJXOBDfigY50E2bfFcmapGq9iNlr-qhsAm7jT_S4NrR_W/s320/IMG_20170619_111156.jpg" width="320" /></a></div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I rushed the layout of the telemetry board, so it isn't as clean as it could be. For example, I forgot that the incoming power is 12V DC, which can't be supplied to most of the components, as they require 5V DC. As a result, the power jack is at the top left of the board, adjacent to the power rail I planned to use for 5V.</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-45500366059781883162017-05-21T22:00:00.000-04:002017-05-21T22:00:07.071-04:00Controlling the PANOPTES CamerasThe head unit of a PANOPTES robotic telescope contains two cameras, a USB hub, and an electronics board for controlling power to the cameras and for recording data from sensors: humidity, temperature and acceleration. I ordered the parts for this board (and two other boards) from Adafruit, SparkFun and DigiKey. I got started on assembly today, soldering on the switches (solid state relays) for turning the cameras on and off, the voltage regulators for stepping down from 12V to 9V for powering the cameras, an accelerometer for measuring the orientation of the head unit, and an Arduino Micro to manage things.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgWZ1akdAe2eD4iduDXhbweYbZvQS4Pf37eLC0AX7FwPGEPltMBKBfh_SZo1F3QNxadPW2xRfG4LH2K9bXECmDhIrklKbtHMfk9lf_u_0aBw7lBhHgp-i033nm8PHVh8n90P4kEFxLToMCR/s1600/IMG_20170521_205740.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="106" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgWZ1akdAe2eD4iduDXhbweYbZvQS4Pf37eLC0AX7FwPGEPltMBKBfh_SZo1F3QNxadPW2xRfG4LH2K9bXECmDhIrklKbtHMfk9lf_u_0aBw7lBhHgp-i033nm8PHVh8n90P4kEFxLToMCR/s320/IMG_20170521_205740.jpg" width="320" /></a></div>
<br />
I bypassed adding the power jacks because I ordered the wrong ones. Sigh. Yet another order to be made.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-65721840756576516362017-05-05T16:14:00.000-04:002017-05-05T16:14:05.567-04:00Drilling & Tapping Mounting PlatesOne of the first tasks in building a Project PANOPTES unit is to drill & tap holes so that each pair of plates (3 ½” thick aluminum plates, and a Vixen-style dovetail plate) can be screwed together, and so that the cameras can be securely attached to the camera plate. The instructions call for doing the drilling one plate at a time, but my experience is that using a handheld electric drill will not produce holes that align from plate to plate. I choose to use clamps to hold pairs of plates together for drilling, thus ensuring that the holes are pretty well aligned.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhurH9cyM9G8YUDVILdVKMgAQFS7ZzNlDX4YT3jW9AE_YP9-YABxOVnTFHrEM7CQ398a3WPf-D0lfoQfXIJiWEquQJsipdQ8CF4NckzRRic3kKf2T8dodxmcTQ6C6J47EfQJw3B5FxvZls/s1600/DSC00005.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="280" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhurH9cyM9G8YUDVILdVKMgAQFS7ZzNlDX4YT3jW9AE_YP9-YABxOVnTFHrEM7CQ398a3WPf-D0lfoQfXIJiWEquQJsipdQ8CF4NckzRRic3kKf2T8dodxmcTQ6C6J47EfQJw3B5FxvZls/s320/DSC00005.JPG" width="320" /></a></div>
<br />
<br />
The nice thing about this approach is that even if my drill holes aren’t particularly perpendicular to the surface of the plates, or not all in the same direction, it doesn’t matter, as they’ll each serve their purpose of pulling the plates together without moving them sideways (assuming the angle isn’t too severe).<br />
<br />
I drilled the holes using successively larger diameter drill bits so that each one was making the hole a bit larger, but not having to make the full diameter cut. I started with 1/16” bits, but that was really too small; I’d recommend ⅛” as the starting point.<br />
<br />
The instructions also specified metric bits, but I didn’t have any of those, nor are they readily available (e.g. Home Depot and Harbor Freight stores don’t have them in stock). The goal of course is to produce through-holes in a pair of plates, after which one will be tapped to accept a screw, and the other will be widened a bit further so that the screw will slide smoothly through the plate on its way to the tapped hole. Given that, I used the imperial (fractional inch) bits that I owned, stopping just before the hole reached 5mm in diameter. The M6-1.0mm tap I bought included a drill bit of approximately 5mm, which I used as the last bit to go through both plates.<br />
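As a quick check of that progression (a small calculation of my own, assuming the ~5mm tap-drill target mentioned above), these are the fractional-inch bits that stay under 5mm:

```python
from fractions import Fraction

MM_PER_INCH = 25.4

def bits_under(limit_mm: float):
    """List fractional-inch bits, stepping up 1/32" at a time from 1/8",
    that are smaller than limit_mm, so each pass enlarges the hole a bit
    without any single bit making the full-diameter cut."""
    sizes = []
    n = 8  # start at 8/64" == 1/8"
    while (n / 64) * MM_PER_INCH < limit_mm:
        sizes.append((str(Fraction(n, 64)), round(n / 64 * MM_PER_INCH, 2)))
        n += 2  # next 1/32" step
    return sizes

print(bits_under(5.0))  # [('1/8', 3.17), ('5/32', 3.97), ('3/16', 4.76)]
```

So 1/8", 5/32" and 3/16" all fit under the ~5mm tap drill, with the tap's own bit making the final pass.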
<br />
Since I normally use drill bits for drilling through plaster and wood (e.g. for hanging pictures, and other such tasks around the house), I don’t usually think about the sharpness of drill bits. But I found that the bits we had at home were struggling to cut through the 6061 aluminum plates. The bits are Ridgid brand, purchased from Home Depot. When I was shopping for the screws I decided to try the Milwaukee brand bits that were available in a nice package with two each of lots of commonly used small sizes. Wow, they were a lot sharper than the old bits, and greatly improved the rate of cutting.<br />
<br />
Summary: clamping plates together helps with alignment, and fresh, sharp bits get the job done faster.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-120406906187973462017-04-22T17:21:00.000-04:002017-04-22T21:15:35.672-04:00Machining without machine tools<span style="font-family: inherit;">I'm going through the process of fabricating the 3 mounting plates for my <a href="http://www.projectpanoptes.org/hardware/camera_box.html">Project PANOPTES</a> automated telescope, shown here:</span><br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifh3QPiu-hTmG49999FtiEoHA84ML-2M2lPM0nczFs9F328QK9OmVms0xiJGh_9s5eMREb5dg668K0X2-vIUqkbV2qPHSBEgRlx5aAZWwOpQ6Ia_-E8F_Gnb292HWtFzRhP44tBdj3fT_4/s1600/housing_3.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifh3QPiu-hTmG49999FtiEoHA84ML-2M2lPM0nczFs9F328QK9OmVms0xiJGh_9s5eMREb5dg668K0X2-vIUqkbV2qPHSBEgRlx5aAZWwOpQ6Ia_-E8F_Gnb292HWtFzRhP44tBdj3fT_4/s320/housing_3.jpg" width="320" /></a></div>
<span style="font-family: inherit;">These attach the cameras and their enclosure to the equatorial mount. The goal is that folks without specialized equipment can build the telescope. The plates come from the vendor 12" long, but need to be cut down to around 200mm (7.87"). The recommendation is to use a hacksaw, so here is my first attempt:</span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmRP8jAbsau-fhd-9TC9d9BUYUrHywvyyq7p6ImPXnqZI-92SzggcmGPeoohpPOMKh5tA3pemp4Mko1pFlx_rAc_vtdQ4nmTPcmts41w0O5htLbeSrR0S0VojVxbXgGhzmWCcdBhQ9ce-x/s1600/IMG_20170420_151650-COLLAGE.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmRP8jAbsau-fhd-9TC9d9BUYUrHywvyyq7p6ImPXnqZI-92SzggcmGPeoohpPOMKh5tA3pemp4Mko1pFlx_rAc_vtdQ4nmTPcmts41w0O5htLbeSrR0S0VojVxbXgGhzmWCcdBhQ9ce-x/s320/IMG_20170420_151650-COLLAGE.jpg" width="320" /></a></div>
<span style="font-family: inherit;">OK, so I need to improve my hacksaw technique.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Searching around the web, it appears that I should scribe the line pretty deeply, and all the way around. Some beeswax or other lubricant was suggested in some places, presumably to slow the process of cutting so that it doesn't veer off course quickly.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;"><b>Update</b>: I tried again with more deeply scored lines, and sawing across the width of the bar, but that didn't appear to be very effective -- the hacksaw blade still wandered. So I made a little jig in the form of a piece of wood to clamp against one side of the line to be cut. This yielded a very nice straight edge, though it wasn't quite perpendicular to the long edge.</span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjI69J6kay0ijI53XgTd0_0amNTl9SNXA3oCi5WB49qhHCwY3Xhq-WHUFLpxPfS6hOj2dftabwCUWN0iGmURyG6PYbVQR8mpqEKhJSfzlOFoM6AlJg2_L-5Dx-1wWko0CAacDO8Am3Z0du5/s1600/IMG_20170422_201320-COLLAGE.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjI69J6kay0ijI53XgTd0_0amNTl9SNXA3oCi5WB49qhHCwY3Xhq-WHUFLpxPfS6hOj2dftabwCUWN0iGmURyG6PYbVQR8mpqEKhJSfzlOFoM6AlJg2_L-5Dx-1wWko0CAacDO8Am3Z0du5/s320/IMG_20170422_201320-COLLAGE.jpg" width="320" /></a></div>
<span style="font-family: inherit;"><br /></span>Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com2tag:blogger.com,1999:blog-8949810268237856091.post-84167148112109522532017-04-16T16:38:00.002-04:002017-04-17T13:13:54.075-04:00Let's hunt exoplanets together<div dir="ltr">
Late nights and I don't get along. I get grumpy and want to go to bed. This might not seem like such a problem, but I'm also keen on astronomy. Except for our sun, stars are out at night. Here in New England darkness falls pretty early in the winter but is also accompanied by bone chilling weather. And warm summer nights are swarming with mosquitoes. </div>
<br class="Apple-interchange-newline" />
I own binoculars and an 8" <a href="https://en.wikipedia.org/wiki/Dobsonian_telescope">Dobsonian</a> telescope, but they are rarely used. The telescope spent at least 10 years in the basement of our current house, and 7 in the previous house. Given that I'm very much a techie, I'm drawn to the fancy mounts and cameras that some amateurs use, and which litter the pages of the astronomy magazines, but it's hard to figure out how I could make use of them: when my wife and I shop for a house we tend to be drawn to houses with views of nature. Our current house is on a small lot with plenty of trees and a nice view across a playground to a meadow beyond, but no room for a backyard observatory. When I do drag the telescope out, it's only out to the driveway.<br />
<div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Basically, I'm not cut out for stargazing, so I've mostly been an armchair astronomer, reading about the field in magazines and books (I highly recommend <i>The Glass Universe</i> by Dava Sobel).</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Nonetheless I continued to be interested in being a more engaged amateur astronomer. I took a step in that direction recently by joining <a href="http://atmob.org/">ATMoB</a>, the Amateur Telescope Makers of Boston, a club with monthly meetings at the Harvard-Smithsonian Center for Astrophysics in Cambridge, and a clubhouse and observing field in Westford, on MIT property. The highlight of the monthly meeting is a presentation by a guest speaker, with topics all over the astronomy map. I've enjoyed the camaraderie of chatting with other astronomy enthusiasts at dinner before the meetings and up at the clubhouse, where I've helped with a couple of work parties and joined in the stargazing a couple of nights.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
ATMoB members are also eager to help others, and offered to help me get my telescope restored to working order. The first attempt to take the telescope to the clubhouse was a bust: it's primarily made of thick plywood and sonotube, and I threw my back out loading it into the car. Sigh. Eventually the scope and I made it there, where a fellow member helped me to get the scope back into collimation (i.e. getting the mirrors lined up so the light hits the eyepiece at the correct angle).</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I also discovered <a href="http://www.rocklandastronomy.com/neaf.html">NEAF</a> and <a href="http://www.rocklandastronomy.com/neaic.html">NEAIC</a>, events in Suffern, NY held by the <a href="http://www.rocklandastronomy.com/">Rockland Astronomy Club</a> in early April. I attended this year (2017) and was very impressed by the talks. I was particularly intrigued by the call-to-action of <a href="http://www.twilightlandscapes.com/">Josh Walawender</a>, speaking on behalf of <a href="http://www.projectpanoptes.org/">Project PANOPTES</a>. He described a project right up my alley: help with the hunt for exoplanets by building a small, robotic (i.e. automated) telescope that will contribute to a growing effort to find stars whose light dims on a regular basis due to a planet crossing between such a star and our telescopes on Earth. I attended his talk twice, first at NEAIC and then again at NEAF, and dragged him to join me for breakfast so that we could talk more (yes, his pitch really did strike home with me).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://upload.wikimedia.org/wikipedia/commons/e/e5/Exoplanet_transit_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="177" src="https://upload.wikimedia.org/wikipedia/commons/e/e5/Exoplanet_transit_detection.png" width="320" /></a></div>
<br /></div>
<div dir="ltr">
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqjAkGUu3KUKoSv3aUvep6PS-UJAXyNfwB7Vxwb7P8BHe6vj61ZVR2igMbtf1_gPSnwKDJXEAL1ZfD9HyEtzmmnx1D5gqNZQfbEy8GS90zk9RJTpdLRtPG5oGsBNR6TVJRSw10NlsjU0_G/s1600/transit-method-animation.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="246" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqjAkGUu3KUKoSv3aUvep6PS-UJAXyNfwB7Vxwb7P8BHe6vj61ZVR2igMbtf1_gPSnwKDJXEAL1ZfD9HyEtzmmnx1D5gqNZQfbEy8GS90zk9RJTpdLRtPG5oGsBNR6TVJRSw10NlsjU0_G/s320/transit-method-animation.gif" width="320" /></a></div>
<br /></div>
<div dir="ltr">
I've signed up to build such a telescope, and have started by ordering the mount, one of the two cameras and a few other parts. I'll do some initial fabrication and debugging before moving on to order more parts. Updates to follow.<br />
<br />
If you'd like to join in or just learn more, visit the project website, <a href="http://www.projectpanoptes.org/">www.projectpanoptes.org</a>. If you're in the Boston area and want to know more, let me know via the comments or show up at an ATMoB monthly meeting... I try to make it to most of them. And I'll be looking for a very specific kind of help from friends: after I get the telescope debugged, it will need a "room with a view"... well, a secluded spot with a good view of the sky. By secluded I mainly mean unlikely to get unwanted attention. I'm sure that as a kid I would have found such a thing quite an attraction -- hey, let's open the box and see what's inside! The additional requirements for such a site are: firm ground on which to place a tripod or pier, power and internet access of some kind. <b>Please</b> let me know if you're interested or even just willing to help out.</div>
</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com2tag:blogger.com,1999:blog-8949810268237856091.post-33806243607386765812014-07-22T19:00:00.000-04:002014-07-22T19:00:00.927-04:00Sound Source Localization AssumptionsI've been reading a number of research papers, etc., related to sound source localization as I attempt to build such a system (e.g. for traffic jam vs. open road classification), especially those about Time Delay of Arrival (TDoA) estimation, combining evidence from multiple microphone pairs, and calibrating the locations of the microphones in an array. Some of those papers have been clear about their assumptions, which helps in understanding the limits of their designs.<br />
<br />
The approach I'm taking involves digitizing the microphone inputs and computing the cross-correlation of the signals recorded by a pair of microphones at various time delays (e.g. how does the sound that arrived at microphone 1 compare to the sound that arrived at microphone 2 delayed by 1 ms, or delayed by 3 samples, etc.). Let's see what assumptions I'm making...<br />
<ol>
<li>The <b>speed of sound</b> is constant across the region of interest (sources to microphones) over the short-term (e.g. 30 minutes). The speed of sound varies with changes in temperature, humidity, and air pressure; wind also increases the effective speed in the downwind direction, and decreases it upwind.</li>
<li>For <b>planar</b> sound sources (e.g. for cars on surface roads, for fairly flat ground), we have <b>at least 3 microphones</b> in the plane of the sound.</li>
<li>For <b>non-planar</b> sources (e.g. planes, people sitting and standing in a conference room), we have <b>at least 4 microphones</b> in a sensible non-planar arrangement; for example, 4 microphones in a tetrahedral arrangement gives us the best omni-directional resolution, because poor resolution from one pair (e.g. when the sound source is on the same line as the two microphones, but not between them) is compensated for by the other pairs.</li>
<li>Not all microphones need to be paired up (e.g. we might have 4 microphones in one array, making up 6 pairs, and then another 3 microphones in another array, contributing another 3 pairs), so not all constraints below apply to every possible microphone pair, just those pairs used for TDoA estimation.</li>
<li>For my "traffic detection, at a distance" project (the <b>far-field</b>), I want to be able to distinguish between cars in the two lanes of the road (coming and going from the congested intersection), out to a distance of around 250 meters. The road width is about 25 feet (measured from an aerial image), so a separation of around 2 meters would be enough to distinguish the two directions. That amounts to an <b>angular resolution</b> of 0.5 degrees... pretty darn small. I've not done calculations yet to show whether this is feasible.</li>
<li>For <b>near-field</b> applications (e.g. aiming a camera at a speaker in a conference room), we'd like to be able to distinguish objects 30cm apart, out to around 5m, or an angular resolution of about 3.5 degrees.</li>
<li>The microphones in each pair are <b>not too near each other</b>, else our position resolution will be diminished, because the maximum TDoA is small. For example, with microphones 1 inch apart, the maximum TDoA is about 74 microseconds; with sound recorded at CD rate (44,100 samples per second, about 23 microseconds per sample), a sound can arrive at the second microphone at most 3 samples later than at the first, so at best we can say that a sound arrived from one of 7 (3*2 + 1) "directions" (broadly defined) relative to the microphone pair.</li>
<li>The microphones in each pair have <b>similar frequency response</b>, else it is hard to compare their signals to compute TDoA.</li>
<li>The microphones in each pair have similar response to sounds <b>arriving from any direction</b>; this is actually very hard to achieve, as microphones typically have some directionality to their response, even those known as omni-directional. If we have omni-directional microphones that all have the same orientation, then sources far away will appear to be coming from essentially the same direction relative to the orientation of the microphone for all of them; but for a nearby sound source, the sound might arrive from very different directions relative to the microphone orientations, which can change both the amplitude response and the frequency response.</li>
<li>The <b>analog to digital conversion</b> (ADC) of the microphones is at the <b>same sample rate</b> for all microphones.</li>
<li>The microphone through ADC path has similar response for both microphones in a pair (e.g. they have the <b>same gain and electrical noise</b>).</li>
<li>The signal of interest is large enough by the time it is picked up by the microphones and converted to digital to be distinguished from noise (i.e. the <b>signal-to-noise</b> ratio is high enough, whatever that is).</li>
<li>The <b>sampled data is aligned</b> (i.e. we have a straightforward means to get all the samples recorded "at the same time"). For my experiments so far this was easy because I'm using a 4-channel recorder, but for a more complete prototype I'll need a new solution. My friend Bent recommends, based on his personal experience, the Motu 8pre, which has 8 microphone inputs; even better, he is willing to loan me his for some experiments!</li>
</ol>
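The arithmetic behind items 6 through 8 can be checked with a few lines (my own back-of-the-envelope numbers, assuming c ≈ 343 m/s in 20°C air):

```python
import math

C = 343.0     # speed of sound (m/s) at ~20 degrees C; see assumption 1
RATE = 44100  # CD sample rate; one sample is ~22.7 microseconds

# Item 6: separating the two lanes (~2 m apart) at 250 m.
far_field_deg = math.degrees(math.atan2(2.0, 250.0))   # ~0.46 degrees

# Item 7: separating speakers ~30 cm apart at 5 m.
near_field_deg = math.degrees(math.atan2(0.3, 5.0))    # ~3.4 degrees

# Item 8: maximum TDoA for a given microphone spacing, and the number
# of whole-sample delays that allows at the CD sample rate.
def max_tdoa(spacing_m):
    seconds = spacing_m / C
    return seconds, int(seconds * RATE)

tdoa_s, samples = max_tdoa(0.0254)  # 1 inch spacing
# tdoa_s is ~74 microseconds, i.e. at most 3 whole samples of delay,
# giving 3*2 + 1 = 7 distinguishable "directions".
```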
<div>
I'm sure I'm making even more assumptions than I've included here (and some of these are really "desired features").</div>
Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com1tag:blogger.com,1999:blog-8949810268237856091.post-11625436412488537652014-07-18T07:26:00.000-04:002014-07-18T07:26:23.895-04:00Mirror, mirror, on the wall, is there a fairer route in the land?Some years ago I was often in a traffic jam approaching a T-junction near my house. I had a choice of taking the direct path, or taking a detour on to a circuitous route before reaching the potential traffic jam; naturally, I couldn't see the traffic jam before having to make that decision. This was before I had a smartphone with Google Maps, and before the traffic data in Google Maps was detailed enough to include that local backup.<br />
<br />
This led me to ask this question: could I build a system, installed on my property, able to determine whether or not there was a traffic jam on the direct route? Such a system could publish this data to the web, and I would examine it before leaving work, saving myself anywhere from a couple of minutes to 15. Not earth shaking, but it would be nice.<br />
<br />
If there was a direct line of sight from a window in my house to the road, I could just install a camera, and upload photos every few minutes, or on demand. Of course, there isn't.<br />
<br />
However, standing in my yard I can hear the road noise of fast moving vehicles, and the rumble of idling trucks in the distance, but I'm not usually sure what direction that rumble comes from. So, could I build a system using microphones placed outside the house (mounted on the wall, or maybe in the yard), which could compute the directions from which the multiple sounds are arriving, or even better the locations of the sound sources? And having such an ability, could I then classify different types of sounds, such as idling cars in a traffic jam, a single idling truck at a loading dock, and vehicles moving quickly, preferably with enough spatial resolution to distinguish the slow moving cars approaching the T-junction from those quickly moving away?<br />
<br />
I decided to try working on this idea early last year, thinking of it as a passive acoustic sonar, though I've since learned this is often referred to as acoustic localization or multi-lateration.<br />
<br />
My friend Bent loaned me a couple of microphones, which I used as the left and right channels of a 3 microphone array, the center microphone being the average of the 2 internal microphones of a TASCAM 4-channel recorder. Bent and I recorded ourselves talking as we walked back and forth in a line parallel to the line of the 3 microphones, perhaps 10 feet away. I then wrote some Python code to read the WAV files and perform cross-correlation of the channel pairs using numpy, but was dismayed at the results. The stereo effect was very evident when I listened to the left and right channels, but the results from my program showed nothing intelligible at all. I worked on it for a bit, but dropped the effort in favor of skiing.<br />
<br />
After the end of ski season this year, Bent and I reviewed the code, and discovered that I just wasn't using numpy correctly. Sigh. After fixing those mistakes, we got this image:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi-K2tsKAG_gbCIQmPcMreIJqeJUXHr35l63vvhWR01AwobJTeWjDIYdeA-KtjJHHSvd5Jqo_6-XFn8OXfUvPCtbVc4X3I1a6KAnwvmQRpNqnZ-URdllf1pwZKZtuUGLgf-q2QSEL9a76-T/s1600/correlate.py.67to79.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi-K2tsKAG_gbCIQmPcMreIJqeJUXHr35l63vvhWR01AwobJTeWjDIYdeA-KtjJHHSvd5Jqo_6-XFn8OXfUvPCtbVc4X3I1a6KAnwvmQRpNqnZ-URdllf1pwZKZtuUGLgf-q2QSEL9a76-T/s1600/correlate.py.67to79.jpg" height="417" width="640" /></a></div>
<br />
The horizontal axis is time (12 seconds in total, with 10ms windows), and the vertical axis is the offset (in samples) between the sliding windows of samples from the left and right microphones, with the value being the cross-correlation between the normalized windows. I used matplotlib to render the results as an image, which used a default color map, where blue colors indicate low correlation, and red is high correlation. The bright line across the image is where there is zero offset between the windows of samples (i.e. a high correlation value there implies there was a sound source on the line perpendicular to the mid-point of the line between the two microphones).<br />
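A sketch of how such an image can be produced (not the original code; it assumes two aligned channels at 44.1 kHz and uses the 10ms windows described above):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

RATE = 44100
WIN = RATE // 100  # 10 ms windows, as in the image above

def correlogram(left, right, max_lag=40):
    """Cross-correlate normalized 10 ms windows of two channels over a
    range of sample offsets; rows are offsets, columns are windows."""
    n_windows = (min(len(left), len(right)) - 2 * max_lag) // WIN
    out = np.zeros((2 * max_lag + 1, n_windows))
    for w in range(n_windows):
        s = max_lag + w * WIN  # margin keeps lagged slices in range
        a = left[s:s + WIN].astype(float)
        a = (a - a.mean()) / (a.std() or 1.0)
        for i, lag in enumerate(range(-max_lag, max_lag + 1)):
            b = right[s + lag:s + lag + WIN].astype(float)
            b = (b - b.mean()) / (b.std() or 1.0)
            out[i, w] = np.dot(a, b) / WIN
    return out

# Synthetic demo: the same noise on both channels yields a bright row
# at zero offset, like the line across the image above.
noise = np.random.randn(RATE)  # one second of "sound"
img = correlogram(noise, noise)
plt.imshow(img, aspect="auto", origin="lower",
           extent=[0, img.shape[1] * 0.01, -40, 40])
plt.xlabel("time (s)")
plt.ylabel("offset (samples)")
plt.savefig("correlogram.png")
```

A sound source off to one side would instead shift the bright ridge to a non-zero offset, which is what makes the picture useful for localization.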
<br />
More to come.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-74326445193600458872011-11-22T22:48:00.000-05:002011-11-22T22:48:50.680-05:00Creating a Basic Isometric MapLast year Christian Weber wrote a <a href="http://www.cw-internetdienste.de/2010/05/creating-a-basic-isometric-map/">helpful introductory article</a> showing how to use CSS to layout tiles for an <a href="http://en.wikipedia.org/wiki/Video_games_with_isometric_graphics">isometric</a> <a href="http://www.mapeditor.org/">map</a>. However, the screen shot and <a href="http://www.cw-internetdienste.de/isomap/">demo</a> looked a little odd; note the black V below where you can see "underneath" a tile.<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHsVk4iU7zvEDqt4Z93kblA7Ak-LWwkduzUh_8yvj_iMdzKzdwyWOP6NiffotW-SjUN5ovv3ssp2F_w8KqzDWPRLmRkdoC_Q7Tmk0-6h2_i1h_noh2MpAbOI2wFgSxhV_cJMfoJ-4Z4zS7/s1600/cw-isometric-error.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHsVk4iU7zvEDqt4Z93kblA7Ak-LWwkduzUh_8yvj_iMdzKzdwyWOP6NiffotW-SjUN5ovv3ssp2F_w8KqzDWPRLmRkdoC_Q7Tmk0-6h2_i1h_noh2MpAbOI2wFgSxhV_cJMfoJ-4Z4zS7/s1600/cw-isometric-error.jpg" /></a></div><br />
The problems were partly caused by placing tiles adjacent to each other when they didn't have matching sides (i.e. a side that is all grass should butt up against an all-grass side of another tile). In addition, the CSS styles for the tiles had the wrong offsets for identifying the positions of the sprites. The code in the article compensated for the style errors with some "corrections", which of course made it more confusing.<br />
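Independent of the CSS sprite details, the placement math for such a map is worth writing down. A sketch in Java (the tile size and names are illustrative, not taken from Christian's demo) of the standard 2:1 isometric projection:

```java
// Maps a tile's (row, col) grid position to screen pixel coordinates
// for a standard 2:1 isometric projection. TILE_W and TILE_H are
// illustrative values, not the demo's actual tile size.
public final class IsoMap {
  static final int TILE_W = 64; // width of the tile image, in pixels
  static final int TILE_H = 32; // height of the tile's top diamond

  /** Pixel x of the tile's top corner, relative to the map origin. */
  static int screenX(int row, int col) {
    return (col - row) * (TILE_W / 2);
  }

  /** Pixel y of the tile's top corner, relative to the map origin. */
  static int screenY(int row, int col) {
    return (col + row) * (TILE_H / 2);
  }
}
```

Each grid neighbor sits half a tile over and half a tile down in screen space, which is why a tile's sides must match the sides of all four of its diagonal-screen neighbors or gaps appear.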
<br />
I've posted a <a href="http://jsdabbler.appspot.com/home/jamessynge/isomap/random-isomap.html">corrected version here</a>. It has a table indicating the type of each tile side, allowing for the generation of a random map where each tile's neighbors are appropriate, such that there are no gaps. For example:<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_nF7n0MO-9c9iOe9didoFgo7SWrpMY1yCwGJDH7OcHH-ULSxPjP7F1W33LWC89NXE9nkQTl3briTX7HGST-MVOLsf_D0YJ7l1TwmrIdZJp0Aktd7scayPP9kdmBZUGy3cmJFFB8DYOSAk/s1600/Isometric+Tile+Map+Test.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="177" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_nF7n0MO-9c9iOe9didoFgo7SWrpMY1yCwGJDH7OcHH-ULSxPjP7F1W33LWC89NXE9nkQTl3briTX7HGST-MVOLsf_D0YJ7l1TwmrIdZJp0Aktd7scayPP9kdmBZUGy3cmJFFB8DYOSAk/s320/Isometric+Tile+Map+Test.png" width="320" /></a></div>Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com1tag:blogger.com,1999:blog-8949810268237856091.post-10702357846234444582011-05-08T10:37:00.000-04:002011-05-08T16:16:11.242-04:00Writing a Windows Service in Java<style type="text/css">
tt, .jms-inline-code {
  font-family: "Courier New", Courier, monospace;
  font-size: 1em;
}
</style><br />
I've got a couple of Java programs that I want to leave running permanently on my laptop, so I set about creating a Windows Service. I investigated several of the alternatives (<a href="http://wrapper.tanukisoftware.com/">Java Service Wrapper</a>, <a href="http://yajsw.sourceforge.net/">Yet Another Java Service Wrapper</a>, etc.), but having recently discovered <a href="http://jna.java.net/">Java Native Access</a> (JNA), I decided to see if I could produce a fairly lightweight solution.<br />
<br />
JNA (and <a href="http://sourceware.org/libffi/">libffi</a> on which it depends) provides a means to dynamically create a bridge between Java and native libraries, a feature I've been wanting for years. I'd used JNI in the past, but I find it rather brittle. I've been pleasantly surprised to find that even when I made various mistakes with JNA, the JVM didn't crash; instead, JNA threw relatively meaningful exceptions, such as when it was unable to bridge the gap between a call from Windows to a Java method I wrote to accept that call (i.e. the argument types I used weren't appropriate). For example, the Windows Service API requires the service to implement a <a href="http://msdn.microsoft.com/en-us/library/ms685138(v=VS.85).aspx"><tt>ServiceMain</tt></a> function which will be called by Windows:<br />
<pre class="brush: java">VOID WINAPI ServiceMain(
  __in DWORD dwArgc,
  __in LPTSTR *lpszArgv
);
</pre>That <tt>lpszArgv</tt> is a pointer to an array of pointers to strings that will be passed to the <tt>ServiceMain</tt> function. I wanted to define a type, <tt>ReceiveStringArray</tt>, extending <tt>Pointer</tt>, as the type of <tt>lpszArgv</tt> in Java. Unfortunately, JNA could not bridge the gap from the native arguments to <tt>ReceiveStringArray</tt>, so I had to fall back to using JNA's <tt>Pointer</tt> as its type; fortunately, <tt>Pointer</tt> has a method, <tt>getStringArray</tt>, that handles exactly the translation needed here.<br />
<pre class="brush: java">interface SERVICE_MAIN_FUNCTION extends StdCallCallback {
  /**
   * ServiceMain is the main method of the service. It should return only
   * once the service is stopped.
   *
   * @param dwArgc The number of strings pointed to by argv.
   * @param argv A pointer to an array (of length dwArgc)
   *             of pointers to strings.
   */
  void ServiceMain(int dwArgc, Pointer argv);
}
</pre>It may be that I was missing some appropriate constructor in ReceiveStringArray, or that a JNA TypeMapper was needed to handle initializing a ReceiveStringArray in this situation.<br />
<br />
Another interesting challenge I had was that the first of my callback functions, <tt>ServiceMain</tt>, was being called at the expected time, but it was also being called when I expected a different callback function to be called instead. I'm not certain, but I think this is because JNA doesn't really care about the name of the method in a callback interface; the interface must declare a single method, and that is the one that will be invoked. I had created two interfaces, but only a single class implementing them both, so I suspected I needed a separate class (or instance?) to receive each type of callback.<br />
<br />
<b><tt style="font-size: large;">StartServiceCtrlDispatcher</tt></b><br />
A typical Windows Service is basically just a program that runs in the background with no direct user interaction, plus a few extra required interactions with Windows. When the program starts it must call <tt><a href="http://msdn.microsoft.com/en-us/library/ms686324(v=VS.85).aspx">StartServiceCtrlDispatcher</a></tt>, passing in a table of the services to be started (typically just one). The dispatcher starts another thread to run <tt>ServiceMain</tt>, dispatches various Windows events to the service (e.g. when Windows is shutting down), and returns once the service stops.<br />
<pre class="brush: java">import com.sun.jna.Native;
import com.sun.jna.Structure;
import com.sun.jna.platform.win32.Advapi32;
import com.sun.jna.win32.W32APIOptions;
public interface ExtendedAdvapi32 extends Advapi32 {
  ExtendedAdvapi32 INSTANCE = (ExtendedAdvapi32) Native.loadLibrary(
      "Advapi32", ExtendedAdvapi32.class, W32APIOptions.UNICODE_OPTIONS);

  class SERVICE_TABLE_ENTRY extends Structure {
    public String serviceName;
    public SERVICE_MAIN_FUNCTION serviceProc;
  }

  boolean StartServiceCtrlDispatcher(SERVICE_TABLE_ENTRY[] lpServiceTable);
}
</pre><div>The API doesn't include a length parameter to tell the dispatcher how many services are implemented by the program. Instead, the end of the table is marked by an entry whose pointers are NULL. Therefore, while the program doesn't need (at this point) the name of the service it is implementing, the serviceName field of a real entry must not be NULL; set it to an empty string instead. For example:</div><pre class="brush: java">ExtendedAdvapi32.SERVICE_TABLE_ENTRY entry =
    new ExtendedAdvapi32.SERVICE_TABLE_ENTRY();
entry.serviceName = "";
entry.serviceProc = someServiceMainFunction;
// toArray(2) allocates two contiguous entries; the second is left
// zero-filled (i.e. NULL pointers), marking the end of the table.
ExtendedAdvapi32.SERVICE_TABLE_ENTRY[] serviceTable =
    (SERVICE_TABLE_ENTRY[]) entry.toArray(2);
boolean result =
    ExtendedAdvapi32.INSTANCE.StartServiceCtrlDispatcher(serviceTable);
</pre><br />
<b><tt style="font-size: large;">ServiceMain</tt></b><br />
<div style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px;">The <tt>ServiceMain</tt> callback function is invoked on another thread from the main thread, and shouldn't return until the service stops (usually when Windows is shutting down, but it can also be stopped and started via a control panel). The function is passed an array of strings (as an argc and argv, just like a C program's main). The first string in the array is the name of the service. The following elements of the array are the "Start Parameters". These can be set in the service's Properties dialog box:</div><div class="separator" style="clear: both; text-align: center; margin-top: 0.5em;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgg66G-CfKha9a0KnslBwh9PsvjeAI5lcHT87B-84RcgMPthKAN_IGrOj0KwcqDya6G1qC4fdAgYms0ckOMMFqa_1RIt6d2_IhQ_hsaX40CdwrTcWI5TgNZkLTevGVMAexJfa6cr64__6AH/s1600/windows-service-start-parameters.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgg66G-CfKha9a0KnslBwh9PsvjeAI5lcHT87B-84RcgMPthKAN_IGrOj0KwcqDya6G1qC4fdAgYms0ckOMMFqa_1RIt6d2_IhQ_hsaX40CdwrTcWI5TgNZkLTevGVMAexJfa6cr64__6AH/s1600/windows-service-start-parameters.jpg" /></a></div><tt>ServiceMain</tt> needs to call <tt>RegisterServiceCtrlHandlerEx</tt> to provide the service control dispatcher with a function to be invoked when Windows notifies the service of events (e.g. when Windows is shutting down, a user logs in or out, or there is a change in connected hardware).<br />
<pre class="brush: java">public interface ExtendedAdvapi32 extends Advapi32 {
  interface HandlerEx extends StdCallCallback {
    int serviceControlHandler(int serviceControlCode, int eventType,
        Pointer eventData, Pointer context);
  }

  class SERVICE_STATUS_HANDLE extends HANDLE {
    public SERVICE_STATUS_HANDLE() { }
    public SERVICE_STATUS_HANDLE(Pointer p) { super(p); }
  }

  SERVICE_STATUS_HANDLE RegisterServiceCtrlHandlerEx(
      String serviceName, HandlerEx handler, Object context);
}
</pre>Next, <tt>ServiceMain</tt> must call <a href="http://msdn.microsoft.com/en-us/library/ms686241(v=VS.85).aspx"><tt>SetServiceStatus</tt></a> to inform Windows that the service is running, and of the types of event notifications the service wants to receive.<br />
<pre class="brush: java">public interface ExtendedAdvapi32 extends Advapi32 {
  static final int SERVICE_WIN32_OWN_PROCESS = 0x00000010;

  class SERVICE_STATUS extends Structure {
    public int serviceType = SERVICE_WIN32_OWN_PROCESS;
    public int currentState = 0;
    public int controlsAccepted = 0;
    public int win32ExitCode = W32Errors.NO_ERROR;
    public int serviceSpecificExitCode = 0;
    public int checkPoint = 0;
    public int waitHint = 0;
  }

  boolean SetServiceStatus(SERVICE_STATUS_HANDLE serviceStatusHandle,
      SERVICE_STATUS serviceStatus);
}
</pre><div>For example:</div><pre class="brush: java">public interface ExtendedAdvapi32 extends Advapi32 {
  static final int SERVICE_RUNNING = 0x00000004;
  static final int SERVICE_ACCEPT_SHUTDOWN = 0x00000004;
  static final int SERVICE_ACCEPT_STOP = 0x00000001;
}

SERVICE_STATUS serviceStatus = new SERVICE_STATUS();
serviceStatus.currentState = ExtendedAdvapi32.SERVICE_RUNNING;
serviceStatus.controlsAccepted = (
    ExtendedAdvapi32.SERVICE_ACCEPT_STOP |
    ExtendedAdvapi32.SERVICE_ACCEPT_SHUTDOWN);
ExtendedAdvapi32.INSTANCE.SetServiceStatus(serviceStatusHandle,
    serviceStatus);
</pre>At this point the service can do its job, but it must also have some way to be notified that it is time to stop (e.g. via a flag shared between the <tt>ServiceMain</tt> thread and the service control handler). When <tt>ServiceMain</tt> learns that it needs to stop, it must tell Windows that it has stopped before returning.<br />
<pre class="brush: java">public interface ExtendedAdvapi32 extends Advapi32 {
  static final int SERVICE_STOPPED = 0x00000001;
}

SERVICE_STATUS serviceStatus = new SERVICE_STATUS();
serviceStatus.currentState = ExtendedAdvapi32.SERVICE_STOPPED;
serviceStatus.controlsAccepted = 0; // Accept no more notifications.
ExtendedAdvapi32.INSTANCE.SetServiceStatus(serviceStatusHandle,
    serviceStatus);
</pre><br />
<div><span style="font-size: large;"><b>Service Control Handler (<tt>HandlerEx</tt>)</b></span></div>The service control dispatcher calls the service's control handler when the requested events occur. To support just shutdown and stop events, this suffices:<br />
<pre class="brush: java">public interface ExtendedAdvapi32 extends Advapi32 {
  static final int SERVICE_CONTROL_SHUTDOWN = 0x00000005;
  static final int SERVICE_CONTROL_STOP = 0x00000001;
  // Must return NO_ERROR for this, not ERROR_CALL_NOT_IMPLEMENTED.
  static final int SERVICE_CONTROL_INTERROGATE = 0x00000004;
}

public int serviceControlHandler(int serviceControlCode, int eventType,
    Pointer eventData, Pointer context) {
  switch (serviceControlCode) {
  case ExtendedAdvapi32.SERVICE_CONTROL_INTERROGATE:
    return W32Errors.NO_ERROR;
  case ExtendedAdvapi32.SERVICE_CONTROL_SHUTDOWN:
  case ExtendedAdvapi32.SERVICE_CONTROL_STOP:
    // TODO Signal ServiceMain to stop.
    return W32Errors.NO_ERROR;
  default:
    return W32Errors.ERROR_CALL_NOT_IMPLEMENTED;
  }
}
</pre><br />
<div><b><span style="font-size: large;">Encapsulating the Windows API</span></b></div>To avoid polluting the 'pure' Java with all of the above, I defined the following simple interface that my services would implement (which I can use on other operating systems):<br />
<pre class="brush: java">public interface ISimpleService {
  int run(String[] args);
  void stop();
}
</pre>The return value from run could (in a slightly more complicated solution) be used to set the <tt>SERVICE_STATUS.serviceSpecificExitCode</tt> field.<br />
<br />
These two classes are used to invoke the methods of <tt>ISimpleService</tt>:<br />
<pre class="brush: java">class SimpleServiceControlHandler implements HandlerEx {
  private final ISimpleService service;

  public SimpleServiceControlHandler(ISimpleService service) {
    this.service = service;
  }

  public int serviceControlHandler(int serviceControlCode, int eventType,
      Pointer eventData, Pointer context) {
    switch (serviceControlCode) {
    case ExtendedAdvapi32.SERVICE_CONTROL_INTERROGATE:
      return W32Errors.NO_ERROR;
    case ExtendedAdvapi32.SERVICE_CONTROL_SHUTDOWN:
    case ExtendedAdvapi32.SERVICE_CONTROL_STOP:
      service.stop();
      return W32Errors.NO_ERROR;
    default:
      return W32Errors.ERROR_CALL_NOT_IMPLEMENTED;
    }
  }
}

class SimpleServiceMain implements SERVICE_MAIN_FUNCTION {
  private final ISimpleService simpleService;
  private final SimpleServiceControlHandler handler;
  private SERVICE_STATUS_HANDLE serviceStatusHandle;

  public SimpleServiceMain(ISimpleService simpleService,
      SimpleServiceControlHandler handler) {
    this.simpleService = simpleService;
    this.handler = handler;
  }

  public void ServiceMain(int argc, Pointer argv) {
    if (argc < 1 || argv == null) {
      // Missing the service name.
      return;
    }
    try {
      String[] args = argv.getStringArray(0, argc, true);
      String serviceName = args[0];
      String[] startParameters = Arrays.copyOfRange(args, 1, args.length);
      serviceStatusHandle =
          ExtendedAdvapi32.INSTANCE.RegisterServiceCtrlHandlerEx(
              serviceName, handler, null);
      setServiceStatus(ExtendedAdvapi32.SERVICE_RUNNING,
          ExtendedAdvapi32.SERVICE_ACCEPT_STOP |
          ExtendedAdvapi32.SERVICE_ACCEPT_SHUTDOWN);
      simpleService.run(startParameters);
    } finally {
      setServiceStatus(ExtendedAdvapi32.SERVICE_STOPPED, 0);
    }
  }

  private void setServiceStatus(int currentState, int controlsAccepted) {
    SERVICE_STATUS serviceStatus = new SERVICE_STATUS();
    serviceStatus.currentState = currentState;
    serviceStatus.controlsAccepted = controlsAccepted;
    ExtendedAdvapi32.INSTANCE.SetServiceStatus(
        serviceStatusHandle, serviceStatus);
  }
}
</pre>And a function to create instances and start the service control dispatcher:<br />
<pre class="brush: java">public static void runSimpleService(ISimpleService service) {
  SimpleServiceControlHandler handler =
      new SimpleServiceControlHandler(service);
  SimpleServiceMain serviceMain =
      new SimpleServiceMain(service, handler);
  SERVICE_TABLE_ENTRY entry = new SERVICE_TABLE_ENTRY();
  entry.serviceName = "";
  entry.serviceProc = serviceMain;
  SERVICE_TABLE_ENTRY[] serviceTable =
      (SERVICE_TABLE_ENTRY[]) entry.toArray(2);
  ExtendedAdvapi32.INSTANCE.StartServiceCtrlDispatcher(serviceTable);
}
</pre><br />
<b><span style="font-size: large;">Example Service</span></b><br />
Here is a trivial service that just waits to be stopped.<br />
<pre class="brush: java">public class WindowsServiceHandlerExample implements ISimpleService {
  private final CountDownLatch latch = new CountDownLatch(1);

  public int run(String[] args) {
    try {
      latch.await();
    } catch (InterruptedException e) {
    }
    return 0;
  }

  public void stop() {
    latch.countDown();
  }

  public static void main(String[] args) {
    WindowsServiceUtil.runSimpleService(new WindowsServiceHandlerExample());
  }
}
</pre>Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com6tag:blogger.com,1999:blog-8949810268237856091.post-83697022850561616412011-02-26T15:54:00.000-05:002011-02-26T15:54:14.489-05:00Distributing WorkDuring November and December of last year I developed my checkers program to the point where I could run all three of the experiments conducted by Fogel and Chellapilla. The longest ran for 5 days on my 2-processor laptop, which indicates that it'll take months to run some of the experiments I have in mind.<br />
<br />
Since I'm doing this exercise for fun, and I like building "systems", I decided to make a system for distributing work (games to be played) to other machines in the house. I've developed a somewhat complicated Java web app (using embedded Jetty) that manages a jar repository and runs (in subprocesses) Java programs whose descriptions are uploaded. The description consists of jar names and jar hashes (to avoid running the wrong version of a jar as I'm developing), a main class name, and command line. The Java program is provided with a work area in the file system, to which the remote client can upload any necessary files prior to starting the program, and from which the remote client can download any files after the program is done, including files containing the output of the stdout and stderr streams.<br />
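Computing the jar hashes for those descriptions is straightforward with the JDK. A sketch of the idea (the helper name is illustrative, and any strong digest would do; SHA-256 is shown here):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Computes a digest of a jar file, so the server can detect when an
// uploaded program description refers to a stale or wrong jar.
public final class JarHasher {
  /** Returns the SHA-256 digest of a file as a lowercase hex string. */
  public static String sha256(String path)
      throws IOException, NoSuchAlgorithmException {
    MessageDigest md = MessageDigest.getInstance("SHA-256");
    InputStream in = new FileInputStream(path);
    try {
      byte[] buf = new byte[8192];
      int n;
      while ((n = in.read(buf)) != -1) {
        md.update(buf, 0, n);
      }
    } finally {
      in.close();
    }
    return toHex(md.digest());
  }

  public static String toHex(byte[] bytes) {
    StringBuilder sb = new StringBuilder(bytes.length * 2);
    for (byte b : bytes) {
      sb.append(String.format("%02x", b & 0xff));
    }
    return sb.toString();
  }
}
```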
<br />
I say "somewhat complicated" because I didn't make the decision to use embedded Jetty until late in the process, and if I'd made it earlier, I would have used more of Jetty's facilities to handle the file management.<br />
<br />
To simplify creating the description of a Java program, I created a mechanism for walking the current process's classpath, creating jars for unpacked entries (e.g. the bin folders of my Eclipse projects), and uploading them to the various servers.<br />
<br />
Next up I'm working on how to transfer units of work from the main program to the workers, and how to get the responses back. My current plan is to have the main program run an embedded web server (Jetty) with a servlet that a worker will contact to get a unit of work (e.g. two checkers players to compete), and will then re-contact with the result of the unit of work (e.g. the game outcome).<br />
<br />
This mechanism is certainly sufficient to the task, but feels a bit awkward. I think I'd like some variant of a ThreadPoolExecutor that supports distributed threads, but I've not located nor invented such a beast. Sigh.<br />
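What I have in mind is roughly an ExecutorService whose tasks can be shipped over the wire. A sketch of the shape of that interface, with a trivial in-process implementation standing in for a remote one (every name here is hypothetical):

```java
import java.io.Serializable;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// The abstraction I'm wishing for: like ExecutorService, but tasks and
// results must be Serializable so an implementation could ship them to
// remote workers.
interface DistributedExecutor {
  <T extends Serializable> Future<T> submit(Callable<T> task);
  void shutdown();
}

// A stand-in implementation that just runs tasks in-process; a real one
// would hand each task to a worker machine and return a Future for the
// eventual reply.
class LocalDistributedExecutor implements DistributedExecutor {
  private final ExecutorService delegate = Executors.newFixedThreadPool(2);

  public <T extends Serializable> Future<T> submit(Callable<T> task) {
    return delegate.submit(task);
  }

  public void shutdown() {
    delegate.shutdown();
  }
}
```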
<br />
More later.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0tag:blogger.com,1999:blog-8949810268237856091.post-16138097923922956572010-12-29T09:15:00.000-05:002010-12-29T09:15:25.714-05:00Yes, Virginia, Testing is Important!In my zeal to optimize my <a href="http://enigma2eureka.blogspot.com/2010/12/evaluating-checker-board-with-feed.html">checkers-playing program</a>, I included a number of optimizations in the search algorithm: minimax with alpha-beta pruning, plus a memory of previous partial search trees used to order future searches and to skip evaluations whose results are already known. Too many optimizations, as it happens.<br />
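For concreteness, the un-optimized baseline pair looks roughly like this: plain minimax, and an alpha-beta search that must return the same value at the root. This is a sketch over an explicit game tree; the Node type is illustrative, as the real program searches checkers positions with a neural network evaluating the leaves.

```java
import java.util.List;

public final class Search {
  /** A game-tree node; leaves carry the evaluation value. */
  public static final class Node {
    final double value;        // used only at leaves
    final List<Node> children; // empty for a leaf
    public Node(double value, List<Node> children) {
      this.value = value;
      this.children = children;
    }
  }

  /** Plain minimax: the slow but trustworthy reference implementation. */
  public static double minimax(Node n, boolean maximizing) {
    if (n.children.isEmpty()) {
      return n.value;
    }
    double best =
        maximizing ? Double.NEGATIVE_INFINITY : Double.POSITIVE_INFINITY;
    for (Node child : n.children) {
      double v = minimax(child, !maximizing);
      best = maximizing ? Math.max(best, v) : Math.min(best, v);
    }
    return best;
  }

  /** Minimax with alpha-beta pruning; must agree with minimax() at the root. */
  public static double alphaBeta(Node n, double alpha, double beta,
      boolean maximizing) {
    if (n.children.isEmpty()) {
      return n.value;
    }
    for (Node child : n.children) {
      double v = alphaBeta(child, alpha, beta, !maximizing);
      if (maximizing) {
        alpha = Math.max(alpha, v);
      } else {
        beta = Math.min(beta, v);
      }
      if (alpha >= beta) {
        break; // prune: the opponent will never allow this line
      }
    }
    return maximizing ? alpha : beta;
  }
}
```

Running both over the same trees and asserting that the root values agree is exactly the kind of test that would have caught my bugs earlier.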
<br />
I wrote quite a few JUnit tests for the basic checkers board manipulation code, but wasn't sure how to go about testing the search algorithm. I wrote some very basic tests that the pruning was happening as expected, but left it at that.<br />
<br />
I then wrote a system for evolving the neural networks, and repeated the evolution experiments described by Fogel and Chellapilla. Again, I wasn't quite sure how to evaluate the results, as I'm not yet ready to play hundreds of games online using the best evolved network as my evaluation function.<br />
<br />
So, I studied the last generation's networks and the games they played, and noticed that they'd all played the same first move... and the same first reply... and the same next reply! The first three moves were all the same. Wow! Did they all stumble upon the best opening moves? Or perhaps all of the surviving networks were descended from a relatively recent network that made that choice?<br />
<br />
I decided to confirm that the initial generation of networks was much more random in its play. Unfortunately, upon reviewing their games, I found that they, too, played the same first three moves. Sigh. Back to the drawing board.<br />
<br />
To track this down, I tried shuffling the legal move choices presented to the search algorithm, but that made no difference. Next, I implemented minimax and compared its choices to those of my enhanced alpha-beta search. As you might expect, they didn't agree. So, I implemented a basic alpha-beta search, which did agree with minimax. In the course of debugging, I found several mistakes in the over-optimized alpha-beta implementation, and I don't think I've found them all. Fortunately, testing will help me figure out when I've got it wrong, and hopefully the next 1000+ generations of evolution will be more fruitful.Anonymoushttp://www.blogger.com/profile/05694832952534074799noreply@blogger.com0