## Howard Hughes Medical Institute @ Stanford

After my PhD I left UW-Madison for Prof. Mark Schnitzer's group at Stanford and HHMI, where I worked with laser physicists and biologists to design automated neuroscience systems that increase experimental throughput. These robotic systems are more than a convenience: they can eliminate the need for sedatives, permit re-sampling of the same neurons over long durations, enable simultaneous observation of multiple, disparate brain regions, tighten experimental controls, and reduce experimenter workloads. I found this work interesting because these systems naturally inhabit unexplored design regimes and required varied and creative systems, mechanical, electrical, and software engineering. Here's a longer overview of their research.

My primary project was reconstructing, improving, and rearchitecting the fly picking robot, seen here in its original form.

This robot is the first step of an automated fly experiment system, where we need to gain custody of a freely behaving fruit fly and prepare it for subsequent experiments.  This paper has more details on the overall vision.

I also designed and built a 5D remote center-of-rotation kinematic mechanism to enable observation of challenging areas of the mouse brain.  I’ll include a longer discussion in a later post.

I chose to live in Palo Alto and bike commute 6-8 miles every day. This was great for my health (collarbone excepted), and avoiding a car commute helped make Palo Alto more tolerable. My second apartment was near Page Mill, and most weekends I biked west and up one of the great Portola climbs. Running the Stanford Dish or in the Santa Cruz Mountains was also quite fun, though I missed the forested, verdant trail running I had in Wisconsin. And it was great being 3 hours from Tahoe skiing, though I was never convinced that it was winter when winter was 3 hours away. There is much to criticize in the quality of life in the Bay Area: I found the prevalent socioeconomic class distinctions jarring, and I became increasingly doubtful that residents would act to fix their broken cities. Alas. I have some thoughts on the way out of this more generally; we'll save them for a longer post.

## If I Were: Macy's/JCPenney

The first way to view the ongoing struggles of department stores like Macy's, JCPenney, Younkers, and others is by likening them to their internet adversaries. From this vantage we see their hip, sprawling, prime-retail stores as grossly inefficient warehouses, bleeding margin on:

• customer acquisition: advertising to get people in the door
• labor: presentation, refolding, cashiers, and cleaning
• storefront: cost/sqft to be in the (strip)mall

These costs recur, so for any item we can imagine (and, if we had the data, could calculate) the daily cost to sell that item, or equivalently see how every additional day on the shelf further decreases the potential profit on that item. Now, the whole idea of department stores is that profits in one seasonal department can offset losses in others, so this is simplistic, but it communicates the basic challenge of retail.
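To make the shelf-day framing concrete, here is a toy calculation; every number in it is invented for illustration, not real retail data.

```python
# Hypothetical illustration: how recurring daily carrying costs
# (acquisition + labor + storefront, per item-day) erode an item's margin.

def remaining_margin(price, cogs, daily_cost, days_on_shelf):
    """Gross margin left after paying carrying costs for each shelf day."""
    return (price - cogs) - daily_cost * days_on_shelf

price, cogs = 40.00, 22.00   # sticker price and cost of goods (invented)
daily_cost = 0.45            # carrying cost per item-day (invented)

for days in (0, 10, 20, 40):
    print(days, round(remaining_margin(price, cogs, daily_cost, days), 2))
```

Under these made-up numbers the item's $18 margin is gone after 40 days on the shelf, which is the sense in which each additional shelf day is a real cost.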

One way to move more product and offset the cost of physical retail is to also sell online, using your brand loyalty to compete directly with the online-only retailers.  In some cases this loyalty can sustain higher prices, but in many cases it seems that department stores must price-match online-only retailers product-for-product.  And since the department stores’ cost of inventory is higher than online-onlys’, they must accept reduced profits. (Many were able to make up this loss by their close relationships with leading brands, potentially giving them access to wider product variety, better product targeting to regional stores, and likely better terms.)

But people still shop for clothes in person, suggesting that the stores provide some value that they're not capturing today. So the second view on department store struggles is through their historic value proposition of convenience, selection, and reasonable cost/frequent sales. With in-store items priced slightly higher than online, we're left with an immediacy that next-day shipping can't quite match and a product selection that, while narrower than online's, can be classified and filtered to a much greater extent by personal criteria.

Given this, I wonder how a showcasing model would change these underlying businesses. Instead of selling customers in-store items, the retailer should prefer, say, two-day shipping from the regional distribution center over depleting the in-store inventory. The retailer faces a choice between selling an in-store unit with 50% of the original margin remaining and shipping inventory from a more cost-efficient warehouse where, say, 95% of the original margin remains. If ship-to-home is the default, advertised-sale price, the retailer could still sell in-store items at a slight $5/5% markup, a soft preference for selling items from the retailer's most efficient units. Moreover, by shifting the retailer's distribution strategy from inventory-on-shelves to more efficient warehouses, they might better compete with online retailers by aping their efficiencies: in no world does it make sense to expose 5 identical units of every single item to customers; this is just an artifact of the era when the store was the warehouse. So, if I were Macy's/JCPenney, I'd seal the deal in person and deliver in two days.

## debator//debater

A machine to help us communicate…

The last time I had free nights and weekends, 2011, our country was growing predictably chipper over the coming election, so I spent some of that free time considering debator//debater. The idea was simple: build a machine to help us communicate. I wondered whether personal relationships could be leveraged to draw friends into better discussions of political things, whether that discussion might benefit from ready access to conversation aides, and whether the participants could be encouraged beyond the cable news/talk radio and red/blue strawmen.
Fundamentally, I thought (and think) that people desire increased good for themselves, their family, friends, and broader culture, that they would find it disconcerting to disagree with people they already trust and interact with, and that these relationships would have the best chance of drawing people together, into some greater understanding of and respect for our mutual interests and individual concerns.

Now, this idea struck some friends as obviously bad: that once you start down the partisan debate path, forever will it taint the relationship. I think that's both wrong and unfortunate. It's unfortunate because it is the willful maintenance of a veneer, a retreat from true, honest conversation and, actually, a diminution of the friendship. That's Facebook; that's the echo chamber we have today.

Instead, the (benevolent!) platform could strategically choose topics for conversation and then support the users in the formulation and conduct of a discussion. So, in the case that uncle Jimbo had posted/tweeted articles and messages critical of the latest IPCC report on climate change, and nephew Jimmy the converse, our platform would begin with their relationship and detect that the article content and audience graphs of these two relatives only minimally overlapped. From there it would prompt each person to review the other's endorsements and encourage them to pose questions to each other. Since neither participant is an expert in climate change, the platform would use the same article graph to find and suggest bridge articles that appear to span the difference in perspective and might serve as a basis for shared knowledge and increased accord. By encouraging each person to question the other (filtering, at least crudely, against attacks) within the context of a shared set of information, I hoped that they would come to a better understanding of each other's perspective, their own, their culture, and their world.
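The overlap detection and bridge-article suggestion can be sketched very crudely; all names, article sets, and co-endorsement data below are invented for illustration.

```python
# Toy sketch of the bridge-article idea: measure how little two users'
# endorsed-article sets overlap, then propose articles co-endorsed by
# readers from BOTH camps. All data here is hypothetical.

def jaccard(a, b):
    """Set-overlap score in [0, 1]; low values suggest divided audiences."""
    return len(a & b) / len(a | b)

jimbo = {"ipcc-critique", "sunspot-cycles", "model-uncertainty"}
jimmy = {"ipcc-report", "sea-level-data", "model-uncertainty"}

print(round(jaccard(jimbo, jimmy), 2))  # minimal overlap flags this pair

# Candidate bridges: articles whose co-endorsements touch both sets.
co_endorsed = {
    "climate-econ-primer": {"ipcc-critique", "ipcc-report"},
    "celebrity-gossip":    {"sunspot-cycles"},
}
bridges = [a for a, refs in co_endorsed.items()
           if refs & jimbo and refs & jimmy]
print(bridges)
```

A real platform would reason over a much richer author/reader graph, but even this toy version captures the two steps: detect the divide, then surface material that spans it.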
And this would be valuable to them, to our society, and to the many interested parties. I believed this interaction was enabled by reasoning over social platforms and by simply encouraging people to seek explanations for their positions and, in citing them, to defend their sources and arguments against those mustered charitably by their friend. So, the platform should be a quest to discover truth, at first between two friends and likely reaching a trivial depth, but it had the potential to stoke something other than partisan, tribal cynicism, of which we have far too much.

I believed this system was technologically possible 5 years ago (2011); it is even more the case today, and this is slowly being realized in a predictably breathless manner. Those all referred to Facebook, and indeed its seat high atop Mt. Data enables it to rain benefits and harms on those below according to, essentially, its Ferengi whim. While there are rational fears about the power of big data (with the right analysis, Facebook can create knowledge, and hence powers, that no other entity save the NSA can approach), it need not be exercised so insidiously. I very much believe technological advances are tools that we can choose to apply for our benefit; so the contrast between what Facebook is doing and what I wanted to do is the user's choice. Facebook wants to drive greater engagement to expose its users to more ads, to encourage consumerism. The problem is that Facebook is not helping people; it seeks to capitalize on their preferences, fears, and weaknesses rather than aid their discovery, growth, and participation in the world. It and many other platforms today provide a useful service, but they could do so much more.
## Enable your customers

Here's one of many articles describing the challenge of maintaining personal privacy amidst the spread of data brokers: http://www.mcclatchydc.com/news/nation-world/national/national-security/article166488597.html

The core and often unacknowledged challenge here is the asymmetry: people will generally modify their behavior (self-censor, in some contexts) according to their surroundings and present purposes. This ability falters when we do not know what our friends and adversaries know about us, preventing us from leveraging our friendships or guarding against their power. So far we've avoided the utopic and dystopic extremes, but the classic example of Target illustrates the potential benefits and dangers:

> "My daughter got this in the mail!" he said. "She's still in high school, and you're sending her coupons for baby clothes and cribs? Are you trying to encourage her to get pregnant?" The manager didn't have any idea what the man was talking about. He looked at the mailer. Sure enough, it was addressed to the man's daughter and contained advertisements for maternity clothing, nursery furniture and pictures of smiling infants. The manager apologized and then called a few days later to apologize again. On the phone, though, the father was somewhat abashed. "I had a talk with my daughter," he said. "It turns out there's been some activities in my house I haven't been completely aware of. She's due in August. I owe you an apology."

These situations and extremes can be avoided. The first responsibility is on companies to interact ethically and honestly with their customers; that is essential. But I can see the temptation to wring customer data for pennies and, once some return has been achieved, to expand these datastores.
As Target stated in the preceding, "We've developed a number of research tools that allow us to gain insights into trends and preferences within different demographic segments of our guest population," and "Our mission is to make Target the preferred shopping destination for our guests by delivering outstanding value, continuous innovation and exceptional guest experience…"

Years ago, I complained on Twitter that companies like Roundy's, in the notoriously low-margin grocery sector, have customer loyalty programs but retain all of the data and conclusions for their internal use. Hiding the purpose and mechanism of these programs is one way to avoid customer ire, but this tactic is one misstep or abuse away from cataclysmic failure. I think it better to share the results with customers.

Given the limited shelf life of many products and the limited shelf space for all the rest, customer predictability is directly related to efficient stocking and pricing, but grocers could do much better than coupons in creating and maintaining product demand. Specifically, grocery stores should email or have an app that forecasts my likely grocery needs and serves as a starting point for the weekly grocery list. It is quite easy to curate a list of things that a given customer uses on a weekly or monthly basis and is probably close to running out of. This is a service to the customer, an aide to their busy life, but most importantly it shifts the customer's mindset from generally needing to get groceries to a plan to go to a particular grocery store for these specific items…and some others. Coupons could be issued in the email/app for ancillary purchases (Cool Whip® to go with the planned ice cream), but they're made more valuable because they can be applied to the customer's shopping list, reinforcing the commitment and articulating the expected savings. At this point it does not matter if other stores have the same prices and coupons; they can't match the convenience.
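The replenishment forecast itself needs nothing fancy; a minimal sketch, with invented purchase dates, is just "estimate each item's repurchase interval and flag what's overdue":

```python
# Toy replenishment forecast: from a customer's purchase dates per item,
# estimate the typical repurchase interval and flag items likely running
# out. Items and dates are invented for illustration.
from datetime import date

history = {
    "milk":   [date(2017, 6, 1), date(2017, 6, 8), date(2017, 6, 15)],
    "coffee": [date(2017, 5, 1), date(2017, 6, 1)],
}
today = date(2017, 6, 23)

def due_items(history, today):
    due = []
    for item, dates in history.items():
        gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
        interval = sum(gaps) / len(gaps)           # mean repurchase interval
        if (today - dates[-1]).days >= interval:   # overdue for a repurchase
            due.append(item)
    return due

print(due_items(history, today))
```

A real system would handle seasonality and substitutions, but even this mean-interval version yields the weekly starting list described above.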
This system would naturally encourage online/pre-bagged grocery pickup, and it could give a very meaningful perspective on the customer's consumption trends. Again, grocers have the data and are currently using it to improve their processes while denying the same benefits and insights to their customers; if they open up access, both grocer and consumer can understand that consumption, and they can also collaborate and discover a more efficient relationship. So, I do not see a reason for companies to hide their insights from their customers, and I believe that the negative effects, both in bad PR and missed opportunities, exceed whatever benefit there is in deceiving customers. As we both know, it's been too long since I had a Twinkie.

## Solar Trains < Solar Railways

When I first saw the headline "India's first solar-powered train makes its debut," I envisioned something that actually grappled with their substantial energy requirements and somehow leveraged their uniquely distributed infrastructure. But, instead, the described pilot project puts solar panels on passenger cabin roofs for climate control. So, small potatoes. But recalling 2014's Solar (freakin') Roadways (mostly a bad idea), the better idea is to create solar railways: simply take the solar panels off the moving train and install them between the tracks.
I don't want to write a long post, so some bullets in favor of this idea…

• generated power isn't moving, so it's easier to efficiently feed into the railway/municipal grid (the power density of liquid fuels is largely unrivaled by any sort of storage, so I'd bet it is better, on the whole, to optimize trains for the efficient use of energy than to dictate what source it comes from; let the electricity go wherever it can be best consumed)
• trains aren't made heavier by the panels, or substantially modified (though, given that most freight trains are diesel generators feeding electric traction, it would be cool if tracks had an electrified rail for mountain climbs and descents)
• rights-of-way and site prep are minimal; the panels need only be designed to flex with, or be isolated from, train-induced track deformations
• panels are typically exposed to the sun along rural stretches
• panels would be cleared of debris by the regular passing of trains and kept free of encroaching weeds/branches by the same
• train-ground drag would be reduced by their smoother surface
• a distributed source of power for many rural uses

…and some limitations:

• panels are not angled to the sun (a Fresnel lens built into the glass protector could reduce losses with latitude)
• total collection area is limited to very long, thin sections directly beneath or adjacent to the rail
• these long, thin collection areas would require longer power transmission than the same collection area would otherwise (unless the rails themselves can be used, efficiently, as em waveguides–antennae)
• the installation is not secured, so theft/tampering is more of a challenge than with other rail infrastructure

I'm sure I've missed some attributes; comment if interested. I think the transmission issue is the most limiting, though while briefly searching about rails-as-waveguides, I saw this article about railway electrification…as it says, this is so obvious I'm a bit amazed it hasn't already occurred.
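For scale, a back-of-envelope estimate of what between-the-rails panels might yield per track-kilometer; every parameter here is my own assumption, not data from any rail project:

```python
# Back-of-envelope: power from flat panels laid between the rails.
# Assumed values: ~1.2 m usable width, 18% panel efficiency, 15%
# capacity factor for a flat rural installation, 1 kW/m^2 peak sun.

width_m = 1.2
efficiency = 0.18
capacity_factor = 0.15
peak_irradiance_w = 1000.0  # W/m^2

area_per_km = width_m * 1000.0                                # m^2 per track-km
peak_kw = area_per_km * peak_irradiance_w * efficiency / 1000.0
avg_kw = peak_kw * capacity_factor

print(round(peak_kw, 1), "kW peak per track-km")
print(round(avg_kw, 1), "kW average per track-km")
```

Tens of kW averaged per kilometer is modest, which is consistent with the transmission and collection-area limitations listed above: the value is in the already-developed right-of-way, not in raw density.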
So, maybe we can electrify rail corridors and install some generation in that developed but otherwise unused land; sure seems better than a few panels powering the AC.

## #NetNeutrality

As today's Day of Action on net neutrality draws to a close, I want to post a couple of thoughts from my FCC comment here.

First, it's important to say that net neutrality is not the ideal solution; competition is. We do not enjoy the fruits of competition because of substantially captured agencies like the FCC and heavy lobbying at the state and city levels. Major ISPs claim to innovate as they write and support ever more burdensome regulations that greatly limit the ability of new entrants to compete, while the federal level has approved so many mergers that the incumbents are almost entirely free of peer competition. Ever notice how the flashy Charter and Comcast commercials pitch the same bundles year after year? Net neutrality is not the solution, but its presence is far preferable to its repeal.

Fundamentally, I want my ISP to be a dumb pipe. Many bristle at this, but, really, all that I want is for them to convey my page requests and uploads to the address I specified and to communicate the responses to me. No more, no less. Do this better than any (hypothetical) competitor and you have my business. I do not want my data being inspected for ad targeting, page modification, third-party sale, etc., and I certainly do not want the pleasure of paying you multiple times to do your job. I don't care for cable tv or phone service or your crappy jingle. Stop wasting money on advertising and improve your quality of service; this stuff is only as complicated as you make it. Your customers cannot escape your monopoly, as your industry's worst overall consumer sentiment ranking shows every year.

I agree that encouraging or requiring every ISP to build out and maintain entirely parallel paths to each potential customer is a foolish and wasteful proposal, a telco strawman.
Rather, we need local loop unbundling, whereby the infrastructure is separated from the customer service and features, just as Republic Wireless is permitted to operate on AT&T and Verizon networks. It is not out of nobility that AT&T and Verizon allow Republic's use, but rather Congress' recognition that spectrum is limited and should be leveraged to the greatest possible degree. As Mike says, "So, the fight at the FCC matters, but the end game is Congress." Big changes to make US ISPs competitive and efficient businesses cannot come without Congressional or antitrust action, but we can try to prevent things from getting worse by supporting net neutrality. I cut the cord years ago because I could not justify the expense of cable tv nor bear the incessant inanity and banality of almost every channel, and I fear that giving the same cable and media companies the same control over the internet will result in the same destruction.

My FCC comment:

As a recently-graduated engineer, the internet is central to my work, hobbies, entertainment, and intellectual pursuits. Never before has such wealth been made so widely available, and never before has it reached so deeply into each user's life. Expanding equitable access to this body of knowledge and culture is a noble goal for the FCC to pursue, and while net neutrality is not the ideal mechanism to achieve this, its continuation is far preferable to its repeal.

Fundamentally, I want my ISP to be a dumb pipe. Many bristle at this, but, really, all that I want is for them to convey my page requests and uploads to the address I specified and to communicate the responses to me. No more, no less. Do this better than any (hypothetical) competitor and they'll have my business. I do not want my data being inspected for ad targeting, page modification, third-party sale, etc., and I certainly do not want to pay multiple times to ensure my data is safely and securely transmitted.
My communications are extensions of myself, and when there is only a single provider to communicate them to the wider internet, that provider is in a position of power over me. I trust my ISP to accurately, impartially, and indeed ignorantly convey random inquiries, moments of frustration, deep conversations, cultural interactions, searches for truth, and experiences out of my past.

Only in the days of dial-up, pre-cable internet did I enjoy service provider competition. Since then, living in Wisconsin and now California, there has only been a single broadband provider, Charter or Comcast. Any other internet service providers have been far too limited for regular use due to fundamental deficiencies in their technology. To keep the rate reasonable, as a college student I played the common game of threatening to cancel service so as to remain at introductory rates. It always amazes me that prices go up while the service stays constant. It also amazes me that the FCC, with subpoena authority, has no interest in the internal shifting of profits from ISP services to loss-leading tv and phone units, and takes consumer prices as remotely indicative of a healthy market. While Charter and Comcast continue to produce expensive, flashy commercials for the same crappy bundles, year after year, that tells me that their advertising is not performed for competitive reasons, and it certainly hasn't improved their customer satisfaction ratings. Instead, they desire to maintain their image as being in and with the times, to dull the pain as they abuse us. Our country is neatly divided between the major ISPs, and until Congress gets its act together to force local loop unbundling, net neutrality and regulations like it are the only check on my local monopoly's power.
Ben Conrad

## PhD Defense: Redundant Design and Adaptive Control of an Interleaved Continuum-Rigid Manipulator

This past Monday I completed my PhD in Mechanical Engineering, bringing to a close my years as a student and my time at UW-Madison. While I expect to break out parts of my research and dissertation in greater detail in the coming weeks, for now I want to post my dissertation and a recording of my presentation. Many people have contributed to my experience at Wisconsin, so I thank my parents, my advisor Mike Zinn, my committee, and many friends for making these years enjoyable and fruitful.

My dissertation (.pdf, 105MB) and abstract:

Continuum manipulator compliance enables operation in delicate environments at the cost of challenging actuation and control. In the case of catheter ablation of atrial fibrillation, the compliance of the continuum backbone lends an inherent safety to the device. This inherent safety frustrates attempts at precise, accurate, and fast control, limiting these devices to simple, static positioning tasks. This dissertation develops Interleaved Continuum-Rigid Manipulation, by which the hysteretic nonlinearities encountered in tendon-actuated continuum manipulators are compensated by discrete rigid joints located between continuum sections. The rigid joints introduce actuation redundancy, which an interleaved controller may use to avoid continuum nonlinearities and dynamic excitations, or to prefer particular configurations that may improve task accuracy, permit greater end-effector forces, or avoid environment obstacles. Two experimental systems explore the potential of these joints to 1) correct for actuation nonlinearities and enhance manipulator performance and 2) increase the manipulator's dexterous workspace. These experiments expose important design and control observations that were not apparent in the general robotic and continuum literature.
My presentation: slides (.pdf, 8.4MB)

## A Longer View On Academic Publishing Platforms And Innovation

TLDR: Patentese values superficial bets on the future of technology and society, to the detriment of technology and society.

I've been watching the relationship between researchers, publishing platforms, and IP evolve with some interest. To choose some examples: what technology will the next Amazon/Google/Facebook/Uber deploy? Today's unicorns consistently apply a novel assortment of known technologies to some large market beset by some inefficiency. In the absence of oracles, the next unicorns are guessable. If you can observe the pace of innovation in any particular field, say the number of publications per year, you quickly learn which technology sectors are interesting to academics and also receiving funding. If you see this rate of innovation increase, that signals that something new has occurred. And because researchers and funding agencies like to be fashionable, a quick semantic-similarity analysis across the literature in that increase will give some sense of what the excitement is about.

Most researchers want to make 'life' better, and most people are happy to pay for a better life. Since researchers are generally only excited by progress toward their discipline's goals, and since those researchers inhabit the same reality as their eventual consumers, what excites researchers will probably, eventually, ideally, be valuable to society. So, if you notice increased innovation in some sector and realize that, by the nature of that sector, growth will impact many, well, that's worth paying attention to.

Having some intuition, it's time to place bets. The form of the bet would differ by institution: funding agencies can target for some desired effect (say, Rep. Smith) and trolls could weight their acquisitions by maturity and potential scope (today's trolls as amateurs).
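The publications-per-year signal can be sketched crudely: flag years where a field's count jumps well above its recent trend. The counts and thresholds below are invented for illustration.

```python
# Toy innovation-rate detector: flag years whose publication count
# exceeds a multiple of the trailing mean. All counts are invented.

def surge_years(counts, window=3, threshold=1.5):
    """Return years where count > threshold * trailing-window mean."""
    flagged = []
    for i in range(window, len(counts)):
        year, n = counts[i]
        trailing = sum(c for _, c in counts[i - window:i]) / window
        if n > threshold * trailing:
            flagged.append(year)
    return flagged

hypothetical_field = [(2009, 120), (2010, 130), (2011, 150), (2012, 400),
                      (2013, 900), (2014, 2000)]
print(surge_years(hypothetical_field))
```

A real version would normalize against overall publication growth and pair the surge with the semantic-similarity pass described above, but the trigger itself is this simple.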
So, such a learned-intuition technology is agnostic to the ends (as ever), but the actors differ greatly in their ability to amass the underlying database and in the rewards for applying it. Who is positioned to leverage these learned intuitions? Funding agencies are doubly disadvantaged: grant reporting is sporadic and does not approach the rigor of (generally privately-held) peer-reviewed journals, while the ends will always be subject to the shifting winds of bureaucratic debate. (Moves toward open access, data, and analyses would remedy both of these, though they may be stymied by lobbying.) With the government's manifest inability to develop and apply new technologies, pessimism is warranted. Certain other entities are advantaged, as this relatively cheap method can extract more value out of already-held repositories.

I picture this as an hourglass, where fields of research may be analyzed (both semantically and in the author/referenced/viewer graph) to identify emerging trends. This is the broad, upper funnel. The observed trends may be entered into patents, where the patentese (the stilted language encountered in patents) can hide the absence of a reduced-to-practice, coherently-understood innovation. This is the hourglass' neck, with one ideal being a single patent that draws inspiration from many observations and is thereby able to make broad claims across products (the hourglass' expanding lower chamber). A form of this speculative patenting occurs in many university tech-transfer offices today, which grasp at any IP in a projected-to-be-sexy market, but it is greatly improved by the intuition wrung from a large database.

The problem, the social harm, is that the standard of proof differs between the literature, patents, and the market, and so do the rewards. Researchers, 'the literature,' broadly value interesting, well-posed, and thoroughly-explained experiments.
New works are valued by their novelty, by the degree to, and manner in, which they solve the associated problem…and not by their sweeping claims. Good work is not immediately and individually rewarded, but appreciated in aggregate through greater grant success and honorariums. The market has a related interest in things that work, that solve the consumer's problem…and little patience for those that do not. It does not reward ideas but their execution, and there is a federal agency to restrict claims to reality (the FTC in its consumer protection role).

Between the researcher and the market lie patents; on one side patents draw inspiration from disparate developments, and on the other they seek to claim parentage of broad swaths of future products. The reward is (in the US, typically) a 20-year monopoly over all possible renditions of the claimed idea, far beyond that which was realized during the original application. Whereas both the literature and the market incent specificity, the patent system incents vagueness.

Combining the aggregation of the academic literature into large, machine-readable databases with big-data analyses can* yield patentable claims. While there are probably big-data ways of evaluating market potential for determining the risk of particular claims, I suspect researcher interest to be a good proxy. (At least in some domains, as great interest and excitement in the newest particle or complexity does not a market indicate.) I imagine the typical result to be patents like Myriad's BRCA-1 test or the CRISPRs: things that are close to the literature, dressed up in patent language. The patent application is not going to be reviewed for actual utility (as the FDA does), nor is the patent examiner going to verify that the claims are possible, only that they are plausible given the state of knowledge.
As the burdens and perverse incentives (see the paper) of the examiners are widely known, entities might craft patent applications whose background summary and prior art are not representative of the literature but tilted to their benefit. Again, it does not matter (to the applicant) whether the claimed innovation actually functions, but only that it appears plausible. The risk of discovering this (potential) impracticality is reduced by the patent thicket, where the number of granted patents is more important than their quality (courts are similarly burdened in testing the leveraged claims). Without reforming the incentives, rewards, and norms of the (US) patent system, I fear that patents will become an even larger vehicle for rent-seeking. Who's the master of these databases and, really, the knowledge they contain? For, as I mentioned, the semantic analysis of the literature may be put to other, more socially-useful purposes.

It is important to remember that the purpose of patents is "To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries;" the very advance of technology has rendered sub-optimal the current form of the patent system. I am not against patents, but I do ask that they be useful.

*an assertion I think true; whether this is possible today or tomorrow is debatable

## Getting started in Simbody

I'm testing Simbody for mechanical system modeling; here are some quick notes.

### Visual Studio 2016 setup

I followed the x64 Visual Studio instructions, placing the final installation in C:\Simbody.
After creating a new C/C++ Win32 Console Application and a blank source file (to access the C/C++ options), I entered these project properties:

• Configuration Properties | VC++ Directories | Executable Directories = C:\Simbody\bin
• Configuration Properties | C/C++ | All Options | Additional Include Directories = C:\Simbody\include
• Configuration Properties | Linker | All Options | Additional Library Directories = C:\Simbody\lib
• Configuration Properties | Linker | All Options | Additional Dependencies = libblas.lib;liblapack.lib;pthreadVC2_x64.lib;SimTKcommon.lib;SimTKmath.lib;SimTKsimbody.lib (all in the lib directory)

I'm no expert on VS; the above works for me.

### A basic check of the dynamics

Let's model the elementary mass/spring/damper system and calculate the natural frequency and damping from the response. See github and the many comments in verifyDynamics.cpp. Included there is a basic PositionReporter that writes positions into a given text file.

The values given in the figure allow calculation of the system's response, with some relevant parameters being:

$\omega_n = \sqrt{\frac{K_{spring}}{M}} = 2.2361$ [rad/s]

$\zeta = \frac{C_{damp}}{2 \sqrt{K_{spring}M}} = 0.0112$ (underdamped)

$\omega_d = \omega_n \sqrt{1-\zeta^2} = 2.2359$ [rad/s]

Now, let's measure these same parameters from the simulated response. Running verifyDynamics.cpp produces a csv file with the x position written out every 0.1 sec. Plotting this in Matlab gives:

The response is the black line, with blue x's indicating the identified peaks and valleys. The dashpot causes the initial 10 m/s velocity to decay exponentially, with the response envelope given by $A e^{-\zeta \omega_n t}$. If we measure the time between peaks or valleys, we find the damped period $t_d$, which is related to the damped natural frequency by $\omega_d = \frac{2\pi}{t_d} = 2.3639$, versus $2.2359$ calculated above.
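As a quick cross-check of the hand calculations, a short script; K = 5 N/m, M = 1 kg, and C = 0.05 N·s/m are my assumed readings of the figure's values, chosen because they reproduce the numbers above:

```python
# Cross-check of the mass/spring/damper parameters computed by hand.
# K, M, C are assumed figure values, not taken from verifyDynamics.cpp.
import math

K, M, C = 5.0, 1.0, 0.05   # spring [N/m], mass [kg], damper [N*s/m]

wn = math.sqrt(K / M)                  # natural frequency [rad/s]
zeta = C / (2.0 * math.sqrt(K * M))    # damping ratio (dimensionless)
wd = wn * math.sqrt(1.0 - zeta**2)     # damped natural frequency [rad/s]

print(round(wn, 4), round(zeta, 4), round(wd, 4))
```

With such light damping, $\omega_d$ differs from $\omega_n$ only in the fourth decimal, which is why the simulated 2.3639 rad/s stands out against both.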

The above response also draws the envelope function, with cyan giving the calculated envelope and red dashed giving Matlab's 1D exponential estimate.

So, this was a very simple check of Simbody's physics and a good excuse to figure out how to easily move data out of Simbody.

## PhD Preliminary Examination

I gave my prelim presentation this past January, recapping my work with Prof. Zinn on Interleaved Continuum-Rigid Manipulation. The presentation went well and I enjoyed the audience’s and committee’s questions, most of which centered on things that would be very fun to look at given infinite time. As I hope to graduate soon, I’ll only be able to look into a few of the most fundamental questions, saving others for later students. With that, here are links to my document and narrated slides, followed by the prelim’s abstract:

Preliminary Thesis [45MB PDF]

Slides [6.1MB PDF]