Spearfish: Real-Time Java-Based Underwater Tracking of Large Numbers of Targets

My name is Bob Cross, I'm from the Department of the Navy, and I'm going to talk to you today about not frameworks, not Groovy, not Scala. This is Java, pretty much pure and simple: a lot of Java on the desktop, solving problems that maybe you didn't know existed. We're talking about the Spearfish underwater tracking system that we use at a bunch of the U.S. Navy ranges; several of our allied ranges use it as well in one variation or another. Spearfish is 100% Java, entirely written in Newport. There are six or seven core people who've been working on it for somewhere between seven and fifteen years, possibly a little more in some cases. We're deployed on Windows and Linux, we run on laptops, we run on basically commodity hardware in general. People get a little put off when we deploy on laptops because it's not very impressive, but I can run most of the real-time operations of the AUTEC range from this vendor-supplied laptop right here. We can track to quite a deep depth; we can track as deep as our Navy assets can go, and that brings me to a very important point: everything I'm going to say today is unclassified. The things I'm talking about do have classified performance metrics. I will not be coy; if there's something I can't speak about, I will try to point you to a source, and I have the Public Affairs contact information at the end of the brief if there's something you'd like to ask more about. If I don't know, I'll tell you; we're not going to fool around here.

We can track a lot of targets and we can track them very fast, on commodity hardware like I said. Modern multi-core processors, with the basic concurrency that we get from Java, allow us to go at least a hundred times real time in most of the scenarios we're dealing with. And when I say we eat our own dog food, I'm not kidding: this is a system I've actually taken out to sea, in situations where I had the opposite problem from what you would normally think of; the water was too flat. The kinds of missions we're talking about, and I use the word "missions" very carefully, are mostly training exercises. We have officer candidates, for instance prospective commanding officers, who are learning how to drive a submarine, and they're chasing other submarines around, chasing targets, getting chased by helicopters that are looking for them. We can have surface ships doing the same sort of thing, and gunfire exercises. We also do test and evaluation of systems: that could be a new weapon type, for instance, or a weapon with new control software on it. And then we have new scenarios, which is the catch-all where we take some piece we have in inventory and try to get it to do new things, sometimes things you did not know were possible. That's always fun.

So this is the problem space in which I operate: the ocean hates all of us. It hates me in particular, but humans in general. There is nothing about the ocean that makes the tracking problem any easier. A lot of what I'm going to talk about today has parallels to GPS tracking, some of the same vocabulary and applications, but nothing about GPS works underwater. Nothing about lasers, nothing about radio waves; none of that works. The ocean is great at absorbing energy, except for acoustic energy. Certain frequencies carry for quite a long way, because ocean water is not really compressible; it will carry that energy as far as we want it to in certain frequency bands. Unfortunately, there's a bunch of stuff in the water that's already making noise squarely in some of the bands we care quite a lot about. And the final problem is that the Navy really doesn't care about my convenience; they're out there to do their job, and the requirement to produce a better track comes straight down from the Admiral.

Here are some of the major system components. The Java portion: if we follow this curve all the way down and back up to the left, everything from detection reports on up, those are the Java components I'm going to be speaking about. We've got a system in the water, the big red oval; that could be a submarine, a weapon, or a submarine simulator. That little green triangle is what we call a pinger; it's a box or a ring or something, basically a transducer, an emitter in the water that sends out signals. And that curved line is curved for a reason: none of our signals travel in a straight line. GPS deals with this a little bit; our curves are dramatic. Then we get down to our hydrophones, the microphones on the bottom.

The hydrophones are usually way deep down in the water, 2,000 to 4,000 meters down, and we have hard lines going back to shore, talking to the signal processor system. That's where the real-time time tagging happens. This is not Java; it's a digital signal processor box, also made in Newport, running a Linux kernel and dealing with signal processing cards. They take all that racket and turn it into detection reports that I can then turn into track and put up on the screen, and that's the Spearfish underwater tracking display system.

The ping that we send out is an encoded signal. In most of the cases I'm dealing with nowadays, it's essentially an identifier, 76 bits long, which is a Hamming code, essentially counting up from 1 to 12, so we have 12 different 76-bit codes. That pinger mounted to the box just transmits those pings, usually once a second. The sound goes through the water at roughly 1,500 meters per second, but it is never going exactly that speed; any time I think I know what the speed is, it's going to change because of depth, salinity, or temperature. There's a hydrophone down at the bottom, and the detection report is essentially taking the acoustics that came into the system and turning them into data that I can play with on the Java side.

So, the goals of the deployed system. Range safety is absolutely the most important one, and again, this is not a problem most people are going to be talking about today. We have, for instance, two submarines; they are actively hiding from each other while trying to find the other one. Because they're hiding, they don't know where the other one is. I don't know if you've seen Hunt for Red October, where the submarines are really close together; leaving aside the fact that it's really dark underwater and you wouldn't be able to see the submarines if they were that close, there would be a huge brouhaha as we sent messages up from our bi-directional phone saying, "no, you guys, you're way too close," because you cannot turn like they do in the movie. They're not fighter planes. So on our end, we have to detect and track range participants. That could be a submarine that has just clicked off its pinger, a surface ship, or an exercise torpedo dropped from a helicopter that's going to go down and then come back up to the surface and need to be recovered, so now we're tracking it on the surface so the recovery boat can drive out and pick it up. We have to scale and degrade based on how much data is coming in, but we can never drop below the data flow that's coming in; we can never run below real time, and when I say real time, I'll talk about exactly what that means. We need to let somebody know if there are problems, and we need to provide some accuracy. Accuracy is a term that changes based on context: there's absolute accuracy, which is essentially relative to GPS accuracy, and there's relative accuracy, where if I know these two submarines are skewed off by some number of meters to one side, as long as I know the separation between them, they're not going to hit and the Admiral won't get mad at me.

Real time: different system components have different real-time capabilities. When we're talking about the conversion of the acoustics into detection reports, we need sub-millisecond accuracy, hopefully down in the microsecond world. We're talking about 76-bit signals at 13 kilohertz-ish; that's a fairly long signal, and we'd like the time tag to be right on the leading edge as it arrives at the hydrophone. Sometimes that works, sometimes it doesn't; remember what I said about the ocean not caring. Depending on how that goes, it starts eating into my error budget: if the signal processor can't time-tag to some level of accuracy, there's nothing I can do past that point. It's not garbage in, garbage out, but it's less delicious going through the recipe. We also have some latency, and again there's nothing we can do about it: the speed of sound is roughly 1,500 meters per second, and we have hydrophones that are easily 3,000 meters deep. That's two seconds straight down, best case, so I'm already behind real time, and as I'll show you, I need to accumulate data from many of the hydrophones. There's nothing I can do about it; I have to wait for the signal to go out to those phones, collect it, and then turn it into a track as quickly as I can. That transit through the water is just a consequence of, again, the ocean not caring. We need to be multi-threaded in the sense that we can't hold up our processing: anything we put on the screen can't slow down the processing of the data.
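To make the data side of that concrete, here is a minimal sketch of what a single decoded ping might look like once it reaches the Java side. The class and field names are my own illustration, not the actual Spearfish types.

```java
import java.time.Instant;

/**
 * Illustrative detection report: one decoded ping as it arrives from the
 * signal processor. Names and layout are hypothetical, not the real API.
 */
public final class DetectionReport {
    private final int hydrophoneId;      // which phone heard it
    private final int pingCode;          // target identifier, e.g. 1..12 for sequence pings
    private final Instant timeOfArrival; // time-tagged at the leading edge, ideally sub-millisecond

    public DetectionReport(int hydrophoneId, int pingCode, Instant timeOfArrival) {
        this.hydrophoneId = hydrophoneId;
        this.pingCode = pingCode;
        this.timeOfArrival = timeOfArrival;
    }

    public int hydrophoneId()      { return hydrophoneId; }
    public int pingCode()          { return pingCode; }
    public Instant timeOfArrival() { return timeOfArrival; }
}
```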

Most importantly, we can never, ever lose any of the data that we receive. One hundred percent data retention is actually a requirement. There are software requirements on the display side, in the sense that people really, really want to see all the data on the screen, but if you have to sacrifice one, it's the data retention in the database that is the most important.

So this is essentially the funnel of data for the whole system. Like I said, we've got a lot of acoustics going on, ping-ping-ping; we've got snapping shrimp; we've got mammals, which are a phenomenal benefit to the ecology and kind of a hassle when it comes to tracking. It turns out that some of our tracking capabilities can produce quite a lot of interesting information about the mammals. We're funneling it all the way down, and as we neck it down further and further, we're trying to turn it into a location.

Like I said, a ping is an encoded acoustic signal. The payload, as far as I care, is a target ID: codes 1 through 12. That allows me either 12 items on range, which could be submarines, weapons, targets, or surface ships, or, if I'm using what are called framed pings, up to 63. Sixty-three targets would be a lot for the Navy to field at one time; it is not a lot for tracking. Tracking is fine with that many. Each one of these pingers has a repetition rate, just like in GPS. Quite often we're talking about a one-second pinger; in certain circumstances we use a different repetition rate, for instance on a vehicle that does not move very fast, because, like I said, we're pushing all these acoustics through the water, and if I can avoid a little noise pollution, I can potentially get a more accurate track on the things I care about a lot, like weapons. I care a lot about weapons. These pingers tend to point down, and that becomes a problem if, for instance, my weapon gets to end of run and comes toward the surface; now it's emitting energy away from the hydrophones I'm listening on, and that can be a problem. Again, the ocean does not care.

Then we have this concept called a splash. A splash is anything else: tracking mammals, gunfire scoring, simulated weapons that we push out of a helicopter. If you look super closely, you can see me sitting in the port-side gunner seat of that UH-60, and this is one of those situations where I bring the pictures home and show them to my kids, who I had up on the screen earlier, and they say dad has the coolest job ever. This is the low flight, where we're down at about a hundred feet and we push the big marker buoy out of the helicopter, which makes basically a kabloosh. The bigger one on the bottom is from 1,500 feet, and that made quite a kaboom. It was really neat; it took a long time to fall as I was watching it.

Backing up again, and I know I'm going through this several times, but nobody else at JavaOne is talking about anything like acoustics, so I'm doing this quick overview of the system components so that I can get to the Java part that's coming. We're almost there. Hydrophones detect sound; the sound is converted to a voltage, which is not useful to me at that point. The signal processor takes that voltage and turns it into a ping or splash detection report, so it starts turning it into data that I can process.

Now, limitations on the signal processing side: like I said, the water is noisy. Not only can that mean we potentially lose data, we can also get corrupted data coming in on the detection report. This is a real problem; no one cares, I still have the requirement to track. We also have bad angles, which could mean a target is getting further and further off range. I'll show you the actual range coverage area during the demo. Again, the guys operating the range want to be able to say, "this particular submarine is coming on range; I expect it on site at such-and-such a time." They want a large footprint: range safety, and also range scheduling.

All right, now we're in the Java world. This is everything I deal with all the time: the tracking part of the problem. I've said "detection reports" a lot of times. Like I said, detection reports can get corrupted, so we have potentially bad data, so we have to run a validation process. That is essentially a set of rules that says: here's my raw stream coming in from this particular hydrophone; based on the logical rules I can apply, what subset of this data is probably valid for the target I'm looking for at this time?
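As a rough illustration of the kind of rule involved (this is not the real Spearfish validation logic), a gate built around the expected one-ping-per-second repetition rate might look something like this:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Toy validation rule: given the raw stream of arrival times from one
 * hydrophone for one target code, keep only reports whose spacing is
 * consistent with the expected repetition rate. Illustrative only.
 */
public final class RepetitionRateValidator {
    private final double expectedIntervalSec; // e.g. 1.0 for a one-second pinger
    private final double toleranceSec;        // how sloppy we allow the ocean to be

    public RepetitionRateValidator(double expectedIntervalSec, double toleranceSec) {
        this.expectedIntervalSec = expectedIntervalSec;
        this.toleranceSec = toleranceSec;
    }

    /** @param arrivalTimesSec raw times of arrival, in seconds, in receive order */
    public List<Double> validate(List<Double> arrivalTimesSec) {
        List<Double> valid = new ArrayList<>();
        Double lastAccepted = null;
        for (double t : arrivalTimesSec) {
            if (lastAccepted == null) {
                lastAccepted = t;      // tentatively anchor the sequence on the first report
                valid.add(t);
                continue;
            }
            double delta = t - lastAccepted;
            // Accept if the gap is close to a whole number of repetition intervals:
            // missed pings are fine, extra multipath arrivals in between are not.
            double cycles = Math.round(delta / expectedIntervalSec);
            if (cycles >= 1 && Math.abs(delta - cycles * expectedIntervalSec) <= toleranceSec) {
                valid.add(t);
                lastAccepted = t;
            }
        }
        return valid;
    }
}
```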

Then I have to turn it into a localization. That means: take the data from multiple hydrophones, the valid data that came out of the validation process as it comes down through the funnel, take a bunch of those validated data streams, and start turning them into positions. Then I have to couple that with the sound velocity profile; like I said at the beginning, the sound never goes in a straight line, so all the geometry we'd like to solve is extra complicated and squishy and requires a lot of approximation. All of this work gets one point on the screen at a time, usually one point per second.

Some of the limitations: 1,500 meters per second, and plenty of the assets we're talking about can travel in excess of at least 1% of that speed. This is not a problem you deal with in GPS at light speed; there is no time when anything you can track with the GPS system goes 0.1 c. It doesn't happen. I bring this up all the time; no one cares. A quick note on units, because I'm going to flip between units every now and then: I can do math better in meters, and as a rule of thumb a meter per second is about two knots, so it's okay for doubling. The ranges like to use whatever units they use, so they're using feet and yards and all sorts of numbers I don't use anymore. And like I said, we have plenty of latency in the whole equation.

Now, requirements other than the tracking. We also have this derived requirement that everything has to be deterministic all the time. We've got a data flow architecture where all of this data comes in through the hydrophones and goes through validation and so on, and if I rerun that exact same data, no matter what speed I push it into the system, the output has to be identical. This is a requirement, and it's kind of frustrating, because a lot of your parallelization options don't apply in that situation, or the architecture gets a little more complicated. We have to run essentially with a flexible buffer to deal with this latency issue, as we try to capture all the data required to track a particular system at a particular point on the range. The depth varies on the range: we could be looking at hydrophones that are 1,500 meters deep, some that are 900 meters deep, and some that are 4,000 meters deep, all on the same range. So we're constantly varying this buffer to answer the question: how long do I have to wait before I have all of the useful data? And a lot of what I'd like to parallelize, too bad, has to be single-threaded because of this determinism requirement.
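As a back-of-the-envelope sketch of that flexible buffer sizing, using the nominal 1,500 m/s and the hydrophone depths mentioned above: wait at least as long as the worst-case acoustic transit time to the hydrophones you intend to use before trying to localize. The names, the margin, and the specific ranges are made up for illustration.

```java
/**
 * Rough illustration of sizing the pre-localization wait from worst-case
 * transit time. The ranges here would come from the coverage geometry of the
 * hydrophones in use, not from the (unknown) target position.
 */
public final class TransitBuffer {
    private static final double NOMINAL_SOUND_SPEED_MPS = 1500.0;

    /** @param worstCaseRangesMeters worst-case slant ranges to each hydrophone in use */
    public static double requiredWaitSeconds(double[] worstCaseRangesMeters, double marginSeconds) {
        double worst = 0.0;
        for (double r : worstCaseRangesMeters) {
            worst = Math.max(worst, r / NOMINAL_SOUND_SPEED_MPS);
        }
        return worst + marginSeconds;
    }

    public static void main(String[] args) {
        // Phones at roughly 900 m, 1,500 m, and 4,000 m: the deep one alone costs
        // about 2.7 seconds of transit time before any track is even possible.
        double[] ranges = {900.0, 1500.0, 4000.0};
        System.out.printf("wait at least %.2f s%n", requiredWaitSeconds(ranges, 0.5));
    }
}
```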
Backing up a little, when I was talking about the validation process, this is roughly what it looks like. On that bottom row there's a time axis going by; I'm just going to describe it rather than make you read the numbers. Big bars are good, green and yellow bars are valid, everything else is less good. On the bottom, hydrophone 7 is receiving what looks like about one ping per second, and it looks like we've got mostly yellow and green; that's all looking great, with a little bit of blue way off to the left. Up there on hydrophone 1 we've got a little bit of validated data that seems to have started; everything else looks like a snowstorm, nothing really contributing to the track at this point. I need more data than this before I can start tracking. Again, this is one of the layers of frustration we deal with: it looks like we're receiving data, but we won't have track at this point, so your range customer is yelling at you saying, "I see numbers," and yeah, but you're not going to have track yet.

So this is what some of the data input looks like. On this top line, again, I'm not going to make you read it, you can see there are 12 codes. On the sequence pings they're running essentially from 0 to 15 and then they roll over, a basic counter going over and over. There's also this M and L; that's an encoded depth. There's a depth sensor on the pinger: for once somebody is trying to make my life a little easier and tell me "I'm roughly at this depth," except they're not going to tell me the real number, they're going to tell me four bits at a time. They send me a high nibble or a low nibble, the thinking being that the low nibble will probably change more often, so once a second I get part of the number. Unfortunately, this is the most easily corrupted part of the ping, so as I'm doing data reconstruction I'm sitting there doing bit matching between all of these different hydrophones, trying to figure out how many of these ones and zeros should actually be on.
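Here is a minimal sketch of reassembling that nibble-at-a-time depth word. The scaling from the raw word to an actual depth isn't covered in the talk, so this stops at the raw 8-bit value; the class and method names are illustrative.

```java
/**
 * Illustration of the encoded-depth reassembly described above: the pinger
 * alternates between the high nibble and the low nibble of a depth value,
 * four bits per ping, so the decoder stitches the word back together across
 * successive (and easily corrupted) pings.
 */
public final class DepthNibbleDecoder {
    private Integer highNibble; // most recent high 4 bits, or null if not yet seen
    private Integer lowNibble;  // most recent low 4 bits, or null if not yet seen

    /**
     * @param nibble the 4-bit value (0..15) carried by this ping
     * @param isHigh true if this ping carried the high nibble
     */
    public void accept(int nibble, boolean isHigh) {
        if (isHigh) {
            highNibble = nibble & 0xF;
        } else {
            lowNibble = nibble & 0xF;
        }
    }

    /** @return the reassembled 8-bit depth word, or null until both halves have arrived */
    public Integer currentDepthWord() {
        if (highNibble == null || lowNibble == null) {
            return null;
        }
        return (highNibble << 4) | lowNibble;
    }
}
```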

Down at the bottom we have a different pattern. Instead of counting all the way up to 15, I have a sort of syncopated rhythm between two different codes; you can see this 1 1 1 1 2 2 1 2 1 2. That's what I would call a framed ping: instead of telling me "I am ping 0, I am ping 1, I am ping 2," it's saying that if you keep track of where you are in the pattern, you can deduce what index this is. This actually makes things fairly complicated, and this is a little bit down in the weeds, but again it goes with the theme that no one makes anything easy for me. This allows the Navy to have more targets on range at the same time, actually more targets than they currently deploy in most of the exercises we deal with. And you'll see that if I'm using a sequence ping, which I'm not going to use in the example data today, I have mostly green and a little bit of yellow; the framed pings are mixes of yellows and greens when they're validated.

All of that was validation. Now we're talking about localization, and this is where, for anybody who's worked on a GPS-type system, some of these words will sound familiar. I have validated data that came out of my validation process. I have known hydrophone locations on the bottom, and finding all those hydrophones once they've been deployed, down to sub-meter accuracy, is a fairly heavy-duty evolution: they do an extensive hydrophone survey and mathematically compute, using multiple GPS antennas, that the hydrophone must be here, 4,000 meters down. That's very important, and then they use those positions for upwards of 20 years, because funnily enough, things don't really change down there very often. So I've got my hydrophone locations, I have the ping ordering, the sequences or the frames, and I have my sound velocity profile. Using all of that, I need to come up with a position, and I'll show you what that looks like. This is the spherical tracking problem sort of in action, and as you can see, it doesn't seem to be lining up quite right.

The tracking you would normally use, what is often used in GPS tracking algorithms, is hyperbolic tracking. What it says is: I know that I received a signal from somewhere on this curve. I have a time of arrival at hydrophone A and a time of arrival at hydrophone B, and as I look at those two times, the ping arrived at hydrophone 1 one second earlier than it arrived at hydrophone 2, so I'm looking at a curve where the time difference of arrival is identical, and that sweeps a curve through the water. That's the harder one. The easy one: once I know the actual time of emission, which in the GPS situation you usually do, because those clocks go out exactly on the second, then if I have what's called a synchronous pinger in the water, I can do spherical tracking. That's just saying: I know the time of arrival, I know my time of emission, and the delta between them, divided by the speed of sound through the water, defines a radius. So I have a radius of possible positions around hydrophone A and a radius of possible positions around hydrophone B, and if I look at where those two things intersect, there are two possible positions. If these hydrophones are sitting on the bottom and I'm looking at it essentially top-down, I am fairly certain that my submarine is not beneath those two hydrophones, because there's dirt there. So with those two phones I can make a reasonable guess.

That's really not quite enough, because, like I keep saying, sound doesn't travel in straight lines in the water. So what we do is use a ray tracing approximation to pretend that it does. We cast a whole bunch of rays through these sound velocity profiles and say: suppose the pinger was here, talking to this hydrophone; that would be this particular ray path. Now imagine there was just a straight line, take that ray-traced transit time over that particular straight-line path, and call the result an effective sound velocity. Essentially what we do is build a great big lookup table. It doesn't take too terribly long, but it is currently infeasible for us to do the full ray trace every ping, which is once per second for every submarine and weapon currently in the water. By doing the lookup we are able to meet our engineering requirements: the accuracy we lose by not doing the full ray trace isn't any worse than what we're already dealing with from the ocean and the snapping shrimp that are messing with our pings.
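Here is a sketch of what a precomputed effective-sound-velocity lookup could look like, simplified down to a single depth dimension (the real dependence would also involve the hydrophone geometry). The bin sizes and table contents are invented; the fallback is the nominal 1,500 m/s figure from the talk.

```java
import java.util.Arrays;

/**
 * Sketch of the effective-sound-velocity idea: ray traces are run offline for
 * a grid of source depths, and at run time the tracker just looks up the
 * straight-line speed that reproduces the ray-traced transit time.
 */
public final class EffectiveSoundSpeedTable {
    private static final double FALLBACK_MPS = 1500.0;

    private final double[] depthBinEdgesMeters; // ascending bin edges, e.g. every 100 m
    private final double[] effectiveSpeedMps;   // one precomputed value per bin edge

    public EffectiveSoundSpeedTable(double[] depthBinEdgesMeters, double[] effectiveSpeedMps) {
        this.depthBinEdgesMeters = depthBinEdgesMeters.clone();
        this.effectiveSpeedMps = effectiveSpeedMps.clone();
    }

    /** @return the precomputed effective speed for this source depth, or 1,500 m/s if off-table */
    public double lookup(double sourceDepthMeters) {
        int i = Arrays.binarySearch(depthBinEdgesMeters, sourceDepthMeters);
        if (i < 0) {
            i = -i - 2; // index of the bin edge at or below the requested depth
        }
        if (i < 0 || i >= effectiveSpeedMps.length) {
            return FALLBACK_MPS;
        }
        return effectiveSpeedMps[i];
    }
}
```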

So like I said, we precompute all the data and then we store a whole bunch of tables. We can store per month, we can store per day, we can store per exercise, or we can just say, "look, 1,500 meters per second, we don't have time to measure," and like I said, we'll get relative accuracy at that point; everything will be skewed in roughly the same direction.

All right, spherical tracking: here's essentially how the mathematics work. Like I said, if I have two phones, I have some left/right ambiguity; I can't pick between those two possible solutions. If I add another phone, so now I'm listening on three phones, and I've listened long enough to know I'm getting data from hydrophone C, then in the 2-D case those three phones together give me one possible location; they only intersect in one place. And by the way, we never assume the water is flat. We understand the Earth is curved. We do deal in Cartesian coordinates just because it's easier to do the math, but in the Z plane there's actually a curve: latitude, longitude, depth. Let's be clear, we're not dumb. Now, people who have done this sort of thing, solving systems of equations, actually look at that a little bit sideways, because you think, whew, I only just barely have enough data to pick that position. What if I add another phone that has potentially disagreeing information? In this case hydrophone D: maybe it's got some bad data, maybe it picked up a multipath, so instead of getting a straight-line path to the bottom, maybe it picked up a ping that went up and then came down, or maybe it's just got garbage for some reason, a false detection or whatever. Now I've got more data; maybe I need to bias my solution, so I can potentially skew up there to the upper right, or maybe I can throw out that phone and be a little more error-tolerant. Like I said, as I add more data, my tolerance for error only goes up in the two-dimensional case; in the three-dimensional case I can actually drop a hydrophone again. I'll leave the hyperbolic version to your imagination; I tried drawing a whole bunch of intersection diagrams using hyperbolas, and I can't do that in PowerPoint, it's awful. This is the three-hydrophone case; essentially you need one more phone for hyperbolic tracking than you do for spherical tracking.
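The geometry just described can be sketched in a few lines of Java. This is a 2-D illustration only: turn time of flight into a range ring around each of two hydrophones and intersect the rings, which leaves exactly the left/right ambiguity mentioned above; a third phone, or the fact that "there's dirt there," resolves it. The effective sound speed is assumed to be supplied, for instance from a lookup like the one shown earlier.

```java
/**
 * Minimal 2-D sketch of the spherical-tracking intersection. Illustrative
 * code, not the Spearfish localization algorithm.
 */
public final class SphericalFix {

    /** Range ring radius: (time of arrival minus time of emission) times effective sound speed. */
    public static double range(double arrivalSec, double emissionSec, double effectiveSpeedMps) {
        return (arrivalSec - emissionSec) * effectiveSpeedMps;
    }

    /** @return 0, 1, or 2 candidate positions as {x, y} pairs. */
    public static double[][] intersect(double ax, double ay, double ra,
                                       double bx, double by, double rb) {
        double dx = bx - ax, dy = by - ay;
        double d = Math.hypot(dx, dy);
        if (d == 0 || d > ra + rb || d < Math.abs(ra - rb)) {
            return new double[0][]; // rings don't intersect: bad data or bad geometry
        }
        double a = (ra * ra - rb * rb + d * d) / (2 * d);
        double h = Math.sqrt(Math.max(0, ra * ra - a * a));
        double mx = ax + a * dx / d, my = ay + a * dy / d; // foot point on the A-B baseline
        if (h == 0) {
            return new double[][] {{mx, my}};              // rings are tangent: one solution
        }
        // Two mirror-image candidates, one on each side of the baseline.
        return new double[][] {
            {mx + h * dy / d, my - h * dx / d},
            {mx - h * dy / d, my + h * dx / d}
        };
    }
}
```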
In the tracking scenarios on the ranges we deal with, we have standard conditions: the submarine is largely driving in straight lines for relatively long periods of time, we have targets that are more or less doing the same thing, and surface ships looking for all of these things, again largely on straight-line paths. Life is pretty good in most scenarios. Now the submarine starts launching weapons. The weapon has a pinger, the submarine has a pinger; which one wins at the signal processor? Because they're in the same place and it's the same transit time, the sounds arrive at the phones at the same time, so now there's contention in the water, and it's going to take some time before the weapon separates from the submarine and we can begin tracking each one of them. And then there's what we call end of run. By the way, these are exercise torpedoes; they're firing at each other, but they don't make holes in the other submarine. They turn away, and then they come up and have to be recovered. They have different heads on them than the war shots, which are generally a different color and everything. At end of run they're out of fuel and need to be recovered, so they go vertical, and if they're coming up from relatively deep water, it can take them quite a while to get back to the surface. Like I said, the pinger emits down, except down is now off to the side; the hydrophones are down here, all that energy is going the wrong way, and there's an excellent chance that I'm not tracking anymore. That's all kinds of scary, because now there's a weapon in the water doing something. Then it shows up on the surface and it's rolling around in the waves, and the waves in Hawaii are really quite large, so it gets hard to track again, because the pinger is supposed to be pointing down but it's rolling back and forth, sometimes pointing up, and the signal doesn't go well through the air.

All right, now all the way back out of the mathematics and into the user interface requirements, still in Java. What we're trying to do in this world is take this big firehose of data, really multiple simultaneous firehoses of data, and provide the expert operator, of which I am one, sometimes to my own consternation when they send me out on the water to do stuff, many slices of, and views on, the same data. So we say: all right, operator, you need to be able to look at this data stream, so we have that time series graph; maybe I need to look at the numbers, maybe I need to look at the bars.

Maybe I need to turn that whole graph on its side; maybe I need to look at the positional data and all the time series of course, speed, and depth. Don't try to read this: this is the numeric chart an operator would normally deal with, and the numbers are too small, but essentially we're looking at a standard table view with course, speed, depth, and all the other tracking parameters I care about. We also have the emission time, which is this column right over here. The emission time is the last time I had a good track on this target, and that's a very interesting number, because I want it to be very close to that other number over there. If there's a large delta between those two times, I haven't had track on, say, this weapon for seconds or minutes, and that becomes a range safety problem rather rapidly. So again, emission time here; this MT number down here is what I call my master time, master time for the whole system. These numbers over here are calculated in terms of master time: the track delay (that column has scrolled off to the right) and the contact time, which is when I first detected this thing. That's data the range cares about a lot: when did this submarine get detected and tracked, when was this weapon actually launched, was it launched on schedule. People get scored on this sort of thing. And track acquisition time: quite often, track acquisition time precedes contact time, because contact time is when validation kicked in and said, "yep, I've got validated data on this thing." It turns out validation can then backtrack and say, "well, perhaps I had valid data all this time, but I had to buffer up enough to be sure I've got this guy," and then back up, say, about four seconds, and that sometimes confuses people. And we're looking at seven phones in solution here; this is hyperbolic track, and you can see we've got quite a lot of hydrophones validating.
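As a tiny illustration of that emission-time-versus-master-time check (the threshold, the class, and the method names are mine, not an actual Spearfish setting):

```java
import java.time.Duration;
import java.time.Instant;

/**
 * Illustration of the "track delay" idea: compare the last good emission time
 * for a target against master time and flag the target when the gap gets
 * large enough to become a range-safety question.
 */
public final class TrackDelayMonitor {
    private final Duration alertThreshold;

    public TrackDelayMonitor(Duration alertThreshold) {
        this.alertThreshold = alertThreshold;
    }

    /** Elapsed time since the last good track on this target. */
    public Duration trackDelay(Instant masterTime, Instant lastGoodEmissionTime) {
        return Duration.between(lastGoodEmissionTime, masterTime);
    }

    public boolean needsOperatorAttention(Instant masterTime, Instant lastGoodEmissionTime) {
        return trackDelay(masterTime, lastGoodEmissionTime).compareTo(alertThreshold) > 0;
    }
}
```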
This is another way of looking at the data coming in; again, don't try to read it. What I'm looking at here are the AUTEC hydrophones, and this is the sample data I ginned up in my basement last weekend. We've got multiple targets in the water. The bottom here is all the raw data, and the green is showing me validated data, so we've got a whole bunch of phones validating data coming in on these tracks. If you're an expert operator, you're counting the data per second that's arriving: you expect to see about one ping per second, and if you see multiple arrivals per second, you've potentially got bounce paths or some kind of reverberation going on. This is all that data, but now I'm looking at it in terms of numbers. Again, no reading, the numbers are too small; anybody who's super curious, I can bring it up after the show and we can look at the real numbers arriving. Green is good, yellow is good, white and blue are bad. What this is showing is that I've not only got more data than I expect, I'm probably listening to multiple targets on these phones. I may have ID contention, and I know I do, because I made the data stream. This is what it looks like if you don't use the numbers: this is that time series data I was talking about before, new data on the right, cascading over toward the left. These are the hydrophones I'm listening on, and it looks kind of like a firehose; this is definite ID contention. Tracking is doing fine with this scenario; I have greens and yellows carrying on just fine, but the operator is looking at this and saying, "I don't understand this at all." So if I turn the data sideways yet again: I'm listening on hydrophone 45 for target 1-2, so I have these two IDs, in a framed pinger like I was talking about before, 1 1 1 1 2 2 and so on. This is the actual submarine I care about; over here is something else, and there is something else yet again, some other target that's using a 1, maybe a 1-3, maybe a 1-4. This scenario happens all the time, and it's actually very convenient when you're watching a weapon behave close to another target, because you start getting Hunt for Red October scenarios where you can watch the behavior. For instance, if you see a sudden change, say a slope upward and then downward, that's potentially a target or a weapon locking on, or a change of speed. I've done that during some of the exercises, and it's actually quite fun; you really can be, you know, Dimitri the sonar operator.

All right, a little bit of history. I promised in the talk summary that I would talk about the history of concurrency on this problem we're trying to solve, and it's this: the system began before Java was really an operational language or platform. It was back in the days when C++ was just becoming viable; I was in graduate school then, and I was using C++, but I wouldn't have recommended it to other people at the time. Some of the code in the platform right now dates back to those days. This is something I've heard in multiple talks this week: the burden you put into the code right now, the framework or the implementation or anything like that, has a long lifespan. There's presumably some average number for it; I don't know what that number is, because it seems to keep getting bigger as time goes on. I would like to say that we've dealt with most of our horrible threading problems. I cannot say we've dealt with a hundred percent of them. Horrible? Yes, all of them. Our requirements haven't changed during that time: you must not have data loss. Back in the day, poor implementations of concurrency led to deadlock and data loss, and that was the motivation for solving some of these things. So our goal is no data loss and no interference between the interface and the actual processing.

Here's an example of one of the problems that some of the early implementation code inflicted on ourselves. This is very pseudocode-ized to illustrate what the problem was. Back in the day, someone decided that we needed a client-server implementation rather than a single process, and RMI looked like a really good idea. Bare RMI is potentially a good idea: "here's my user interface event, I want the system to do something." Unfortunately, the implementation that this person, who is no longer with the government, chose was to then make another RMI call from the server back to the user interface, which is an immediate implementation of deadlock. It's just deadlock you haven't detected yet, because who knows what's going to happen: you click the button too fast, and now the entire system is locked up, because it's trying to process one blocking call while the other blocking call is reaching back. When I arrived at the government, one of the first things I did was click the button a whole bunch of times. Yep, locked the system again. No, no, we can't have this.
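Here is a compact sketch of the shape of that call-back deadlock, not the actual code: the RMI plumbing is stood in for by a worker thread, and running it hangs by design, which is the point of the illustration.

```java
import javax.swing.SwingUtilities;

/**
 * The user-interface thread makes a blocking "server" call; the server makes
 * a synchronous call back to the client, and that call-back needs the very
 * thread that is still waiting on the server. Deadlock.
 */
public final class CallbackDeadlockDemo {

    interface Client { void showResult(String result); }

    /** Stand-in for the RMI server: it calls back synchronously before returning. */
    static final class Server {
        String handleButtonPress(Client caller) throws InterruptedException {
            // Real RMI delivers the call-back on a separate worker thread and
            // blocks the original call until that call-back has returned.
            Thread callbackWorker = new Thread(() -> caller.showResult("partial result"));
            callbackWorker.start();
            callbackWorker.join(); // never returns: the call-back is stuck below
            return "done";
        }
    }

    public static void main(String[] args) {
        Server server = new Server();
        Client client = result -> {
            try {
                // The call-back wants the event thread to update the display,
                // but the event thread is still blocked inside handleButtonPress.
                SwingUtilities.invokeAndWait(() -> System.out.println(result));
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        };
        // The user clicks the button: the blocking server call runs on the event thread.
        SwingUtilities.invokeLater(() -> {
            try {
                System.out.println(server.handleButtonPress(client));
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
    }
}
```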
So in later days what we said was: hey, EventBus isn't bad. It's not the perfect message-passing architecture, but it isn't bad, and it wasn't hard to put in place, because you could see a lot of the parallels. Most of what we're trying to do in the user interface world is not time-critical; the submarine is six to eight seconds behind where we think it is, we're already late, so there's no reason to insist that it's super critical to click this button and see an immediate change on the server side. I would like it to proceed apace, but more importantly I don't want to lock my user interface or my processing; I don't want to inflict any damage on either side. So what we did was say: all right, my detection reports are coming in, I'll pop them into a list, and I'll make that a synchronized list so I'm not going to have contention or an actual data loss problem, but now I've potentially got blocking between these two processes. Still, it was better than it was before: I can pour that data through my event bus and at least allow my data acquisition to proceed apace. Maybe my user interface has a deadlock, but I would rather kill the client side and let the server side keep going. That was the compromise at the time.

Well, I wasn't happy with that compromise for very long, so we very quickly started moving to the concurrency utilities from Java 6 and Java 7. We're not talking about really sophisticated implementations; it's more or less the CopyOnWriteArrayList style of structure, where we take these detection reports and just push them into a data structure that later on we can send off to JFreeChart without locking in between. We want our data collection to proceed apace, and we want our displays to show all the data that's coming in. Frankly, if I'm running at a hundred times real time, all of that data is cascading off the screen so fast that I'm really just doing qualitative analysis rather than quantitative; I don't have a real requirement for real-time speed or super-high accuracy at that point, I just want it to go. This allowed us to push the update onto the Swing event thread with invokeLater and say: all right, user interface, do your thing, do it as best you can.
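As a sketch of that pattern, plus the "last one wins" display update described just below: the acquisition side appends to a CopyOnWriteArrayList without blocking on the display, and repaints are coalesced onto the Swing thread. Class and method names are illustrative, not the real Spearfish code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.atomic.AtomicBoolean;
import javax.swing.SwingUtilities;

/** Detection reports flow in on acquisition threads; the display catches up when it can. */
public final class DetectionFeed<T> {

    private final List<T> reports = new CopyOnWriteArrayList<>();
    private final AtomicBoolean repaintQueued = new AtomicBoolean(false);
    private final Runnable repaint; // e.g. push the latest snapshot into a JFreeChart dataset

    public DetectionFeed(Runnable repaint) {
        this.repaint = repaint;
    }

    /** Called from the data-acquisition side; never blocks on the display, never drops data. */
    public void onDetectionReport(T report) {
        reports.add(report); // writers copy the array; readers never block or see partial updates
        // Coalesce display updates: if a repaint is already queued, this report just rides
        // along with it, which is the "last one wins" behaviour on the screen.
        if (repaintQueued.compareAndSet(false, true)) {
            SwingUtilities.invokeLater(() -> {
                repaintQueued.set(false);
                repaint.run(); // reads whatever is newest at the moment it runs
            });
        }
    }

    /** Consistent snapshot for the display side. */
    public List<T> snapshot() {
        return new ArrayList<>(reports);
    }
}
```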

And my favorite implementation of user interface speed-up is "last one wins": the last data that came in, put that on the screen, carry on, no locking, just go, go, go. So why am I talking about basics? What we've established, sort of by proof by existence, is that old systems get more and more thread-unsafe as time goes by. I don't know that I could produce enough data to prove that conclusively, but I'd say, certainly emotionally, the older it is, the less thread-safe it is, and what I'm looking for as the team leader are ways to minimize that problem. Again, I'm not trying for a hundred percent correctness all the time; I'm trying for mostly correct most of the time, as long as I don't lose my data, and essentially I'm looking for easy-ish solutions. The best value in Java 7 is most of the basic data structures and components I'm dealing with: no complex frameworks, just pour it into a copy-on-write structure and let the display put it up on the screen as fast as it can. Good, I'm on schedule.

All right, I'm going to show you some data; let me take a drink of water. It is super important that we all recognize that what I'm about to show you is unclassified. I need to see nods. Okay, thank you; you laugh. I have made up all of this data. Nothing I'm going to put on the screen is derived from any Navy system, allied or otherwise. I have put together what looks like a surface-forces-versus-submarine exercise with multiple weapon shots. The weapons, after they reach the target or miss, will rise to the surface and then sit in a two-knot surface current, which is running north-south, and drift while waiting for recovery. I'm not going to show you the recovery process, and I'm not going to show you the whole thing at one-times speed, because frankly the abbreviation for anti-submarine warfare, ASW, also stands for awfully slow warfare. I work for the Naval Undersea Warfare Center; we're rooting for the submarine in this exercise.

All right, these are the Bahamas; that's my pointer. This is Nassau, and this is Andros Island. If you go to Google Earth, or actually the NASA World Wind system we're working on right now: if anybody works in a continuous deployment environment, you know you get those good releases going out on a regular basis. Our quarterly release was a few days ago, and as the team leader I said, no, I'm not bringing this to San Francisco, it's a little bit too fragile to demo right now. So sadly I'm going to show you something that isn't quite as pretty as Google Earth; maybe I'll get another chance another time. If you Google for AUTEC, you'll see that right about there is Site 1, and I don't know if anybody saw the show on the History Channel about AUTEC, where they call it Area 52; Google that, it is a super funny show. We have hydrophone cables running out here into the water, covering roughly this area in here. I'm going to put the hydrophone locations on the screen in a minute, out of the context of the picture; the locations are actually potentially a little bit sensitive, or For Official Use Only, so I can't show you both things at the same time, but I will show you the grid of hydrophones we're dealing with, just out of the context of the shoreline. When I switch out of that, by the way, those are my kids; they think my job is awesome. You're a little bit bigger than that now. All right, as we can see, things are moving very slowly: our friend the submarine is down here, and our friend the surface ship is coming down to the south, and we've got near-perfect data coming in. I'll increase speed in a moment, but I just wanted to show some of the major user interface components. Let's bring up a speed chart; I would like a strip chart of speed, and let's do an auto-scale. I'm using JFreeChart here, because I work for the government and the key word there is "free," but also because I really, really like it: very powerful, with very low maintenance and development cost.
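For reference, a speed strip chart along these lines can be put together with JFreeChart roughly like this. The data and labels here are invented, and the default auto-ranging axes give the "auto scale" behavior mentioned above.

```java
import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import org.jfree.chart.ChartFactory;
import org.jfree.chart.ChartPanel;
import org.jfree.chart.JFreeChart;
import org.jfree.chart.plot.PlotOrientation;
import org.jfree.data.xy.XYSeries;
import org.jfree.data.xy.XYSeriesCollection;

/** Minimal speed-versus-time strip chart; a sketch, not the Spearfish display code. */
public final class SpeedStripChart {
    public static void main(String[] args) {
        XYSeries speed = new XYSeries("Target 1 speed");
        for (int second = 0; second < 60; second++) {
            // Fake data: cruise at 10 m/s, then slow down at t = 30 s.
            speed.add(second, second < 30 ? 10.0 : 4.0);
        }
        JFreeChart chart = ChartFactory.createXYLineChart(
                "Speed vs. time", "Master time (s)", "Speed (m/s)",
                new XYSeriesCollection(speed),
                PlotOrientation.VERTICAL, true, true, false);

        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Strip chart sketch");
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setContentPane(new ChartPanel(chart));
            frame.pack();
            frame.setVisible(true);
        });
    }
}
```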

So nothing in particular is happening yet. Actually, my colors have switched here, so pretend the red is blue; no, I've got it right. As you can see, our submarine friend here was cruising along very fast and has made some sort of speed change. He was coming on range and has acquired the target; not really, I'm not modeling any of that, but it's something like that, and he's decided, okay, I need to slow way down, because speed is noise. He's turning, and that turn is something they don't talk about in Hunt for Red October: the submarine has what's called a towed sonar array. Has anybody heard of this before? It's towing that out behind it (not really, in my data) and waiting for it to straighten out; it came this direction and is going that way to resolve an ambiguity. The towed sonar array provides a line of hydrophones, which gives you that same left/right ambiguity problem from the spherical tracking case; he's trying to resolve it, because as he came in he couldn't really tell whether his target was on the right or on the left. So again, let's look at what this data looks like. These are times of arrival, and you might almost be able to read them back there; you can see that none of these times are quite the same, and there's potentially a fairly large disconnect between them. If you look for ping F here, ping F, ping F, ping F, those arrive at different times at the various hydrophones because of the separation.

I promised I'd show you what the hydrophones look like. We've got quite a few hydrophones; Site 1 is right about there-ish. Our scale here is yards, because people like making my life difficult and they like yards; you can set yards at start-up. As we're talking about the footprint of the range, we're talking about zero to over thirty thousand, so yards or meters, if you're a reader of xkcd, and a good forty by forty kilometers, rough order of magnitude. It's a very large area, and in the model I'm using here you can see the whole range lighting up: I'm receiving these pings on a whole bunch of hydrophones. I'm using a very straightforward acoustic model here, so the real data would look slightly different, but it's representative. Let's zoom back in a little, and I'm going to speed up just so we can have a little more excitement. We're still going slow; awfully slow warfare, we're not kidding.

All right, right about here I'm expecting a weapon launch, not a real weapon, and I'll see a new stream of data: a new tracker will appear up here, and if I look at what I call my ID strip view, yep, there's a weapon launch. Let's look at this guy right here. Like I said, this is the data from that strip view in the background, the time series view; tall bars are better, that's good. This is that same data, only looking at hydrophone 46, which is this one right here, plotted as time versus fractional time: if you imagine going left to right here on the bottom, these are integer seconds, and the vertical part is the point-whatever remainder. So our weapon here had a run-out, then the course change it was directed to make, and the weapon has acquired. Now our surface ship counter-fires down the same bearing, thinking, "aha, I have detected the submarine," and has looked in exactly the wrong place. To make matters worse, our friend the submarine fired three more weapons, because why not; this is Hunt for Red October world, you might as well. So we've got a near miss here that's going to be scored as a hit, these are two misses, and that's a probable hit as well. Look at all this data: you can see this very high slope, which is probably a high-velocity target, and then an inflection point, a probable change in speed. Now we've got yet another shot coming from the surface ship down to the submarine. It's using a different ID, also shared here, because I made this data specifically for this chart, and it looks like it's closing at relatively high speed, but it looks like it's going to miss. Unfortunately for the submarine, I made this data set.

So there's a course change and a speed change, that's the inflection point right there, and it's coming right up the sub's back, and there's not very much it can do. That little picture right there is very interesting: you see how there's sort of double green coming in here? That's where the data is so close together that validation is having trouble telling these targets apart, and that's the end of run. So what we saw there was surface data, which is fine; submarine data; multiple weapons; and the scenarios we have to deal with, where things are moving very fast. For instance, one of the validation parameters we have to keep track of is maximum speed. I'm not going to put that number on the screen, because it isn't unclassified, but it is editable at run time: you say, all right, one of my validation parameters is that this is the fastest I expect something to go. In this particular case I was using multiples of 10 m/s for everything on the screen, because it was easier for me to do the math that way, and if you looked at my notebook you'd see all sorts of charts as I was trying to figure out which way I wanted things to go. You can see we get a lot of data pollution very fast; look at that chart in the back, there's a lot of information pouring through the system, and I'm not zoomed in very far. When I was right in here, where these two guys were very close, you would have seen a little bit of jitter. That is considered acceptable, because I don't have data loss: when two targets are very close together, the tracks are going to affect each other a little bit, and that's acceptable, partly because they rarely get as close as I made them get, because the weapon will turn away. I couldn't model that, because I didn't want to sit there and do the geometry; it took me all day to do the data set as it is.

All right, back to the slideshow. That's the end of the demo and the end of the talk. The contact information is here: if you have critical feedback, please feel free to contact me here; if you would like to say something nice, this is the mailing address for the Public Affairs Office, and feel free to say "attention Captain Kramer, Dr. Cross is a really nice guy." That's all for today. Are there any questions? I have a few minutes. Thanks very much.

Yes, sir, you're asking about mammal impact. There are two groups that operate out of our office, and by the way, that's not my group; if you'd like detailed information, contact the Public Affairs Office and they'll give you far more. They publish quite a lot of information, it turns out. One of the groups is specifically mammal impact, and mammal impact could apply to any sort of operation, industrial or anything like that; if you think about China Lake, for instance, where they do actual missile impacts on dirt, there are plenty of mammals out there as well. But when we're talking about sea life, there's the mammal impact modeling team, and there are also the people who listen to the hydrophone data and track individual whales and their behavior. These installations, particularly AUTEC, the one in San Diego, and the one off Kauai in Hawaii, have gathered enormous amounts of mammal data; species we thought were almost extinct turned out to be as common as rats, and the behavior modeling we get out of that is fascinating, but it's not my area, so I can't really speak to it. So yes, contact the Public Affairs Office, or Google it. I'm sorry, your hand over here first.

Oh yes, everything here has Distribution Statement A on it, so it's cleared for public release; every algorithm I used and talked about today has been published and has been out there for quite a long time. We have civilian systems that want to do some sort of track accuracy work, and you can do that in Narragansett Bay, which is where I live, or at one of the other installations. You have to contact the range, and there will be a public affairs office there as well. I don't have that information in my head, but no joke, if you go to the NUWC site on navy.mil there are phone numbers and contact information. Yes, sir.

Say again? We have a variety of test methods; it's sort of a cascading scale. You have your basic unit tests. We have a bunch of test procedures using archived data or simulated data like this; usually those data sets were driven by scenarios we saw on site. And we have unclassified and classified labs in my building, classified no higher than the Secret level, which is my personal clearance, where we know what we expected to see versus what we saw instead. So there are multiple levels, and we like to have all the tests pass before anything goes out to the fleet. Sometimes that works great; usually what they do is find a new scenario. "When you have three weapons that come directly at each other, the track looks poor." Well, of course it does. How often? The time between, say, me writing new code and it actually being on range is no more than 90 days, so we're writing code all the time, driven by customer requirements. Right now we're primarily focused on the display side. You saw that XY plot; that's a JFreeChart plot, and I really like JFreeChart, but nothing about it is sexy when you're doing demos for the Admiral. Google Earth and such things look pretty, and they also give you a better geodetic display of what you're trying to show.

You, sir, asked about this particular team. My team is pretty large, and they keep putting more people on it; some of them are testers, which I think is great, I love testers. The core developers, the people who are still with the government, number about six or seven. Okay, the actual components: I work for the government, so nothing I say in the next few minutes is an endorsement of any particular vendor or supplier, et cetera. Not everything we use is open source, but we do use open source, partly because the price is right, and we do contribute to open source projects on the forge.mil side; if you have a CAC card, which is a military ID card, you can access that. The development process is a modified feature-driven development, where essentially we have a pool of cases that are prioritized. They happen to be in FogBugz, because that plugs into a very low-ceremony development process and we have a small number of people. There are two different repositories on the back end: Subversion is the older side, and the new systems are going into Kiln, which is a flavor of Mercurial that comes with our FogBugz license, which we paid money for; there was a competitive bid process. That's the repository side, and there's Eclipse and NetBeans, depending on which one you like best. Did I answer your question? Okay, good. It's Jenkins now; we were using CruiseControl for a long time.

Yes, sir, the actual incoming data: the transmitted data is ordered, so the first ping is ping 1, then there's ping 2, and it rolls over to ping 1 again afterward, because there are 16 of those. That's one of the things that allows me to correlate between two different hydrophones: this ping 1 and that ping 1 must be the same. If, for instance, I had a very rapid ping rate and very short what are called baselines, the distances between the hydrophones, I can get what's called a frame ambiguity, where it has rolled over too fast and now I don't know which ping 1 I'm talking about. But yes, there's a distinct order, and if you think about it, most of the time you're going from ping 1 to 16 at 1,500 meters per second, so we're talking upwards of 10 kilometers at that point; I can cover a pretty large footprint, and as I raise the ping rate, that area begins to neck down.

Okay, yes, sir, on the airborne side: I didn't show any of the airborne data. We're also responsible for taking in the radar data at AUTEC, and the systems at the other ranges also provide radar data. That's an older display system that doesn't pass through this particular code base, and those systems generally produce for me a TSPI, a time-space-position-information packet, directly.

They produce it however fast they want to; a lot of those were old missile-control radars running at upwards of 20 times a second, which is relatively low bandwidth compared to everything else we have to deal with. Yes, sir, the question is whether we would use machine learning. I would say machine learning, in an unsophisticated sense, is very much on our R&D plan, because what we would really like to do, and we've talked about this a lot, and it's not even a funding question, it's literally that we don't have enough minutes in the day to start pulling on this, is to use our years and years of archived data to say: this particular target was at this point; which phones could hear it? I'd like to be able to produce, per phone, a hearing volume, because, like I said, the bottom doesn't change very often, but over a period of 20 years it does change some. We had one hydrophone down at AUTEC fall into a hole or something like that, where literally its hearing volume changed; it fell into a ditch, or it got caught on a trawl line, something happened where all of a sudden it wasn't hearing things it should have been hearing. It had physically moved, so it had to be resurveyed, and it turned out that was the issue. We would like to be able to detect that. I would also like to be able to say: I have a GPS source for this particular surface target, I know it's pinging, these phones can't hear it, so something's wrong, or there's an occlusion. I have what's called a pinger pole, where I can temporarily mount a pinger in the water and put an acoustic tracking source on something for a little while; it might be on one side of the boat, so I can't hear it over there. Is that good? Sure. Yes, sir: we do, mostly on the positions. I would really like to look into Kalman filtering on the input, but we haven't done that yet. It's two o'clock, so you can go if you want to. Thanks very much for coming.