<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>orthoimagery - Jake Coppinger</title>
	<atom:link href="https://jakecoppinger.com/tag/orthoimagery/feed/" rel="self" type="application/rss+xml" />
	<link>https://jakecoppinger.com</link>
	<description>Jake Coppinger&#039;s blog and portfolio.</description>
	<lastBuildDate>Wed, 05 Apr 2023 03:58:55 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.5</generator>

<image>
	<url>https://jakecoppinger.com/wp-content/uploads/2024/07/cropped-closeup-headshot-jake-coppinger-2024-32x32.jpg</url>
	<title>orthoimagery - Jake Coppinger</title>
	<link>https://jakecoppinger.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Generating aerial imagery with your iPhone&#8217;s LiDAR sensor</title>
		<link>https://jakecoppinger.com/2023/03/generating-aerial-imagery-with-your-iphones-lidar-sensor/</link>
					<comments>https://jakecoppinger.com/2023/03/generating-aerial-imagery-with-your-iphones-lidar-sensor/#comments</comments>
		
		<dc:creator><![CDATA[Jake C]]></dc:creator>
		<pubDate>Mon, 13 Mar 2023 21:00:00 +0000</pubDate>
				<category><![CDATA[Infrastructure]]></category>
		<category><![CDATA[Maps]]></category>
		<category><![CDATA[Sydney]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[Urbanism]]></category>
		<category><![CDATA[3d scanner app]]></category>
		<category><![CDATA[geotiff]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[lidar]]></category>
		<category><![CDATA[odm]]></category>
		<category><![CDATA[openaerialmap]]></category>
		<category><![CDATA[opendronemap]]></category>
		<category><![CDATA[openstreetmap]]></category>
		<category><![CDATA[orthoimagery]]></category>
		<category><![CDATA[orthophoto]]></category>
		<category><![CDATA[OSM]]></category>
		<category><![CDATA[polycam]]></category>
		<category><![CDATA[qgis]]></category>
		<category><![CDATA[raster]]></category>
		<category><![CDATA[sydney]]></category>
		<guid isPermaLink="false">https://jakecoppinger.com/?p=495</guid>

					<description><![CDATA[<p>This technical guide details how you can create your own aerial imagery and 3D models of streets with the built-in iPhone LiDAR sensor and open-source tools in the OpenDroneMap package.</p>
<p>The post <a href="https://jakecoppinger.com/2023/03/generating-aerial-imagery-with-your-iphones-lidar-sensor/">Generating aerial imagery with your iPhone’s LiDAR sensor</a> first appeared on <a href="https://jakecoppinger.com">Jake Coppinger</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>This technical guide details how you can create your own aerial imagery (aka satellite view/<a href="https://twitter.com/btaylor/status/1099370126678253569" target="_blank" rel="noreferrer noopener">bird mode</a>/orthorectified imagery) and 3D models of streets with the built-in iPhone LiDAR sensor (iPhone 12 Pro or later, 2020+ iPad Pro) and open-source tools in the <a href="https://www.opendronemap.org/" target="_blank" rel="noopener" title="">OpenDroneMap</a> package. All you need to do to capture the model is walk around with your iPhone at ground level.</p>



<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="1024" height="683" src="https://jakecoppinger.com/wp-content/uploads/2023/03/id-editor-open-aerial-map-1024x683.jpg" alt="" class="wp-image-534" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/id-editor-open-aerial-map-1024x683.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/id-editor-open-aerial-map-300x200.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/id-editor-open-aerial-map-768x513.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/id-editor-open-aerial-map-1536x1025.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2023/03/id-editor-open-aerial-map-2048x1367.jpg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption>The pedestrianised Margaret Street, Sydney with temporary treatment. Imagery captured with a handheld iPhone 14 Pro. Imagery at <a href="https://map.openaerialmap.org/#/-18.6328125,18.562947442888312,3/latest/" target="_blank" rel="noopener" title="">https://map.openaerialmap.org/#/-18.6328125,18.562947442888312,3/latest/</a></figcaption></figure>



<figure class="wp-block-image size-large"><img decoding="async" width="1024" height="682" src="https://jakecoppinger.com/wp-content/uploads/2023/03/image-1024x682.png" alt="" class="wp-image-529" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/image-1024x682.png 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/image-300x200.png 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/image-768x511.png 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/image-1536x1022.png 1536w, https://jakecoppinger.com/wp-content/uploads/2023/03/image.png 1600w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption>Image by City of Sydney. The George Street Pride flag project is part of the NSW Government’s Streets as Shared Spaces program. <a href="https://www.cityofsydney.nsw.gov.au/improving-streets-public-spaces/closure-george-street-north" target="_blank" rel="noopener" title="">https://www.cityofsydney.nsw.gov.au/improving-streets-public-spaces/closure-george-street-north</a></figcaption></figure>



<div class="wp-block-aioseo-table-of-contents"><ul><li><a href="#aioseo-why-and-how">Why is this useful?</a></li><li><a href="#aioseo-overview">Process overview</a></li><li><a href="#aioseo-capturing-the-model">Capturing the model</a></li><li><a href="#aioseo-exporting-and-preparing-the-model">Exporting and preparing the model</a><ul><li><a href="#aioseo-rotating-the-model-into-the-correct-orientation">Rotating the model into the correct orientation (required for 3d Scanner App)</a></li></ul></li><li><a href="#aioseo-generating-the-raster-orthophoto">Generating the raster orthophoto</a><ul><li><a href="#aioseo-installing-webodm-locally">Installing WebODM locally</a></li><li><a href="#aioseo-copying-the-textured-object-into-the-odm-docker-container">Copying the object into the ODM Docker container</a></li><li><a href="#aioseo-running-odm_orthophoto">Running odm_orthophoto</a></li><li><a href="#aioseo-exporting-the-orthophoto-out-of-the-docker-container">Exporting the orthophoto out of the Docker container</a></li></ul></li><li><a href="#aioseo-georeferencing-the-orthophoto">Georeferencing the orthophoto</a><ul><li><a href="#aioseo-export-georeferenced-geotiff-without-worldfile">Export geo-referenced GeoTIFF (without worldfile)</a></li></ul></li><li><a href="#aioseo-uploading-to-openaerialmap">Uploading to OpenAerialMap</a></li><li><a href="#aioseo-limitations">Limitations</a></li><li><a href="#aioseo-future-work">Future work</a></li></ul></div>



<h1 class="wp-block-heading" id="aioseo-why-and-how">Why is this useful?</h1>



<p>Usually you would capture such imagery with a drone and process it with <a href="https://opendronemap.org/webodm/" target="_blank" rel="noopener" title="">WebODM</a> (or <a href="https://www.pix4d.com/" target="_blank" rel="noopener" title="">Pix4D</a>), but some areas are unsafe or illegal to fly in. I&#8217;ve previously detailed how to generate imagery <a href="https://jakecoppinger.com/2022/12/creating-aerial-imagery-with-a-bike-helmet-camera-and-opendronemap/" target="_blank" rel="noopener" title="">using a bicycle helmet mounted GoPro camera</a>, however that method can produce artifacts where there are lots of people. It also requires a decent GPS lock (unsuitable indoors, in dense urban areas, or under bridges) and captures relatively low detail.</p>



<p>Again, why might you want to do this? With your own high-detail, up-to-date models and street imagery you could:</p>



<ul class="wp-block-list"><li>Map new street interventions, like bollards, modal filters or raised crossings</li><li>Record pothole locations (and their depth!)</li><li>Take measurements such as road and cycleway widths around crowds of people in urban centres</li><li>Measure footpath obstructions in 3D and rate pedestrian amenity</li><li>Survey features underneath large highways</li><li>Survey street parking using the new OSM spec: <a href="https://wiki.openstreetmap.org/wiki/Street_parking" target="_blank" rel="noreferrer noopener">wiki.openstreetmap.org/wiki/Street_parking</a></li><li>Map indoor pedestrian areas in OpenStreetMap for better pedestrian routing<ul><li>The Transport for NSW Connected Journeys Data team is currently doing a fair bit of this work: <a href="https://www.openstreetmap.org/changeset/133107592" target="_blank" rel="noopener" title="">https://www.openstreetmap.org/changeset/133107592</a></li></ul></li><li>Attach your iPhone to your bike and generate LiDAR point clouds of the kerb and cycleway infrastructure (it works, just go slow!)</li></ul>



<p>This method produces very high detail 3D models (5mm resolution if desired) and accurate orthoimagery. Manual georeferencing is required (which I also explain how to do), which limits confidence in the alignment. This is a proof of concept &#8211; if you have corrections/suggestions/ideas to improve the method, please comment below or on Mastodon!</p>



<p>Note: This method also provides a solution to <a href="https://community.opendronemap.org/t/creating-2-5d-oblique-orthophoto/13579">creating 2.5D oblique orthophotos</a> from drone imagery.</p>




<h1 class="wp-block-heading" id="aioseo-overview">Process overview</h1>



<p>This guide covers how to:</p>



<ul class="wp-block-list"><li>Capture a 3D model using <a href="https://3dscannerapp.com/" target="_blank" rel="noopener" title="">3d Scanner App</a> (recommended) or <a href="https://poly.cam/" target="_blank" rel="noopener" title="">Polycam</a><ul><li>The iPhone LiDAR sensor has a 5 metre maximum range, so you&#8217;ll need to walk around</li></ul></li><li>Export the model to an <code>.obj</code> file with textures</li><li>Rotate the model in Blender into the required orientation</li><li>Use the <code>odm_orthophoto</code> program inside the OpenDroneMap Docker container to generate a raster <code>.tiff</code></li><li>Georeference the <code>.tiff</code> using QGIS</li><li>Upload the GeoTIFF to OpenAerialMap to generate a tileset, viewable in the OpenStreetMap <a href="https://github.com/openstreetmap/iD" target="_blank" rel="noopener" title="">iD editor</a> or a Felt map with a custom layer</li></ul>



<h1 class="wp-block-heading" id="aioseo-capturing-the-model">Capturing the model</h1>



<p>Capturing a 3D model on a supported iPhone is easy. I recommend the app titled <code>3d Scanner App</code> as it allows considerable customisation of the scan settings. It also allows finishing a scan and extending it later, though this can be buggy. I haven&#8217;t had 3d Scanner App crash during capture, but Polycam once crashed halfway through a large scan, losing all data.</p>



<p>Download 3d Scanner App and use the LiDAR Advanced mode. I recommend the following options for scanning streets:</p>



<ul class="wp-block-list"><li>Confidence to low. This extends the range of the LiDAR sensor readings used at the expense of more noise; you can clean up this noise in the processing settings or in Blender.</li><li>Range to 5.0 metres</li><li>Masking to None</li><li>Resolution to 50mm (the coarsest setting &#8211; best for large models like streets)</li></ul>



<p>In the app settings, make sure to set:</p>



<ul class="wp-block-list"><li>GPS tag scans to ON</li><li>Units to metric</li></ul>



<p>When scanning a street, walk (or cycle) slowly with a sweeping motion to increase the captured width. If the area is wide enough to require a grid pattern, follow the same shape as a drone survey (an S-shape with considerable overlap). With insufficient overlap or at higher speeds, the linear passes don&#8217;t connect correctly, due to (I assume) inertial measurement unit drift. I&#8217;m unsure if the GPS information is used in the sensor fusion (<a href="https://developer.apple.com/augmented-reality/arkit/" target="_blank" rel="noopener" title="">via ARKit</a>) &#8211; please comment if you know!</p>



<figure class="wp-block-image size-large"><img decoding="async" width="472" height="1024" src="https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model-472x1024.jpg" alt="" class="wp-image-517" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model-472x1024.jpg 472w, https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model-138x300.jpg 138w, https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model-768x1665.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model-709x1536.jpg 709w, https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model-945x2048.jpg 945w, https://jakecoppinger.com/wp-content/uploads/2023/03/3d-scanner-app-model.jpg 1179w" sizes="(max-width: 472px) 100vw, 472px" /><figcaption>View of the completed model</figcaption></figure>



<h1 class="wp-block-heading" id="aioseo-exporting-and-preparing-the-model">Exporting and preparing the model</h1>



<p>In the <code>3d Scanner App</code> use the Share button, then select the <code>.obj</code> file type. Send this to your computer (AirDrop works great if using macOS). <em>If using <a href="https://poly.cam/" target="_blank" rel="noopener" title="">Polycam</a>, set &#8220;Z axis up&#8221; in the mesh export settings</em>.</p>



<h2 class="wp-block-heading" id="aioseo-rotating-the-model-into-the-correct-orientation">Rotating the model into the correct orientation (required for 3d Scanner App)</h2>



<p>Unfortunately the <code>3d Scanner App</code> exports objects with the Z axis as &#8220;up&#8221;, while the <code>odm_orthophoto</code> program expects the Y axis to be &#8220;up&#8221;. <em>Confusingly, you can skip this step when using <a href="https://poly.cam/" target="_blank" rel="noopener" title="">Polycam</a> if you export with &#8220;Z axis up&#8221; in the mesh export settings, though Blender shows the Y axis as up in this export. If you know why this is, please leave a comment!</em></p>
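<p>A quick way to check which axis is &#8220;up&#8221; in any given <code>.obj</code> export is to compare the vertex extents: for a street scan, the up axis should have by far the smallest extent. This is my own heuristic, not part of the ODM tooling; the three-vertex sample file below is a stand-in so the snippet runs as-is &#8211; point awk at your real export instead.</p>

```shell
# Stand-in .obj with three "v x y z" vertex lines (replace with your export)
printf 'v 0 0 0\nv 10 0.2 8\nv 5 0.1 4\n' > sample.obj

# Print the min-to-max extent of each axis; the smallest is most likely "up"
awk '/^v /{for(i=2;i<=4;i++){if(!(i in min)||$i<min[i])min[i]=$i;if(!(i in max)||$i>max[i])max[i]=$i}}END{printf "X extent: %g\nY extent: %g\nZ extent: %g\n",max[2]-min[2],max[3]-min[3],max[4]-min[4]}' sample.obj
```

<p>In this stand-in, Y has the smallest extent &#8211; which is what <code>odm_orthophoto</code> expects after the rotation step below.</p>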



<p>To rotate the model, import it to Blender and rotate it 90 degrees.</p>



<ul class="wp-block-list"><li>First, install Blender via your preferred method (<a href="https://www.blender.org/download/" target="_blank" rel="noopener" title="">https://www.blender.org/download/</a>).</li><li>Open Blender, delete the initial default cube (right click -&gt; delete, or <code>x</code> hotkey)</li><li>Import the <code>.obj</code> file: File -&gt; Import -&gt; Wavefront (.obj)</li><li>(optional: you can view the pretty texture by selecting &#8220;viewport shading&#8221; in the top right (the horizontal list of sphere icons))</li></ul>



<figure class="wp-block-image size-full"><img loading="lazy" decoding="async" width="934" height="468" src="https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-12-at-20.44.19.png" alt="" class="wp-image-518" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-12-at-20.44.19.png 934w, https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-12-at-20.44.19-300x150.png 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-12-at-20.44.19-768x385.png 768w" sizes="auto, (max-width: 934px) 100vw, 934px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="661" src="https://jakecoppinger.com/wp-content/uploads/2023/03/model-in-blender-before-rotate-1024x661.jpg" alt="" class="wp-image-519" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/model-in-blender-before-rotate-1024x661.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/model-in-blender-before-rotate-300x194.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/model-in-blender-before-rotate-768x496.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/model-in-blender-before-rotate-1536x991.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2023/03/model-in-blender-before-rotate-2048x1321.jpg 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>Model appearing in correct orientation in Blender, before rotating for export</figcaption></figure>



<ul class="wp-block-list"><li>To rotate:<ul><li>Click the object and make sure it is selected (orange border)</li><li>Press hotkey&nbsp;<code>r</code>&nbsp;(from any view)</li><li>Press&nbsp;<code>x</code>&nbsp;to only allow rotation on the X axis</li><li>Type&nbsp;<code>90</code>&nbsp;(or the desired degrees to rotate)</li></ul></li><li>Optional: You can check if the rotation is correct by pressing numpad key 1. If you don&#8217;t have a numpad you will need to enable numpad emulation (see instructions at <a href="https://www.hack-computer.com/post/how-to-emulate-a-third-mouse-button-and-keypad-for-blender" target="_blank" rel="noopener" title="">https://www.hack-computer.com/post/how-to-emulate-a-third-mouse-button-and-keypad-for-blender</a>).<ul><li>The rotation is correct if you have a &#8220;bird&#8217;s eye view&#8221; in the numpad key 1 view, where the blue Z axis is towards the top of the screen and the red X axis is towards the right of the screen</li></ul></li></ul>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="661" src="https://jakecoppinger.com/wp-content/uploads/2023/03/blender-view-after-rotating-1024x661.jpg" alt="" class="wp-image-521" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/blender-view-after-rotating-1024x661.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/blender-view-after-rotating-300x194.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/blender-view-after-rotating-768x496.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/blender-view-after-rotating-1536x991.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2023/03/blender-view-after-rotating-2048x1321.jpg 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>Correct orientation for export to <code>odm_orthophoto</code>. Note the axis display at the top right.</figcaption></figure>



<ul class="wp-block-list"><li>File -&gt; Export as an <code>.obj</code> <strong>to the same folder</strong> with a new name (eg. <code>blender_export.obj</code>)<ul><li>Note: Blender doesn&#8217;t create a new texture <code>.jpg</code>. If you export to a different folder the path to the <code>.jpg</code> in the <code>.mtl</code> file will need updating.</li></ul></li></ul>



<h1 class="wp-block-heading" id="aioseo-generating-the-raster-orthophoto">Generating the raster orthophoto</h1>



<p>Use the <code>odm_orthophoto</code> command line tool to generate a raster orthophoto from a <code>.obj</code> file. This tool is available at <a href="https://github.com/OpenDroneMap/odm_orthophoto" target="_blank" rel="noopener" title="">https://github.com/OpenDroneMap/odm_orthophoto</a> but has a considerable number of dependencies.</p>



<p>I believe the easiest method currently is to install WebODM locally, copy the <code>.obj</code> and texture files (<code>.mtl</code> and <code>.jpg</code>) into the Docker container and then run the program from inside the Docker container.</p>



<h2 class="wp-block-heading" id="aioseo-installing-webodm-locally">Installing WebODM locally</h2>



<p>Running the software using Docker is a breeze. Install Docker from <a href="https://www.docker.com/" target="_blank" rel="noreferrer noopener">https://www.docker.com/</a> (or your preferred method) and then:</p>



<pre class="wp-block-code"><code>git clone https://github.com/OpenDroneMap/WebODM --config core.autocrlf=input --depth 1
cd WebODM
./webodm.sh start </code></pre>



<p>See <a href="https://github.com/OpenDroneMap/WebODM#getting-started" target="_blank" rel="noreferrer noopener">https://github.com/OpenDroneMap/WebODM#getting-started</a> for more details. WebODM itself is excellent and great fun if you have a drone!</p>



<h2 class="wp-block-heading" id="aioseo-copying-the-textured-object-into-the-odm-docker-container">Copying the object into the ODM Docker container</h2>



<p>You can start a shell in the container with the following command:</p>



<pre class="wp-block-code"><code>docker exec -it webodm_node-odm_1 /bin/bash</code></pre>



<p>Make a new directory to keep your files in:</p>



<pre class="wp-block-code"><code>mkdir /iphone_model
cd /iphone_model</code></pre>



<p>In another shell, copy the object and texture files from your local machine into the new Docker container folder. <code>docker cp</code> can only copy one file at a time.</p>



<pre class="wp-block-code"><code>cd path/to/your/model/
docker cp blender_export.obj webodm_node-odm_1:/iphone_model/
docker cp blender_export.mtl webodm_node-odm_1:/iphone_model/
# Note: The Blender .obj export doesn't create a new texture .jpg.
#   If your Blender export wasn't in the same directory, check and
#   update the path in blender_export.mtl
docker cp textured_output.jpg webodm_node-odm_1:/iphone_model/</code></pre>
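<p>Since <code>docker cp</code> takes one file at a time, a small loop saves retyping. Shown as a dry run (the <code>echo</code> just prints each command); drop the <code>echo</code> to actually copy:</p>

```shell
# Print (dry run) the docker cp command for each of the three files;
# remove `echo` to perform the copies into the container
for f in blender_export.obj blender_export.mtl textured_output.jpg; do
  echo docker cp "$f" webodm_node-odm_1:/iphone_model/
done
```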



<h2 class="wp-block-heading" id="aioseo-running-odm_orthophoto">Running <code>odm_orthophoto</code></h2>



<p>In the shell you started inside the Docker container above, run the following command:</p>



<pre class="wp-block-code"><code>cd /iphone_model/
/code/SuperBuild/install/bin/odm_orthophoto -inputFiles blender_export.obj -logFile log.txt -outputFile orthophoto.tif -resolution 100.0 -outputCornerFile corners.txt</code></pre>



<p>The <code>resolution</code> argument sets the number of pixels per metre &#8211; this may require changing for your model.</p>
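<p>As a back-of-envelope sanity check on that value (the 50&nbsp;m &#215; 20&nbsp;m scan footprint here is hypothetical): 100 pixels per metre is a 10&nbsp;mm ground sample distance, so the output raster would be roughly:</p>

```shell
# -resolution 100 px/m => 1 px covers 10 mm on the ground
RES=100; WIDTH_M=50; DEPTH_M=20   # scan footprint is hypothetical
echo "$((WIDTH_M * RES)) x $((DEPTH_M * RES)) px"   # prints: 5000 x 2000 px
```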



<h2 class="wp-block-heading" id="aioseo-exporting-the-orthophoto-out-of-the-docker-container">Exporting the orthophoto out of the Docker container</h2>



<p>To copy the generated orthophoto out, from a shell on your local machine run:</p>



<pre class="wp-block-code"><code>docker cp webodm_node-odm_1:/iphone_model/orthophoto.tif .</code></pre>



<p>Use a similar command (e.g. <code>docker cp webodm_node-odm_1:/iphone_model/log.txt .</code>) to extract the log file if required.</p>



<h1 class="wp-block-heading" id="aioseo-georeferencing-the-orthophoto">Georeferencing the orthophoto</h1>



<p>Georeferencing is the process of specifying the location and orientation of the image so it aligns correctly with other maps in GIS software. <em>While a rough location (with a moderately incorrect rotation) is stored in the model, it appears to be removed by the Blender rotation step. If you know how to fix this please comment below!</em></p>



<p>To do this:</p>



<ul class="wp-block-list"><li>Install QGIS by your preferred method: <a href="https://www.qgis.org/en/site/forusers/download.html" target="_blank" rel="noopener" title="">https://www.qgis.org/en/site/forusers/download.html</a></li><li>Install the plugins (via the Plugins -&gt; Manage &amp; Install plugins&#8230; menu)<ul><li>QuickMapServices (to pull in Bing satellite imagery easily)</li><li>Freehand raster georeferencer (a beginner-friendly georeferencing tool)</li></ul></li><li>Add a Bing satellite base layer: Web -&gt; QuickMapServices -&gt; Bing -&gt; Bing Satellite<ul><li>Feel free to choose another satellite background</li><li>If you&#8217;re in NSW, the NSW LPI Imagery is likely the most detailed: <a href="https://www.spatial.nsw.gov.au/products_and_services/web_services/qgis" target="_blank" rel="noopener" title="">https://www.spatial.nsw.gov.au/products_and_services/web_services/qgis</a></li></ul></li><li>Zoom &amp; pan to the rough location of the 3D scan (the initial <code>.tif</code> location will be wherever you&#8217;re viewing)</li><li>Drag the <code>.tif</code> output by the previous step into the sidebar (it won&#8217;t be visible yet as it is not aligned)</li><li>Go to Raster -&gt; Freehand raster georeferencer -&gt; Add raster for freehand georeferencing and select the same <code>.tif</code></li><li>Use the Move, Rotate and Scale buttons in the toolbar to align your orthophoto with the imagery background (tip: hold Cmd or Ctrl <em>before</em> scaling to keep the aspect ratio)</li></ul>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="206" src="https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-11-at-14.14.48-1024x206.png" alt="" class="wp-image-512" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-11-at-14.14.48-1024x206.png 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-11-at-14.14.48-300x60.png 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-11-at-14.14.48-768x154.png 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/Screen-Shot-2023-03-11-at-14.14.48.png 1046w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>Buttons to move/scale/rotate</figcaption></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="791" src="https://jakecoppinger.com/wp-content/uploads/2023/03/aligning-tiff-1024x791.jpg" alt="" class="wp-image-522" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/aligning-tiff-1024x791.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/aligning-tiff-300x232.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/aligning-tiff-768x593.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/aligning-tiff-1536x1186.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2023/03/aligning-tiff-2048x1581.jpg 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>Aligned to the nearby buildings</figcaption></figure>



<ul class="wp-block-list"><li>Click the &#8220;Export raster with world file&#8221; button (Green on the right with exclamation marks).</li><li>Check the &#8220;Only export world file for chosen raster&#8221; button. <strong>Make sure to do this before chosing the image path.</strong></li><li>Select the existing <code>.tif</code> image and press OK</li><li>Remove the orthophoto from the QGIS sidebar (right click -&gt; remove layer)</li><li>Drag the existing <code>.tif</code> image back into the sidebar. QGIS will now find the worldfiles next to it (<code>orthophoto.tif.aux.xml</code> and <code>orthophoto.tfw</code>) so it will be positioned in the right place</li></ul>



<h2 class="wp-block-heading" id="aioseo-export-georeferenced-geotiff-without-worldfile">Export geo-referenced GeoTIFF (without worldfile)</h2>



<p>If you would like to upload the GeoTIFF to OpenAerialMap or elsewhere, you will need to &#8220;bake&#8221; the location into the GeoTIFF itself rather than keeping it in the worldfile &#8211; OpenAerialMap can&#8217;t read the worldfile.</p>



<p>To do this:</p>



<ul class="wp-block-list"><li>Right click your <code>orthophoto</code> layer (after the above steps) and click Export -&gt; Save As&#8230;</li><li>Set <code>CRS</code> to your desired coordinate system (if not yet in a coordinate system, <a href="https://gis.stackexchange.com/questions/48949/epsg-3857-or-4326-for-googlemaps-openstreetmap-and-leaflet" target="_blank" rel="noopener" title="">I assume you should use <strong><code>EPSG 3857</code></strong> if you want it to be aligned with OpenStreetMap tiles</a>, but this is the limit of my current understanding &#8211; I haven&#8217;t studied surveying yet!)</li><li>To avoid confusion, create a new subfolder and save with the default settings (e.g. make a folder <code>qgis_export</code> and save as <code>orthophoto.tif</code>)</li><li>You now have a georeferenced GeoTIFF!</li></ul>
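<p>If you prefer the command line, GDAL is an alternative to the QGIS export step: <code>gdal_translate</code> picks up the worldfile automatically when it opens the <code>.tif</code>, and <code>-a_srs</code> stamps the CRS into the output. This is a hedged sketch I haven&#8217;t verified against this exact workflow (it assumes GDAL is installed), so it&#8217;s shown as a dry run &#8211; remove the <code>echo</code> to execute:</p>

```shell
# Dry run: print the command that would bake the worldfile + CRS into
# a standalone GeoTIFF (assumes GDAL is installed; drop `echo` to run)
mkdir -p qgis_export
echo gdal_translate -a_srs EPSG:3857 orthophoto.tif qgis_export/orthophoto.tif
```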



<h1 class="wp-block-heading" id="aioseo-uploading-to-openaerialmap">Uploading to OpenAerialMap</h1>



<p>If you want the imagery to be publicly viewable and accessible from the OpenStreetMap iD editor, OpenAerialMap is a free place to host your imagery.</p>



<p>This is the imagery from the above example: <a href="https://map.openaerialmap.org/#/-18.6328125,18.562947442888312,3/latest/" target="_blank" rel="noopener" title="">https://map.openaerialmap.org/#/-18.6328125,18.562947442888312,3/latest/</a></p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="675" src="https://jakecoppinger.com/wp-content/uploads/2023/03/open-aerial-map-margaret-st-1024x675.jpg" alt="" class="wp-image-553" srcset="https://jakecoppinger.com/wp-content/uploads/2023/03/open-aerial-map-margaret-st-1024x675.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2023/03/open-aerial-map-margaret-st-300x198.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2023/03/open-aerial-map-margaret-st-768x506.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2023/03/open-aerial-map-margaret-st.jpg 1500w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<p>I&#8217;ve heard of plans for a relaunch of the website, but currently the upload form can be finicky.</p>



<ul class="wp-block-list"><li>Open the explore page: <a href="https://map.openaerialmap.org/" target="_blank" rel="noopener" title="">https://map.openaerialmap.org/</a></li><li>Sign in (only Google &amp; Facebook OAuth supported)</li><li>Press upload<ul><li>Currently uploading from a local file doesn&#8217;t appear to work; see <a href="https://github.com/hotosm/OpenAerialMap/issues/158" target="_blank" rel="noopener" title="">https://github.com/hotosm/OpenAerialMap/issues/158</a> for updates</li><li>Uploading via Google Drive with my account (2FA enabled, G Suite) fails with &#8220;This app is blocked: This app tried to access sensitive info in your Google Account. To keep your account safe, Google blocked this access.&#8221;<ul><li>Enabling less secure apps is not possible for 2FA accounts. Otherwise, if you&#8217;re comfortable turning it off, you can do that here: <a href="https://myaccount.google.com/lesssecureapps" target="_blank" rel="noopener" title="">https://myaccount.google.com/lesssecureapps</a></li></ul></li><li>Using a URL is likely the only way. Creating an S3 bucket is one option; if you have a fast connection it may be quicker to run a local webserver with Python and use <a href="https://ngrok.com/download" target="_blank" rel="noopener" title="">ngrok</a> to make it publicly available. I recommend not keeping this server running for longer than necessary. E.g.:</li></ul></li></ul>



<pre class="wp-block-code"><code>cd qgis_export
python3 -m http.server 8080
ngrok http 8080
# Your file is now available at https://SOME_PATH.ngrok.io/orthophoto.tif</code></pre>



<p>Specify this URL in the form, add the other details, then press upload.</p>



<h1 class="wp-block-heading" id="aioseo-limitations">Limitations</h1>



<ul class="wp-block-list"><li>Manual alignment limits the real-world accuracy of the imagery</li><li>Drift occurs during long model captures. My understanding is that drift is worse when there are sudden or fast movements. The 3d Scanner App unfortunately doesn&#8217;t warn you when you&#8217;re moving too fast, but Polycam does. As far as I know, ARKit on iOS doesn&#8217;t attempt to reconcile drift when completing a loop/circuit.</li></ul>



<h1 class="wp-block-heading" id="aioseo-future-work">Future work</h1>



<ul class="wp-block-list"><li>Automation! This process is slow but it works.<ul><li>Adding a Makefile or other compile tooling to <a href="https://github.com/OpenDroneMap/odm_orthophoto" target="_blank" rel="noopener" title="">https://github.com/OpenDroneMap/odm_orthophoto</a> would skip the requirement to install WebODM and transfer files to/from the Docker container</li><li>Rotating the model could be added (behind a flag to be backwards compatible) to the odm_orthophoto script</li></ul></li><li>Generating pointclouds (supported by 3d Scanner App) and then exporting as a raster from CloudCompare. This might make larger captures possible.<ul><li>If there is a way of addressing drift of pointclouds for multiple captures &#8211; let me know how!</li></ul></li><li>Georeferencing using ground control points rather than a freehand referencer</li><li>Creating street facade montages and evaluating doors &amp; soft edges (Jan Gehl (1986) “Soft edges” in residential streets, Scandinavian Housing and Planning Research,3:2,89-102, DOI: <a href="https://doi.org/10.1080/02815738608730092">10.1080/02815738608730092</a>)</li></ul>



<p>Let me know if you have any corrections/suggestions/feedback!</p><p>The post <a href="https://jakecoppinger.com/2023/03/generating-aerial-imagery-with-your-iphones-lidar-sensor/">Generating aerial imagery with your iPhone’s LiDAR sensor</a> first appeared on <a href="https://jakecoppinger.com">Jake Coppinger</a>.</p>]]></content:encoded>
					
					<wfw:commentRss>https://jakecoppinger.com/2023/03/generating-aerial-imagery-with-your-iphones-lidar-sensor/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
		<item>
		<title>Creating aerial imagery with a bike helmet camera (GoPro) and OpenDroneMap</title>
		<link>https://jakecoppinger.com/2022/12/creating-aerial-imagery-with-a-bike-helmet-camera-and-opendronemap/</link>
					<comments>https://jakecoppinger.com/2022/12/creating-aerial-imagery-with-a-bike-helmet-camera-and-opendronemap/#comments</comments>
		
		<dc:creator><![CDATA[Jake C]]></dc:creator>
		<pubDate>Sat, 10 Dec 2022 09:08:15 +0000</pubDate>
				<category><![CDATA[Cycling]]></category>
		<category><![CDATA[Infrastructure]]></category>
		<category><![CDATA[Side project]]></category>
		<category><![CDATA[Tech]]></category>
		<category><![CDATA[Urbanism]]></category>
		<category><![CDATA[360 degree camera]]></category>
		<category><![CDATA[bike]]></category>
		<category><![CDATA[cycleways]]></category>
		<category><![CDATA[cycling]]></category>
		<category><![CDATA[gopro]]></category>
		<category><![CDATA[helmet]]></category>
		<category><![CDATA[openaerialmap]]></category>
		<category><![CDATA[opendronemap]]></category>
		<category><![CDATA[openstreetmap]]></category>
		<category><![CDATA[orthoimagery]]></category>
		<category><![CDATA[OSM]]></category>
		<category><![CDATA[photogrammetry]]></category>
		<category><![CDATA[spherical]]></category>
		<guid isPermaLink="false">https://jakecoppinger.com/?p=293</guid>

					<description><![CDATA[<p>This technical guide details how you can create your own orthorectified (aka satellite view/bird mode) imagery, point clouds and 3D models of streets with nothing but a 360 degree camera mounted on a bicycle helmet, and the open source photogrammetry software OpenDroneMap.</p>
<p>The post <a href="https://jakecoppinger.com/2022/12/creating-aerial-imagery-with-a-bike-helmet-camera-and-opendronemap/">Creating aerial imagery with a bike helmet camera (GoPro) and OpenDroneMap</a> first appeared on <a href="https://jakecoppinger.com">Jake Coppinger</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><em>See comments on <a href="https://news.ycombinator.com/item?id=33947618" target="_blank" rel="noopener" title="Hacker News (#3, 11 comments)">Hacker News (24 comments, 220 points)</a></em></p>



<p>This technical guide details how you can create your own orthorectified (aka satellite view/<a href="https://twitter.com/btaylor/status/1099370126678253569" target="_blank" rel="noopener" title="">bird mode</a>) imagery, point clouds and 3D models of streets with nothing but a 360 degree camera mounted on a bicycle helmet, and the open source photogrammetry software <a href="https://opendronemap.org/" target="_blank" rel="noopener" title="">OpenDroneMap</a>.</p>



<figure class="wp-block-image alignwide size-large"><img loading="lazy" decoding="async" width="1024" height="772" src="https://jakecoppinger.com/wp-content/uploads/2022/12/blender-perspective-1024x772.jpg" alt="" class="wp-image-356" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/blender-perspective-1024x772.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/blender-perspective-300x226.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/blender-perspective-768x579.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/blender-perspective-1536x1158.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/blender-perspective-2048x1544.jpg 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<p>Why might you want to do this? With your own up-to-date and highly detailed orthorectified imagery you could:</p>



<ul class="wp-block-list"><li>quantify and communicate inefficient road space allocation</li><li>record necessary infrastructure repairs</li><li>take measurements such as lane and cycleway widths</li><li>measure footpath obstructions in 3D and rate pedestrian amenity</li><li>map kerb features on OpenStreetMap</li><li>survey street parking using the new OSM spec: <a href="https://wiki.openstreetmap.org/wiki/Street_parking" target="_blank" rel="noopener" title="">wiki.openstreetmap.org/wiki/Street_parking</a></li><li>3D print a model of your home street!</li></ul>




<iframe style="width:100%;max-width:100%;height:70vh" src="https://sketchfab.com/models/85937287d282425c86cd53ae85fbec35/embed?autostart=1" class=" alignfull" frameborder="0"></iframe>



<p></p>



<figure class="wp-block-image alignwide size-large"><img loading="lazy" decoding="async" width="1024" height="738" src="https://jakecoppinger.com/wp-content/uploads/2022/12/id-editor-portman-st-1024x738.png" alt="" class="wp-image-355" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/id-editor-portman-st-1024x738.png 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/id-editor-portman-st-300x216.png 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/id-editor-portman-st-768x553.png 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/id-editor-portman-st-1536x1107.png 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/id-editor-portman-st-2048x1476.png 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>Viewing orthoimagery generated with a GoPro in the OpenStreetMap ID editor</figcaption></figure>



<figure class="wp-block-video alignwide"><video controls src="https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cyclepath-v2.mp4"></video><figcaption>Making measurements in WebODM. Generating 2d elevation profiles is also possible.</figcaption></figure>



<p>Drones are great for surveying and mapping things on the kerb (parking/public space/road widths/building shadows) but there are a lot of places you can&#8217;t fly a drone.</p>



<p><a href="https://opendronemap.org/" target="_blank" rel="noopener" title="">OpenDroneMap</a> is usually used to combine a set of geotagged drone images into one coherent 3d model, but it can also be used with <em>any</em> geotagged images.</p>



<p>Roughly the process is as follows:</p>



<ul class="wp-block-list"><li>Take spherical images on a 360 degree camera</li><li>Import the files from the camera</li><li>Optional: convert the photos into equirectangular spherical images (for GoPro: convert from the proprietary GoPro format to standard spherical images using GoPro Fusion Studio v1.2)</li><li>Start/log in to WebODM, upload the images (<em>with the fisheye lens setting</em> for 180 degree images or <em>spherical</em> for equirectangular 360 degree images) and generate the imagery &amp; model</li></ul>



<p>Please comment any projects you make after reading this guide, or if you have any questions! There are certainly issues with this process, which I&#8217;ve noted under Limitations.</p>



<h1 class="wp-block-heading">Table of Contents</h1>



<div class="wp-block-aioseo-table-of-contents"><ul><li><a href="#aioseo-aquiring-a-camera">Buying a 360 degree camera</a></li><li><a href="#aioseo-gopro-fusion-specific-tips">GoPro Fusion specific tips</a></li><li><a href="#aioseo-downloading-the-images">Preprocessing the images</a></li><li><a href="#aioseo-generating-the-model-and-orthorectified-imagery-with-web-opendronemap">Generating the model and orthorectified imagery with Web OpenDroneMap</a><ul><li><a href="#aioseo-setting-up-webodm">Setting up WebODM locally</a></li><li><a href="#aioseo-optional-adding-webodm-lightening-as-a-processing-node">Optional: Adding WebODM Lightning as a processing node</a></li><li><a href="#aioseo-generating-the-model">Generating the model &#038; selecting the correct options</a></li><li><a href="#aioseo-output-accuracy-taking-measurements">Output accuracy: taking measurements</a></li></ul></li><li><a href="#aioseo-optional-create-equirectangular-photos-rather-than">Alternative: Generating and processing equirectangular spherical images</a></li><li><a href="#aioseo-limitations">Limitations</a></li><li><a href="#aioseo-appendix-things-that-didnt-work">Appendix: Things that didn&#8217;t work</a><ul><li><a href="#aioseo-removing">Removing helmet artifacts by cropping</a></li><li><a href="#aioseo-removing-helmet-artifacts-by-adding-a-mask">Removing helmet artifacts by adding a mask</a></li></ul></li><li><a href="#aioseo-further-research-to-be-done">Further research/experimentation</a></li><li><a href="#aioseo-prior-art">Prior art</a></li></ul></div>



<h2 class="wp-block-heading" id="aioseo-aquiring-a-camera">Buying a 360 degree camera</h2>



<p>360 degree cameras can be very expensive (the ones mounted on Google Street View cars are <a href="https://www.quora.com/How-much-does-Google-Street-View-cost-Google-to-operate" target="_blank" rel="noopener" title="">possibly $45,000!</a>), but GoPro makes <em>relatively</em> affordable models. See <a href="https://help.mapillary.com/hc/en-us/articles/115001465989-About-360-cameras" target="_blank" rel="noopener" title="">https://help.mapillary.com/hc/en-us/articles/115001465989-About-360-cameras</a> for more suitable cameras. One common second hand (and now unsupported) model is the GoPro Fusion, which I&#8217;ll base this guide on.</p>



<p>Note: Seth Deegan has written a PowerShell script for using Insta360 cameras with ODM. I haven&#8217;t tried it but it seems worth a look: <a href="https://github.com/lectrician1/Insta360-2-ODM" target="_blank" rel="noopener" title="">https://github.com/lectrician1/Insta360-2-ODM</a></p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="872" height="1024" src="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-11.09.23-pm-872x1024.png" alt="" class="wp-image-303" srcset="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-11.09.23-pm-872x1024.png 872w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-11.09.23-pm-255x300.png 255w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-11.09.23-pm-768x902.png 768w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-11.09.23-pm-1308x1536.png 1308w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-11.09.23-pm.png 1328w" sizes="auto, (max-width: 872px) 100vw, 872px" /></figure>



<p>To generate a detailed model you will need to capture enough images close together. You could mount it to the top of your car, but unless you bought an expensive elevated mount much of the image would just be your car roof!</p>



<p>Some advantages of taking imagery from a bicycle:</p>



<ul class="wp-block-list"><li>The unobstructed field of view is larger so ODM gets more data for a better model</li><li>The average speed is lower permitting more photos (the maximum self timer frequency of this camera is 2 shots/second)</li><li>You can capture images from different places in the street for more data, such as the road, footpath and bike lanes</li><li>If you&#8217;re putting in this much effort to study public space and urban planning you probably like bikes <img src="https://s.w.org/images/core/emoji/16.0.1/72x72/1f642.png" alt="🙂" class="wp-smiley" style="height: 1em; max-height: 1em;" /></li></ul>



<p>You&#8217;ll need to buy a helmet mount for the GoPro or your chosen camera. GoPro makes a bicycle helmet mount: <a href="https://gopro.com/en/us/shop/mounts-accessories/vented-helmet-strap-mount/GVHS30.html" target="_blank" rel="noopener" title="">https://gopro.com/en/us/shop/mounts-accessories/vented-helmet-strap-mount/GVHS30.html</a> (though many non-branded mounts exist on eBay/Amazon too). Be aware that a camera mounted on a bicycle helmet is possibly a safety risk to yourself if you crash &#8211; be careful.</p>



<p>If you have the ability to mount the camera on a long pole above your head, this will increase the unobstructed field of view and improve the perspective of the street which will improve the results. I imagine this would attract even more attention! If you have tips for building rigs like this please leave a comment. Andrew Harvey wrote an OpenStreetMap diary entry on his setup here: <a href="https://www.openstreetmap.org/user/aharvey/diary/42139" target="_blank" rel="noopener" title="">https://www.openstreetmap.org/user/aharvey/diary/42139</a></p>



<h2 class="wp-block-heading" id="aioseo-gopro-fusion-specific-tips">GoPro Fusion specific tips</h2>



<p>Set the camera to:</p>



<ul class="wp-block-list"><li>Timelapse photo mode (icon of camera &amp; timer circle)</li><li>Image frequency to 0.5 seconds</li><li>Enable GPS geotagging</li></ul>



<p>Mount the camera to the helmet, wait until the GPS location icon turns solid, and start capturing! Sometimes the first few shots don&#8217;t have any location in the EXIF data, but this doesn&#8217;t seem to confuse WebODM. You can check this with <code>identify -verbose image</code> or your metadata viewer of choice.</p>



<h1 class="wp-block-heading" id="aioseo-downloading-the-images">Preprocessing the images</h1>



<p>The GoPro Fusion will output two 180 degree &#8220;fisheye&#8221; images to two separate SD cards. Copy all of these images into a folder on your computer.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="2560" height="2474" data-id="359" src="https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-scaled.jpg" alt="" class="wp-image-359" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-scaled.jpg 2560w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-300x290.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-1024x990.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-768x742.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-1536x1485.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277-2048x1979.jpg 2048w" sizes="auto, (max-width: 2560px) 100vw, 2560px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="2560" height="2474" data-id="358" src="https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-scaled.jpg" alt="" class="wp-image-358" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-scaled.jpg 2560w, https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-300x290.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-1024x990.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-768x742.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-1536x1485.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/GF010277-2048x1979.jpg 2048w" sizes="auto, (max-width: 2560px) 100vw, 2560px" /></figure>
<figcaption class="blocks-gallery-caption">Front and back images</figcaption></figure>



<p>For an alternate method, see instructions under the generating equirectangular image heading.</p>



<p>Unfortunately, only the &#8220;front&#8221; images are geotagged, so you&#8217;ll need to copy the EXIF data from each &#8220;front&#8221; image to each similarly named &#8220;back&#8221; image.</p>



<p>A quick and dirty way of doing this is generating a list of terminal commands, that copies all exif tags from each &#8220;front&#8221; to each &#8220;back&#8221; photo:</p>



<p><code>ls -l | grep "GF" | sed -E "s/^.* GF(.*).JPG*/exiftool -overwrite_original_in_place -gps:all -tagsFromFile GF\1.JPG GB\1.JPG/g" &gt; script.sh</code></p>
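<p>The same copy can also be written as a plain shell loop instead of generating a script. This is a sketch: it assumes the GF*/GB* naming convention above and that <code>exiftool</code> is installed.</p>

```shell
# For each geotagged "front" image (GF*.JPG), copy its GPS tags to the
# matching "back" image (GB*.JPG), e.g. GF010277.JPG -> GB010277.JPG.
for front in GF*.JPG; do
  back="GB${front#GF}"   # swap the GF prefix for GB
  if [ -f "$back" ]; then
    exiftool -overwrite_original_in_place -tagsFromFile "$front" -gps:all "$back"
  fi
done
```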



<p>I&#8217;m sure there are better and easier ways of doing this, please let me know if you come up with one.</p>



<h1 class="wp-block-heading" id="aioseo-generating-the-model-and-orthorectified-imagery-with-web-opendronemap">Generating the model and orthorectified imagery with Web OpenDroneMap</h1>



<p>WebODM (<a href="https://opendronemap.org/" target="_blank" rel="noreferrer noopener">https://opendronemap.org</a>) is an absolute marvel of open source engineering. Unfortunately, generating 3D models takes some serious computing horsepower. You can either use the paid cloud version (WebODM Lightning) at <a href="https://webodm.net/" target="_blank" rel="noopener" title="">https://webodm.net/</a> (fast) or run the software on your own computer (a few hours/overnight/days depending on the number of images).</p>



<p>You can upload images directly to WebODM Lightning for processing; however, you don&#8217;t get some friendly/useful features like a browsable map and 3D model viewer. I recommend setting up WebODM locally and adding WebODM Lightning as a &#8220;processing node&#8221;, so you get the power of the cloud and the extra features of WebODM.</p>



<h2 class="wp-block-heading" id="aioseo-setting-up-webodm">Setting up WebODM locally</h2>



<p>Running the software using Docker is a breeze. Install Docker from <a href="https://www.docker.com/" target="_blank" rel="noopener" title="">https://www.docker.com/</a> (or your preferred method), allocate as much memory &amp; CPUs as you can, and then:</p>



<pre class="wp-block-code"><code>git clone https://github.com/OpenDroneMap/WebODM --config core.autocrlf=input --depth 1
cd WebODM
./webodm.sh start </code></pre>



<p>See <a href="https://github.com/OpenDroneMap/WebODM#getting-started" target="_blank" rel="noopener" title="">https://github.com/OpenDroneMap/WebODM#getting-started</a> for more details, including GPU acceleration.</p>



<p>You&#8217;ll now be able to open WebODM at <a href="http://localhost:8000" target="_blank" rel="noopener" title="">http://localhost:8000</a></p>



<h2 class="wp-block-heading" id="aioseo-optional-adding-webodm-lightening-as-a-processing-node">Optional: Adding WebODM Lightning as a processing node</h2>



<p>Click Lightning Network in the WebODM sidebar and log in.</p>



<h2 class="wp-block-heading" id="aioseo-generating-the-model">Generating the model &amp; selecting the correct options</h2>



<p>Add a new project, click &#8220;Select images &amp; GCP&#8221;, select your images, and you will see options for processing your imagery. If you&#8217;re using WebODM Lightning, make sure to set the Processing Node appropriately.</p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="421" src="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.30.39-pm-1024x421.png" alt="" class="wp-image-296" srcset="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.30.39-pm-1024x421.png 1024w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.30.39-pm-300x123.png 300w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.30.39-pm-768x316.png 768w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.30.39-pm.png 1368w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<p></p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="759" src="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.47.37-pm-1024x759.png" alt="" class="wp-image-300" srcset="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.47.37-pm-1024x759.png 1024w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.47.37-pm-300x222.png 300w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.47.37-pm-768x569.png 768w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.47.37-pm-1536x1139.png 1536w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.47.37-pm.png 1538w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<p>Some options I&#8217;ve found are required:</p>



<p>If you are using the raw 180 degree &#8220;fisheye&#8221; images straight out of the GoPro Fusion, set <code>camera-lens</code> to <strong>fisheye</strong> (this is critical). If you are using equirectangular images set to <code><strong>spherical</strong></code>. WebODM seems to be unable to auto-identify the lens type in both cases.</p>



<ul class="wp-block-list"><li>enable <code>sky removal</code> &#8211; this uses AI to mask out the sky in each frame so that there are fewer artifacts in the final model</li><li>enable <code>auto-boundary</code></li><li>enable <code>bg-removal</code></li></ul>



<p>This is just from my trial and error, there may be better option configurations.</p>



<p>You will likely need to enable &#8220;resize images&#8221; to 2000px or it is very easy to run out of memory.</p>
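<p>If you&#8217;d rather downscale before uploading (to save bandwidth and upload time), ImageMagick can do roughly the same resize on your own machine. A sketch, assuming your originals are in a hypothetical <code>./input/</code> directory; resized copies go to <code>./resized/</code> and the originals are untouched:</p>

```shell
# Write copies bounded to 2000px on the longest side into ./resized.
# The ">" suffix in ImageMagick geometry means "only shrink, never enlarge".
mkdir -p resized
mogrify -monitor -path ./resized -resize '2000x2000>' ./input/*.JPG
```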



<p>Though it is out of scope for this article, you can also set up a VPS instance to speed up the process if you don&#8217;t want to use the hosted cloud processing tool. I&#8217;ve tried this, but it&#8217;s probably more effort than it&#8217;s worth unless you&#8217;re making a business out of this.</p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="619" src="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.33.27-pm-1024x619.png" alt="" class="wp-image-297" srcset="https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.33.27-pm-1024x619.png 1024w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.33.27-pm-300x181.png 300w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.33.27-pm-768x464.png 768w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.33.27-pm-1536x929.png 1536w, https://jakecoppinger.com/wp-content/uploads/2022/11/Screen-Shot-2022-11-17-at-10.33.27-pm-2048x1238.png 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption>64 cores and 124GB of RAM!</figcaption></figure>



<p>Even with an EC2 instance with 124GB of RAM and 324 images at 5760 × 2180 pixels I ran out of memory &#8211; remember to enable some swap space if you want to run at full size: <a href="https://www.digitalocean.com/community/tutorials/how-to-add-swap-space-on-ubuntu-18-04" target="_blank" rel="noopener" title="">https://www.digitalocean.com/community/tutorials/how-to-add-swap-space-on-ubuntu-18-04</a></p>
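<p>For reference, the linked guide boils down to a few commands. A sketch: run as root, and pick a swap size that fits your disk (16G here is just an example).</p>

```shell
# Create a 16GB swap file and enable it (Ubuntu; mirrors the guide above).
fallocate -l 16G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
# To keep it across reboots, append to /etc/fstab:
# /swapfile none swap sw 0 0
```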



<h2 class="wp-block-heading" id="aioseo-output-accuracy-taking-measurements">Output accuracy: taking measurements</h2>



<p>OpenDroneMap can make measurements of the 3D model. I&#8217;ve found these are usually quite accurate; they may have small &#8220;measurement&#8221; errors due to artifacts or distortion but will tend towards the correct value &#8211; GPS coordinates ensure the scale is correct.</p>



<p>For example, measuring the width of the green paint of the cycle path:</p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="699" src="https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cycle-path-compressed-1024x699.jpg" alt="" class="wp-image-378" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cycle-path-compressed-1024x699.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cycle-path-compressed-300x205.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cycle-path-compressed-768x524.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cycle-path-compressed-1536x1049.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cycle-path-compressed-2048x1399.jpg 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<p></p>



<figure class="wp-block-gallery alignwide has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="857" data-id="381" src="https://jakecoppinger.com/wp-content/uploads/2022/12/large-path-1-1024x857.jpg" alt="" class="wp-image-381" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/large-path-1-1024x857.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/large-path-1-300x251.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/large-path-1-768x643.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/large-path-1-1536x1286.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/large-path-1.jpg 2000w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="380" src="https://jakecoppinger.com/wp-content/uploads/2022/12/small-path-1024x768.jpg" alt="" class="wp-image-380" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/small-path-1024x768.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/small-path-300x225.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/small-path-768x576.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/small-path-1536x1152.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/small-path.jpg 2000w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<p>Large path:</p>



<ul class="wp-block-list"><li>OpenDroneMap: 4.54 metres</li><li>Tape measure: 4.56 metres</li></ul>



<p>Small path:</p>



<ul class="wp-block-list"><li>OpenDroneMap: 1.89 metres (though this varies slightly due to distortions)</li><li>Tape measure: 1.89 metres</li></ul>



<p></p>



<h1 class="wp-block-heading" id="aioseo-optional-create-equirectangular-photos-rather-than">Alternative: Generating and processing equirectangular spherical images</h1>



<p>I&#8217;m currently unclear whether this method is faster or produces better results than the raw 180 degree images. I&#8217;m very interested to hear if you have experience (and I&#8217;ll update this if I get more evidence). I believe the GoPro Max generates these images <em>in camera</em>, so this step won&#8217;t be required.</p>



<p>If you use the GoPro Fusion: unfortunately, because this camera is no longer supported, GoPro has discontinued the software: <a href="https://gopro.com/en/au/news/fusion-end-of-life" target="_blank" rel="noopener" title="">https://gopro.com/en/au/news/fusion-end-of-life</a></p>



<p>You&#8217;ll (unfortunately) need to use the proprietary GoPro Fusion Studio software to generate spherical images. I&#8217;ve found that on an M1 Mac (Monterey) the only version that still runs is 1.2; 1.4 just crashes. The least dodgy download I can find is <a href="https://macdownload.informer.com/fusion-studio/1.2/" target="_blank" rel="noopener" title="">https://macdownload.informer.com/fusion-studio/1.2/</a>.</p>



<p>If you make an open source solution (maybe building from <a href="https://stackoverflow.com/questions/37796911/is-there-a-fisheye-or-dual-fisheye-to-equirectangular-filter-for-ffmpeg" target="_blank" rel="noopener" title="">https://stackoverflow.com/questions/37796911/is-there-a-fisheye-or-dual-fisheye-to-equirectangular-filter-for-ffmpeg</a>) please let me know! Edit: Looks like this repo will do it. <a href="https://github.com/trek-view/fusion2sphere" target="_blank" rel="noopener" title="">https://github.com/trek-view/fusion2sphere</a></p>



<p>This camera has two SD card slots &#8211; one for the front facing camera, and one for the back facing camera. Sometimes only one camera works for part of the shoot and the proprietary software refuses to stitch any of the photos at all! The only way I&#8217;ve found to combat this is to format the card in camera before each shoot. Spending more on a GoPro Max may solve a lot of headaches!</p>



<h1 class="wp-block-heading" id="aioseo-limitations">Limitations</h1>



<p>Little helmets appear! I&#8217;ve tried some techniques but haven&#8217;t found a proper solution that doesn&#8217;t degrade the model/orthophoto structure.</p>



<h1 class="wp-block-heading" id="aioseo-appendix-things-that-didnt-work">Appendix: Things that didn&#8217;t work</h1>



<h2 class="wp-block-heading" id="aioseo-removing">Removing helmet artifacts by cropping</h2>



<p>The <code>convert</code> and <code>mogrify</code> tools (part of the amazing open source <code>ImageMagick</code> suite) can crop the spherical photos <em>and retain the geographical information!</em> The cropping works great, but OpenDroneMap doesn&#8217;t seem to be able to understand a cropped equirectangular image. If you&#8217;d like to try to get this working, here are the steps I took.</p>



<p>First install ImageMagick (<code>sudo apt-get install imagemagick</code> on Linux/<code>brew install imagemagick</code> on macOS).</p>



<p>The general syntax is:<br><code>convert -crop {x_size}x{y_size}+{x_offset}+{y_offset} inputfile outputfile</code></p>



<p>See <a href="https://deparkes.co.uk/2015/04/30/batch-crop-images-with-imagemagick/" target="_blank" rel="noopener" title="">https://deparkes.co.uk/2015/04/30/batch-crop-images-with-imagemagick/</a> for more detail and helpful diagrams.</p>



<p>Helpfully, the helmet artifact is always at the bottom of the frame, so the x offset and y offset will both be zero.</p>



<p>To crop 700 pixels off the bottom of a spherical image, you can use</p>



<p><code>convert -crop 5760x"$((2880-700))"+0+0 inputfile outputfile</code></p>



<p>To batch convert a number of files, where the input files are in <code>./input/</code>, and you have made an output directory <code>./output/</code>, you can use:</p>



<p><code>mogrify -monitor -crop 5760x"$((2880-700))"+0+0 -path ./output ./input/*</code></p>
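<p>The geometry string used in both commands above can be computed once and reused; a minimal sketch (the image dimensions assume the GoPro Fusion&#8217;s 5760&#215;2880 equirectangular output, and the 700&nbsp;px crop amount is just the example value from above):</p>

```shell
# Build the ImageMagick crop geometry: {width}x{height}+{x_offset}+{y_offset}.
# Cropping off the bottom keeps the top of the frame, so both offsets are 0.
WIDTH=5760    # equirectangular width
HEIGHT=2880   # equirectangular height
CROP=700      # pixels to remove from the bottom of the frame
GEOMETRY="${WIDTH}x$((HEIGHT - CROP))+0+0"
echo "$GEOMETRY"

# With ImageMagick installed, the batch crop would then be:
# mogrify -monitor -crop "$GEOMETRY" -path ./output ./input/*
```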



<h2 class="wp-block-heading" id="aioseo-removing-helmet-artifacts-by-adding-a-mask">Removing helmet artifacts by adding a mask</h2>



<p>Adding custom-drawn masks (covering either just the helmet, or the helmet plus the surrounding black &#8220;void&#8221;) either:</p>



<ul class="wp-block-list"><li>removed the helmets but caused extra &#8220;holes&#8221;/missing segments in the ground</li><li>didn&#8217;t remove the helmets at all</li></ul>



<p>I think this may be due to clashing behaviour of the <code>bg-removal</code> and <code>sky-removal</code> flags with manual masks.</p>



<p>ODM requires a mask image for every image with <code>_mask.{EXT}</code> as a suffix (where <code>EXT</code> replaces the original extension).</p>
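<p>A quick sketch of that naming convention (the filenames here are hypothetical examples; the loop only derives the mask name ODM would look for next to each image):</p>

```shell
# For each source image, print the mask filename ODM expects:
# the base name plus a "_mask" suffix, keeping the image's extension.
for f in input/GB010284.jpg input/GB010277.jpg; do
  name=$(basename "$f")    # e.g. GB010284.jpg
  ext="${name##*.}"        # original extension, e.g. jpg
  base="${name%.*}"        # filename without extension, e.g. GB010284
  echo "${base}_mask.${ext}"
done
```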



<p>The two mask types I tried:</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="2560" height="2474" data-id="360" src="https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-scaled.jpg" alt="" class="wp-image-360" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-scaled.jpg 2560w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-300x290.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-1024x990.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-768x742.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-1536x1485.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010284_mask-2048x1979.jpg 2048w" sizes="auto, (max-width: 2560px) 100vw, 2560px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="2560" height="2474" data-id="361" src="https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-scaled.jpg" alt="" class="wp-image-361" srcset="https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-scaled.jpg 2560w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-300x290.jpg 300w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-1024x990.jpg 1024w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-768x742.jpg 768w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-1536x1485.jpg 1536w, https://jakecoppinger.com/wp-content/uploads/2022/12/GB010277_mask-2048x1979.jpg 2048w" sizes="auto, (max-width: 2560px) 100vw, 2560px" /></figure>
</figure>



<p>If you know what may be happening please let me know!</p>



<h1 class="wp-block-heading" id="aioseo-further-research-to-be-done">Further research/experimentation</h1>



<ul class="wp-block-list"><li>How many &#8220;capture lines&#8221; per street are necessary to make a decent model? I didn&#8217;t have any luck with one but it may have been my settings. I used ~4 for the above model.</li><li>How much better are models when the camera is on a stick?</li><li>Improved WebODM settings for generation</li></ul>



<h1 class="wp-block-heading" id="aioseo-prior-art">Prior art</h1>



<ul class="wp-block-list"><li>&#8220;A Satellite in Your Pocket: Ground Based Action Cameras to Create Aerial Perspective for OSM Editing&#8221;. This was recorded at OpenStreetMap US: Connect 2020 by Sean Gorman. The company Pixel8 doesn&#8217;t appear to exist any more.<br><a href="https://www.youtube.com/watch?v=tfab-iuWlsQ" target="_blank" rel="noopener" title="">https://www.youtube.com/watch?v=tfab-iuWlsQ</a></li><li>Twitter thread by <a href="https://twitter.com/klaskarlsson">@klaskarlsson</a> (@klaskarlsson@fosstodon.org on Mastodon) on creating a model of his house using a &#8220;GoPro on a stick&#8221;: https://twitter.com/klaskarlsson/status/1583401741386936320</li><li>&#8220;<a href="https://community.opendronemap.org/t/create-aerial-imagery-base-on-360-pictures/12339">Create aerial imagery base on 360° pictures</a>&#8221; &#8211; discussion in the OpenDroneMap community: <a href="https://community.opendronemap.org/t/create-aerial-imagery-base-on-360-pictures/12339" target="_blank" rel="noopener" title="">https://community.opendronemap.org/t/create-aerial-imagery-base-on-360-pictures/12339</a></li></ul>



<p></p><p>The post <a href="https://jakecoppinger.com/2022/12/creating-aerial-imagery-with-a-bike-helmet-camera-and-opendronemap/">Creating aerial imagery with a bike helmet camera (GoPro) and OpenDroneMap</a> first appeared on <a href="https://jakecoppinger.com">Jake Coppinger</a>.</p>]]></content:encoded>
					
					<wfw:commentRss>https://jakecoppinger.com/2022/12/creating-aerial-imagery-with-a-bike-helmet-camera-and-opendronemap/feed/</wfw:commentRss>
			<slash:comments>17</slash:comments>
		
		<enclosure url="https://jakecoppinger.com/wp-content/uploads/2022/12/measuring-cyclepath-v2.mp4" length="3108898" type="video/mp4" />

			</item>
	</channel>
</rss>
