59. R pyinaturalist Sites/Apps Using iNaturalist’s API

https://forum.inaturalist.org/t/how-to-replicate-world-tour-figures/13109/8

e.g.:

The hard/inefficient part is positioning the markers on the map based on where the center of gravity of observations is for each observer. loarie describes generally how he’s doing it here: https://forum.inaturalist.org/t/china-inaturalist-world-tour/5440/4.

He starts by getting the top observers (http://api.inaturalist.org/v1/docs/#!/Observations/get_observations_observers), then downloads all observations in the area for each top observer, and then probably runs them through some algorithm in R to find that center of gravity.
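A minimal sketch of that last step, assuming the observations are already downloaded as (latitude, longitude) pairs. The “center of gravity” is taken here as the simple arithmetic mean of the coordinates; loarie’s actual algorithm may well differ:

```python
def center_of_gravity(coords):
    """Return the mean (lat, lon) of a list of (lat, lon) pairs."""
    if not coords:
        raise ValueError("no coordinates given")
    lats = [lat for lat, lon in coords]
    lons = [lon for lat, lon in coords]
    return (sum(lats) / len(lats), sum(lons) / len(lons))

# Example: three observations clustered around Amsterdam
obs = [(52.37, 4.89), (52.35, 4.91), (52.39, 4.90)]
lat, lon = center_of_gravity(obs)
print(round(lat, 2), round(lon, 2))  # → 52.37 4.9
```

A plain mean breaks down for observers whose observations straddle the antimeridian; a production version would need to handle that case.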

iNat Visualizations Using R

iNaturalist Visualization: Flexdashboard (R markdown)

https://forum.inaturalist.org/t/inaturalist-visualization-flexdashboard-r-markdown/13036

https://forum.inaturalist.org/t/tool-for-exporting-inaturalist-data-to-irecord/19160
https://forum.inaturalist.org/t/how-to-efficiently-revisit-locations-of-numerous-observations/17969/12

I agree; that was my conclusion as well. I had considered taking over development of rinat, but then realized that for any scientific work GBIF is a better source (more comprehensive data set, citeable data queries, …). As it exists, rinat is a simple repackaging of the iNat v1 API, without the more useful tools for working with the data post-download, and the more recent rOpenSci plans don’t go much further than adding the v2 API to the same framework. From following the user requests on this forum, there seems to be more interest in tools for “social” summaries (searching and/or displaying a user’s […]

  1. (5-10 minutes) custom searches using URL parameters
  2. (5 minutes) downloading data from iNaturalist via CSV export
  3. (10 minutes) downloading iNaturalist data from GBIF (including pros and cons)
  4. (5-15 minutes) demystifying the API – part 1 (general overview – what is an API, what kind of data can be accessed via iNaturalist’s APIs, and what are pros and cons of API vs other ways to get data?)
  5. (5-15 minutes) demystifying the API – part 2 (conceptual walkthrough getting data from the observation endpoint using the Swagger interface)
  6. (10-15 minutes) demystifying the API – part 3 (example – get 1000 observations into R)
  7. (10 minutes) demystifying the API – part 4 (example – get histogram data – or other aggregated data – into Excel)
  8. (10-30 minutes) demystifying the API – part 5a (example – producing a tiled map visualization in QGIS, including pulling data from GBIF, too)
  9. (5 minutes) demystifying the API – part 5b (example – producing and sharing a tiled map visualization in ArcGIS Online in under 5 minutes)
  10. (5-10 minutes) demystifying the API – part 6 (general best practices and limitations of the API)
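Part 3’s “get 1000 observations into R” translates to other languages as well. A hedged Python sketch that just builds the paged request URLs for GET /v1/observations (per_page caps at 200, so 1000 observations means five pages; the taxon and place IDs below are arbitrary examples, not from the tutorial):

```python
from urllib.parse import urlencode

API = "https://api.inaturalist.org/v1/observations"

def observations_url(page=1, per_page=200, **params):
    """Build a GET /observations request URL; per_page maxes out at 200."""
    query = {"page": page, "per_page": per_page, **params}
    return f"{API}?{urlencode(query)}"

# Five pages of 200 results each cover the "1000 observations" example
urls = [observations_url(page=p, taxon_id=3, place_id=6734)
        for p in range(1, 6)]
print(urls[0])
```

Each URL can then be fetched with any HTTP client; the response JSON has a top-level "results" list of observation records.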

Sources, Literature, References

https://forum.inaturalist.org/t/inaturalist-visualization-flexdashboard-r-markdown/13036
https://forum.inaturalist.org/t/inat-visualizations-using-r/13724
https://forum.inaturalist.org/t/wiki-external-code-tools-etc-for-working-with-inat/15906
https://github.com/oliverburrus/iNat_Visualizations
https://forum.inaturalist.org/t/how-to-efficiently-revisit-locations-of-numerous-observations/17969/12
https://forum.inaturalist.org/t/species-or-color-search-on-photos/17611
https://forum.inaturalist.org/t/inaturalist-computer-vision-for-video-game/18835/5
https://blog.nature.org/science/2015/08/03/wolf-coyote-coywolf-understanding-wolf-hybrids-just-got-a-bit-easier/



You can also specify multiple places and other parameters. For example, here are the top 10 reptile observers in the Netherlands, Belgium and Denmark:
https://jumear.github.io/stirfry/iNat_top_observers_map.html?place_id=7506,7008,8051&iconic_taxa=Reptilia
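The multi-place trick works because the iNat API accepts comma-separated lists for many parameters. A small helper to build such URLs (place IDs 7506/7008/8051 are Netherlands, Belgium and Denmark, as in the link above; note that `urlencode` percent-encodes the commas as %2C, which the API should accept):

```python
from urllib.parse import urlencode

def multi_param_url(base, **params):
    """Join list-valued parameters with commas, as the iNat API expects."""
    flat = {k: ",".join(map(str, v)) if isinstance(v, (list, tuple)) else v
            for k, v in params.items()}
    return f"{base}?{urlencode(flat)}"

url = multi_param_url(
    "https://api.inaturalist.org/v1/observations/observers",
    place_id=[7506, 7008, 8051],  # Netherlands + Belgium + Denmark
    iconic_taxa="Reptilia",
)
print(url)
```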

https://www.inaturalist.org/observations?place_id=6734
Singapore.
total observation count: 105,642
created month max count: 5,390
max created month: 2020-01-01

https://www.inaturalist.org/observations?place_id=6966
Indonesia.
total observation count: 162,069
created month max count: 9,644
max created month: 2019-08-01

https://www.inaturalist.org/observations?place_id=7190
Argentina.
total observation count: 220,423
created month max count: 27,039
max created month: 2020-04-01
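Figures like these can be reproduced from the GET /v1/observations/histogram endpoint (with date_field=created and interval=month), which returns a month-to-count mapping; the “max created month” is then just the argmax. A sketch using the Argentina numbers above (counts abbreviated to three months):

```python
def max_created_month(histogram):
    """Return (month, count) for the busiest month in a histogram result."""
    month = max(histogram, key=histogram.get)
    return month, histogram[month]

# Shape of the API's results["month"] payload (illustrative subset)
hist = {"2020-03-01": 18500, "2020-04-01": 27039, "2020-05-01": 21000}
print(max_created_month(hist))  # → ('2020-04-01', 27039)
```

The total observation count is simply `sum(hist.values())` over the full histogram.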

https://forum.inaturalist.org/t/how-to-efficiently-revisit-locations-of-numerous-observations/17969/12
https://www.inaturalist.org/observations/export



In November 2020 we updated the Explore pages for "All records" and "My records". You should find that these load much faster than the previous versions and are capable of displaying all records (so the default restriction to the latest month of records on the old All records page is no longer needed). Many features of the new pages are similar to the old, but there are some changes, described below. The old versions of the pages will remain available for some time, so they can still be accessed if needed.

In order to achieve the faster performance, we have to make a copy of the records in iRecord so that they can be mapped and queried using Elasticsearch technology. This means that when you add a record to iRecord, there will be a short delay before it appears on the My records and All records pages. This delay is usually around 15 minutes, but depends on how busy the system is.

The examples below are based on the “Explore – All records” part of iRecord, but most of this also applies to “Explore – My records”.

The Explore page layout
From the “Explore” menu at the top of the page, go to “All records”. You’ll see a map display, with records listed underneath. The main elements of the Explore pages are shown here:

https://www.brc.ac.uk/irecord/explore-and-filter
https://www.elastic.co/elasticsearch/
https://www.elastic.co/blog/introducing-elasticsearch-searchable-snapshots

Elasticsearch is a distributed, RESTful search and analytics engine capable of addressing a growing number of use cases. As the heart of the Elastic Stack, it centrally stores your data for lightning fast search, fine‑tuned relevancy, and powerful analytics that scale with ease.

Now this endpoint returns a TON of information, basically everything needed to produce what you see on an Observation page on iNaturalist. You’ll probably only want a small subset of this information. Here’s an example to pare it down to just the basics, to make it easier to look over:
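The quoted example itself isn’t reproduced in this excerpt, but paring a record down might look like the sketch below. The field names follow the v1 API’s observation JSON; the sample record is abbreviated and mocked:

```python
def pare_down(obs):
    """Keep only a few basic fields from a full v1 observation record."""
    return {
        "id": obs["id"],
        "observed_on": obs.get("observed_on"),
        "taxon": (obs.get("taxon") or {}).get("name"),
        "location": obs.get("location"),
        "user": (obs.get("user") or {}).get("login"),
    }

full = {
    "id": 1, "observed_on": "2020-05-01", "location": "52.37,4.89",
    "taxon": {"name": "Vulpes vulpes", "rank": "species"},
    "user": {"login": "ahospers", "icon_url": None},
    "identifications": [],  # ...plus dozens of other fields in a real record
}
print(pare_down(full))
```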

If you want to do any processing on that information, though, you’ll be better off using the iNaturalist API, specifically the GET /observations endpoint. You can see an example of some raw observation data from your project here.

I maintain pyinaturalist, which is a tool that helps work with this information in the python programming language. Here’s an example query to get the relevant observations from your project, with comments explaining what each parameter does:

from pyinaturalist.node_api import get_all_observations

observations = get_all_observations(
    project_id=36883,         # ID of the 'Sugarloaf Ridge State Park' project
    created_d1='2020-01-01',  # Get observations from January 2020...
    created_d2='2020-09-30',  # ...through September 2020 (adjust as needed)
    geo=True,                 # Only observations with geospatial coordinates
    geoprivacy='open',        # Only public coordinates (not obscured/private)
)
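Once downloaded, pulling the coordinates out for mapping is straightforward. In the v1 API each observation carries a geojson point in [longitude, latitude] order; the records below are mocked to show the shape:

```python
def to_lat_lon(observations):
    """Extract (lat, lon) pairs from v1 API observation records."""
    points = []
    for obs in observations:
        coords = (obs.get("geojson") or {}).get("coordinates")
        if coords:
            lon, lat = coords  # geojson order is [lon, lat]
            points.append((lat, lon))
    return points

mock = [
    {"id": 1, "geojson": {"coordinates": [-122.51, 38.44]}},
    {"id": 2, "geojson": None},  # missing coordinates are skipped
]
print(to_lat_lon(mock))  # → [(38.44, -122.51)]
```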
https://github.com/oliverburrus/iNat_Visualizations
https://forum.inaturalist.org/t/how-to-efficiently-revisit-locations-of-numerous-observations/17969/12


It seems like you would need something to detect individuals, and then something to classify each detected individual. Microsoft has MegaDetector (https://github.com/microsoft/CameraTraps/blob/master/megadetector.md), which might help with the first part, but I bet you would need to design your own way of addressing the second part. So you could train an AI to identify species of interest, or you could just do some sort of simple color analysis, maybe. (Microsoft does actually have a species classification model available: https://github.com/Microsoft/SpeciesClassification, but I don’t know if that model has been trained on your species of interest. You could also just use their framework on your own dataset, or do something else on your own. I bet there are other things available out there, not just stuff developed by Microsoft.)
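The “simple color analysis” fallback for the second stage could be as small as averaging the pixels inside each detected box. A toy sketch: the box format and the brightness rule are illustrative assumptions only, not any real detector’s output or a validated classifier:

```python
def mean_rgb(image, box):
    """Average RGB inside an (x0, y0, x1, y1) box of a row-major pixel grid."""
    x0, y0, x1, y1 = box
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def classify(image, box, threshold=128):
    """Toy rule: brighter crops → 'dog', darker → 'coyote' (illustrative)."""
    r, g, b = mean_rgb(image, box)
    return "dog" if (r + g + b) / 3 > threshold else "coyote"

# A 2x2 "image" of dark pixels, with one detection box covering all of it
img = [[(60, 50, 40), (70, 60, 50)],
       [(80, 70, 60), (90, 80, 70)]]
print(classify(img, (0, 0, 2, 2)))  # → coyote
```

Real coat colors overlap far too much for a threshold like this to separate coyotes from English Shepherds; it only illustrates where a learned classifier would slot into the two-stage pipeline.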

iNaturalist featured something about shedding mountain goats back in the day (https://www.inaturalist.org/blog/16807-mountain-goat-molts-inat-photos-and-climate-change), and there was also a thread about someone’s winter break project analyzing maple leaf color (https://forum.inaturalist.org/t/analyzing-red-maple-leaf-color-in-10-950-inaturalist-observations/9376). These seem somewhat similar to what you’re describing, so you might be able to reach out to those folks for more specific advice, if others don’t provide better answers in this thread. Or a quick trip or e-mail to your local university might also get you some quick answers (and possibly even some help with your effort)…
https://forum.inaturalist.org/t/species-or-color-search-on-photos/17611
https://forum.inaturalist.org/t/inaturalist-computer-vision-for-video-game/18835/5

FYI, this is for a capstone project for a Deep Learning certification. The problem I’m trying to solve is detecting Eastern Coyotes (coywolves) on my sheep farm and distinguishing them from my English Shepherds, who look quite similar. There is significant variation in the Eastern Coyote due to varying levels of interbreeding with wolves and dogs; here’s an article with a picture that shows the conundrum of trying to classify it cleanly: is that a wolf or a coyote?

https://blog.nature.org/science/2015/08/03/wolf-coyote-coywolf-understanding-wolf-hybrids-just-got-a-bit-easier/

  1. https://www.delpher.nl/nl/kranten/results?query=Roege Bos
  2. https://www.dekrantvantoen.nl
Posted on January 1, 2021, 09:12 PM by ahospers
