# Coordinate Systems
## Introduction
Because the MICrONs data comes in many different representations, dealing with coordinates is not entirely simple, and it is easy to make mistakes when converting between coordinate systems. Three main coordinate systems wind up getting used:
**Voxel coordinates** are the coordinates of a point in the original image volume. These are the coordinates used to index into the volumes you can see in Neuroglancer, but each axis has a potentially different unit. In the MICrONs data, a voxel is 4 nm wide in the x and y directions and 40 nm long in the z direction, so a 1×1×1 micron cube corresponds to a 250×250×25 span of voxels. Annotations (such as synapses) are stored in voxel coordinates.
**Nanometer coordinates** are the coordinates of a point in the original image volume, expressed in nanometers. This is the voxel coordinate multiplied by the voxel resolution, with no further transformation applied. Mesh and skeleton vertices are stored in nanometer coordinates.
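Converting a voxel coordinate to nanometers is just an elementwise multiplication by the per-axis voxel resolution. A minimal sketch with numpy, using the MICrONs resolution values stated above (the point itself is a made-up example):

```python
import numpy as np

# MICrONs voxel resolution in nm per voxel: 4 nm in x and y, 40 nm in z
resolution = np.array([4, 4, 40])

# a point in voxel coordinates (hypothetical example values)
pt_voxel = np.array([100_000, 80_000, 20_000])

# nanometer coordinates: voxel coordinates scaled by the per-axis resolution
pt_nm = pt_voxel * resolution
print(pt_nm)  # [400000 320000 800000]
```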
**Transformed coordinates** reflect a transformation applied to the original image volume: a rotation to make the pia surface as flat as possible, a translation to move the pial surface to y=0, and a scaling to bring coordinates into microns. Transformed coordinates are convenient for more accurate computations of depth along the pia-to-white-matter axis, but are not stored by default. The python package `standard_transform` helps convert data to and from transformed coordinates.
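Conceptually, this kind of transformation is an affine map: rotate, translate, and scale. A toy numpy sketch of the same sequence of operations (the angle and offset here are made-up illustration values, not the actual MICrONs transform parameters, which live inside `standard_transform`):

```python
import numpy as np

def to_transformed(pts_nm, angle_deg=5.0, y_offset_nm=400_000.0):
    """Toy pia-flattening transform: rotate about the z axis, shift a
    hypothetical pia surface to y=0, and convert nm to microns.
    All parameters are illustrative, not the real MICrONs values."""
    theta = np.radians(angle_deg)
    # rotation in the x-y plane (about the z axis)
    rot = np.array([
        [np.cos(theta), -np.sin(theta), 0.0],
        [np.sin(theta),  np.cos(theta), 0.0],
        [0.0,            0.0,           1.0],
    ])
    rotated = pts_nm @ rot.T
    # translate so the (hypothetical) pia surface sits at y = 0
    rotated[:, 1] -= y_offset_nm
    # scale nm -> microns
    return rotated / 1000.0

pts = np.array([[500_000.0, 450_000.0, 800_000.0]])
print(to_transformed(pts))
```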
> **Important**
>
> In all of these coordinate systems (including Neuroglancer), the y axis *increases* with depth. This is the standard convention when working with images, but it is the opposite of what you usually expect when plotting points. Because of that, when plotting annotations or neuroanatomy in matplotlib, you will usually have to invert the y axis with `ax.invert_yaxis()`.
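A minimal illustration of the flip (assuming matplotlib is installed; the points are arbitrary):

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen; not needed in a notebook
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.scatter([0, 1], [0, 100])  # y here is depth, e.g. in microns
ax.invert_yaxis()             # depth now increases downward, matching the data convention

# after inversion, the y limits are reversed (top of the plot is shallow)
lo, hi = ax.get_ylim()
print(lo > hi)  # True
```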
## Standard Transform
Standard Transform is a Python package designed to convert voxel and nanometer coordinates to transformed coordinates, with particular emphasis on the MICrONs data.
### Installation
Standard Transform can be installed from pip:

```
pip install standard_transform
```
### Why use Standard Transform?
Let’s look at the coordinates of every excitatory neuron in the MICrONs data to see why we might want to use transformed coordinates.
```python
import matplotlib.pyplot as plt
import seaborn as sns
from caveclient import CAVEclient

client = CAVEclient('minnie65_public')

# set version, for consistency across time
client.materialize.version = 1078  # Current as of Summer 2024

ct_df = client.materialize.query_table('aibs_metamodel_celltypes_v661', split_positions=True)

# convert to nanometers
ct_df['pt_position_x_nm'] = ct_df['pt_position_x'] * 4
ct_df['pt_position_y_nm'] = ct_df['pt_position_y'] * 4
ct_df['pt_position_z_nm'] = ct_df['pt_position_z'] * 40

fig, ax = plt.subplots(figsize=(5, 4))
sns.scatterplot(
    x='pt_position_x_nm',
    y='pt_position_y_nm',
    s=3,
    data=ct_df.sample(10_000),  # a random sample of 10,000 points, enough to see the shape of the data
    ax=ax,
)
# Important to flip the y axis so that y increases with depth!
ax.invert_yaxis()
sns.despine(ax=ax)
```
This is just the raw positions converted to nanometers. You can see a few aspects that make this hard to work with. First, if you look along the top of the data, the pia surface is not flat; in fact, there is a roughly 5 degree slope from left to right. Second, the pia surface sits at around y = 0.4 × 10⁶ nm. Not only are these numbers large, but the offset is arbitrary, and it would make much more sense to anchor y = 0 to the pial surface.
Let’s see how we can do this with `standard_transform`.
```python
import numpy as np
from standard_transform import minnie_ds

X_transformed = minnie_ds.transform_vx.apply_dataframe('pt_position', ct_df)
X_transformed = np.array(X_transformed)

ct_df['pt_xt'] = X_transformed[:, 0]
ct_df['pt_yt'] = X_transformed[:, 1]
ct_df['pt_zt'] = X_transformed[:, 2]

fig, ax = plt.subplots(figsize=(5, 4))
sns.scatterplot(
    x='pt_xt',
    y='pt_yt',
    s=3,
    data=ct_df.sample(10_000),
    ax=ax,
)
ax.invert_yaxis()
sns.despine(ax=ax)
```
Now you can see that the pial surface is much more closely aligned with the x-axis, and the cells in layer 1 start just below y = 0. In addition, the units have been converted to microns, which is more readable and more consistent with measurements in other modalities.
### How to use Standard Transform
A number of examples are available in the standard_transform readme.
#### Transforming points
The main functions when working with point data are `transform_vx` and `transform_nm`, which transform from voxel coordinates and from nanometer coordinates, respectively. In the example above, we used the voxel transform because we were working with annotations, which are stored in voxel coordinates.
Each of the two transforms has the same functions:

- `minnie_ds.{transform}.apply(x)`: converts an `Nx3` array of points from the original space to an `Nx3` array of transformed coordinates.
- `minnie_ds.{transform}.apply_project(projection_axis, x)`: converts an `Nx3` array of points from the original space to an `N`-length array of transformed coordinates, taking only the values along the given axis (`x`, `y`, or `z`). This is typically done along the `y` axis to get the depth of each point.
- `minnie_ds.{transform}.apply_dataframe(column_name, df, Optional[projection_axis])`: takes points from a dataframe column (or a collection of three dataframe columns named `{column_name}_x`, `{column_name}_y`, and `{column_name}_z`) and converts them to transformed coordinates. If `projection_axis` is given, only the values along that axis are returned.
Here `{transform}` is either `transform_vx` or `transform_nm`.
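Conceptually, the projection step of `apply_project` just selects one axis from the transformed points. A plain-numpy sketch of that step under stated assumptions (hypothetical points; the real method also applies the transform before selecting the axis):

```python
import numpy as np

AXES = {'x': 0, 'y': 1, 'z': 2}

def project(axis, pts):
    """Select one axis from an Nx3 array, mimicking the projection step
    of apply_project (the real function also transforms the points first)."""
    return np.asarray(pts)[:, AXES[axis]]

pts = np.array([[10.0, 150.0, 20.0],
                [12.0, 300.0, 25.0]])
depths = project('y', pts)  # typical use: depth along the y axis
print(depths)  # [150. 300.]
```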
You can also invert a transformation, for example if you want to convert transformed coordinates back to voxel coordinates to view in Neuroglancer.
- `minnie_ds.{transform}.invert(X)`: maps an `Nx3` array in the transformed space back to the original space (voxel or nanometer coordinates, depending on which transform you used).
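Because `invert` is the exact inverse of `apply`, a round trip returns the original coordinates. A toy numpy analogue with a simple scale-and-offset transform (the offset is an illustrative value, not the real MICrONs parameter):

```python
import numpy as np

scale = np.array([4.0, 4.0, 40.0])     # nm per voxel, as in MICrONs
offset = np.array([0.0, -400.0, 0.0])  # hypothetical translation in the transformed space

def apply(pts_vx):
    # voxel -> "transformed": scale to nm, convert to microns, then offset
    return pts_vx * scale / 1000.0 + offset

def invert(pts_t):
    # exact inverse: undo the offset, then undo the scaling
    return (pts_t - offset) * 1000.0 / scale

pts = np.array([[100_000.0, 100_000.0, 20_000.0]])
round_trip = invert(apply(pts))
print(np.allclose(round_trip, pts))  # True
```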
#### Transforming meshes and skeletons
Standard Transform can also transform MeshParty skeletons, MeshWork objects, and MeshWork annotations using these same transforms. For example, to transform all the vertices of a meshwork object:

```python
nrn_transformed = minnie_ds.transform_nm.apply_meshwork_vertices(nrn)
```
However, MeshWork objects also store annotations in dataframes, in addition to vertices. To transform these points, specify both the name of the annotation table and the columns to transform as a dictionary.
For example, if you also want to remap the `ctr_pt_position` values from the `pre_syn` and `post_syn` tables (which are in voxels by default), you would use:

```python
anno_dict = {'pre_syn': 'ctr_pt_position', 'post_syn': 'ctr_pt_position'}
nrn_transformed = minnie_ds.transform_vx.apply_meshwork_annotations(nrn_transformed, anno_dict, inplace=True)
```
Note that without the `inplace=True` parameter these meshwork transformations return a new object, while with `inplace=True` the original object is modified.