Scripting a Collector Data Export Workflow


Scripting a Collector Data Export Workflow Craig Mueller

Background The Abandoned Mine Lands Program, Department of Conservation, inventories mines for hazards and the historical record. Field data are acquired using Collector and imported into Access. The previous workflow was time- and user-input-intensive.

Procedure I created six scripts to guide the data and the user through the process of preparing Collector data for the database. The scripts allow for either a successful or an unsuccessful sync with ArcGIS Online. All scripts are designed to be run from a toolbox within ArcMap, using GetParameterAsText to let the user supply the necessary inputs.
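When an ArcMap toolbox parameter is set to accept multiple values, GetParameterAsText returns them as a single semicolon-delimited string. A minimal sketch of that handoff, using a made-up string in place of what the tool would receive (the paths are hypothetical examples):

```python
# Simulated multivalue parameter string, as a script tool would receive it
# from arcpy.GetParameterAsText(0); the paths below are invented examples.
gb_input = r"C:\data\site1.geodatabase;C:\data\site2.geodatabase"

# Splitting on ';' yields one path per selected geodatabase
geodatabases = gb_input.split(';')

for i, gdb in enumerate(geodatabases, start=1):
    print("{0}: {1}".format(i, gdb))
```

Inside ArcMap the first line would instead be `gb_input = arcpy.GetParameterAsText(0)`; everything downstream of the split is the same.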

Procedure The scripts are integrated into a larger workflow

Script 1: Geodatabase Converter

    import arcpy, os, shutil, getpass, time

    # Input variables
    gb_input = arcpy.GetParameterAsText(0)
    geodatabases = gb_input.split(';')
    geo_path = geodatabases[0]
    counter = 0
    gb_number = ["0th", "1st", "2nd", "3rd", "4th", "5th", "6th", "7th", "8th", "9th", "10th"]
    temp_folder = "C:\\Users\\" + getpass.getuser() + "\\Desktop\\temp"
    if not arcpy.Exists(temp_folder):
        os.makedirs(temp_folder)

    # If the folder structure changes, these need updating!
    def lvl_down(path):
        return os.path.split(path)[0]

    def last_lvl(path):
        return os.path.split(path)[1]

    # Run through the list of runtime geodatabases and convert them to FGDBs
    # in the backups folder with temporary names of 1, 2, 3, ...
    for geodatabase in geodatabases:
        counter += 1
        outgdb = "\\" + str(counter) + ".gdb"
        backups = lvl_down(geo_path)
        shutil.copy(geodatabase, temp_folder)
        temp_geodatabase = temp_folder + "\\" + last_lvl(geodatabase)
        # arcpy.AddMessage(backups + outgdb)
        while not os.path.isfile(temp_geodatabase):
            time.sleep(1)
        arcpy.CopyRuntimeGdbToFileGdb_conversion(temp_geodatabase, backups + outgdb)
        gb_count = gb_number[counter]
        arcpy.AddMessage("The {0} geodatabase has been converted".format(gb_count))

    # A schema lock on the temp folder does not clear until ArcMap closes,
    # so this cleanup stays commented out
    # shutil.rmtree(temp_folder, ignore_errors=True)

    if len(geodatabases) != 1:
        s = "s"
    else:
        s = ""
    arcpy.AddMessage("{0} geodatabase{1} have successfully been converted".format(counter, s))

The first script converts runtime geodatabases into file geodatabases (FGDBs). It needs to be run only if the ArcGIS Online sync fails, and it allows multiple files to be processed at once.
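The converter copies each runtime geodatabase to a temp folder and polls until the file exists before converting it. That copy-then-poll pattern can be exercised on its own; this sketch uses a scratch directory and an empty file standing in for a runtime geodatabase (the 10-second timeout is an added safeguard, not part of the original):

```python
import os
import shutil
import tempfile
import time

def copy_and_wait(src, dest_folder, timeout=10):
    """Copy src into dest_folder and poll until the file exists,
    as the converter script does before calling the conversion tool."""
    shutil.copy(src, dest_folder)
    dest = os.path.join(dest_folder, os.path.basename(src))
    waited = 0.0
    while not os.path.isfile(dest):
        time.sleep(0.1)
        waited += 0.1
        if waited > timeout:
            raise IOError("copy did not complete: " + dest)
    return dest

# Demonstration with a scratch file standing in for a runtime geodatabase
work = tempfile.mkdtemp()
src = os.path.join(work, "field.geodatabase")
open(src, "w").close()
target = tempfile.mkdtemp()
copied = copy_and_wait(src, target)
```

In plain Python `shutil.copy` blocks until the copy finishes, so the poll is belt-and-braces; in the original it guards a copy onto network storage that can lag.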

Script 2: GDB Merger

    import arcpy, os

    # Input variables
    gdbs_input = arcpy.GetParameterAsText(0)
    gdbs = gdbs_input.split(';')
    arcpy.env.overwriteOutput = True
    counter = 1
    destination = gdbs[0]

    # Disable editor tracking to preserve edit/create dates
    arcpy.DisableEditorTracking_management(destination, "DISABLE_CREATOR",
                                           "DISABLE_CREATION_DATE",
                                           "DISABLE_LAST_EDITOR",
                                           "DISABLE_LAST_EDIT_DATE")

    # If folders change, this block needs changing!
    def lvl_down(path):
        return os.path.split(path)[0]

    def last_lvl(path):
        return os.path.split(path)[1]

    # Path to the general inventory folder (three levels up from
    # /Backups/destination.gdb/Collector)
    inv_path = lvl_down(lvl_down(lvl_down(destination)))
    # GIS folder path, made by adding \GIS onto the end of the inventory folder
    gis_folder = inv_path + "\\GIS"
    # Destination GDB name from the general inventory folder,
    # with periods replaced by underscores
    gdb_name = last_lvl(inv_path).replace(".", "_")

    # The following helpers were written by Ben Nadler:
    # https://github.com/bnadler/Append-Features-With-Attachments/blob/master/AppendFeaturesWithAttachments.py

    def buildWhereClauseFromList(OriginTable, PrimaryKeyField, valueList):
        """Takes a list of values and constructs a SQL WHERE clause to select
        those values within a given PrimaryKeyField and OriginTable."""
        # Add DBMS-specific field delimiters
        fieldDelimited = arcpy.AddFieldDelimiters(arcpy.Describe(OriginTable).path, PrimaryKeyField)
        # Determine field type
        fieldType = arcpy.ListFields(OriginTable, PrimaryKeyField)[0].type
        # Add single-quotes for string field values
        if str(fieldType) == 'String' or str(fieldType) == 'Guid':
            valueList = ["'%s'" % value for value in valueList]
        # Format WHERE clause in the form of an IN statement
        whereClause = "%s IN(%s)" % (fieldDelimited, ', '.join(map(str, valueList)))
        return whereClause

    def fieldNameList(fc):
        """Convert FieldList object to a list of field names"""
        fieldNames = [fld.name for fld in arcpy.ListFields(fc)]
        return fieldNames

    def validate_shape_field(origin, target):
        # Ensure proper formatting for the Shape field name
        if 'Shape' in origin and 'Shape' in target:
            pass
        elif 'SHAPE' in origin and 'SHAPE' in target:
            pass
        elif 'SHAPE' in origin and 'Shape' in target:
            origin[origin.index('SHAPE')] = 'Shape'
        elif 'Shape' in origin and 'SHAPE' in target:
            origin[origin.index('Shape')] = 'SHAPE'

    def appendFeatures(features, targetFc):
        """Writes each update feature to the target feature class, then appends
        attachments to the target attachment table with the GUID from the
        newly added update feature"""
        afieldNames = fieldNameList(features)   # fields from the feature class to append
        tfieldNames = fieldNameList(targetFc)   # fields from the target feature class
        desc = arcpy.Describe(targetFc)
        # Find the GUID field index for later use
        oldGuidField = afieldNames.index('GlobalID') if 'GlobalID' in afieldNames else None
        validate_shape_field(afieldNames, tfieldNames)
        tfields = arcpy.ListFields(targetFc)
        tempField = None
        # Find a field we can use temporarily to hold a unique ID
        for f in tfields:
            if f.type == 'String' and f.length > 5 and f.domain == '':
                tempField = f.name
                break
        guids = {}
        editor = arcpy.da.Editor(desc.path)
        editor.startEditing(False, False)
        with arcpy.da.SearchCursor(features, '*') as fscur:
            for arow in fscur:
                # Insert a new row and write a temp ID to the field;
                # the new row will have a new GUID assigned to it
                with arcpy.da.InsertCursor(targetFc, tempField) as icur:
                    newRow = icur.insertRow(["TEMP"])
                # Format an expression to query the new row
                fieldDelimited = arcpy.AddFieldDelimiters(desc.path, "OBJECTID")
                expression = "{} = {}".format(fieldDelimited, newRow)
                # Query the new row and get its new GUID
                with arcpy.da.SearchCursor(targetFc, 'GlobalID', expression) as scur:
                    for srow in scur:
                        guids[arow[oldGuidField]] = scur[0]
                        print "Old GUID = {} New GUID = {}".format(arow[oldGuidField], scur[0])
                # Update the empty row with all the information from the update feature
                with arcpy.da.UpdateCursor(targetFc, '*', expression) as ucur:
                    urow = ucur.next()
                    for f in tfields:
                        fname = f.name
                        if fname != 'OBJECTID' and fname != 'GlobalID' and fname in afieldNames:
                            urow[tfieldNames.index(fname)] = arow[afieldNames.index(fname)]
                    ucur.updateRow(urow)
        editor.stopEditing(True)
        appendFeatures = appendAttachments(features, targetFc, guids)

    def appendAttachments(fc, tfc, guidDict):
        fc = fc + '__ATTACH'
        tfc = tfc + '__ATTACH'
        desc = arcpy.Describe(fc)
        arcpy.Append_management(fc, tfc, "NO_TEST")
        # Make a query for the attachment table, looking for newly written GUIDs
        expression = buildWhereClauseFromList(tfc, "REL_GLOBALID", guidDict.keys())
        tfcLayer = "in_memory\\tfcLayer"
        arcpy.MakeTableView_management(tfc, tfcLayer, expression)
        # Update attachments with the new GUID for target features
        editor = arcpy.da.Editor(desc.path)
        with arcpy.da.UpdateCursor(tfcLayer, 'REL_GLOBALID') as ucur:
            for urow in ucur:
                urow[0] = guidDict[urow[0]]
        return None

    # Merge the individual file geodatabases
    gdbs.remove(destination)
    for gdb in gdbs:
        updateFeatureClass = gdb
        targetFeatureClass = destination
        # Process: disable editor tracking on the update GDB
        arcpy.DisableEditorTracking_management(updateFeatureClass, "DISABLE_CREATOR",
                                               "DISABLE_CREATION_DATE",
                                               "DISABLE_LAST_EDITOR",
                                               "DISABLE_LAST_EDIT_DATE")
        appendFeatures(updateFeatureClass, targetFeatureClass)
        counter += 1
        arcpy.AddMessage("{0} geodatabases merged".format(counter))

    arcpy.EnableEditorTracking_management(destination, "created_user", "created_date",
                                          "last_edited_user", "last_edited_date",
                                          "NO_ADD_FIELDS", "UTC")

    # Export the merged GDB to XML and reimport it into a new GDB
    # (this optimizes the GDB, which runs very slowly after the original merge)
    arcpy.CreateFileGDB_management(gis_folder, gdb_name)
    arcpy.AddMessage("Empty geodatabase created")
    xml = gis_folder + "\\export.xml"
    arcpy.ExportXMLWorkspaceDocument_management(destination, xml, "DATA")
    arcpy.AddMessage("XML file exported")
    arcpy.ImportXMLWorkspaceDocument_management(gis_folder + "\\" + gdb_name + ".gdb", xml, "DATA")
    arcpy.AddMessage("Geodatabase import complete")

    # Delete intermediates
    arcpy.Delete_management(xml)
    arcpy.Delete_management(destination)
    arcpy.Delete_management(lvl_down(destination))
    arcpy.Delete_management(gdb)
    arcpy.Delete_management(lvl_down(gdb))
    arcpy.AddMessage("\n\n")

The second script merges the new GDBs into a single one, exports the data to an XML workspace document and reimports it (which compacts the slow, freshly merged GDB), and names the new GDB after the folder name.
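The clause-building logic in buildWhereClauseFromList is independent of arcpy and can be sketched in plain Python. Here the delimiter and field-type lookups that arcpy performs are replaced by an explicit `is_text` flag, so this illustrates the shape of the generated clause rather than the arcpy-backed original:

```python
def build_in_clause(field, values, is_text=False):
    """Build a SQL "field IN(...)" clause, single-quoting values for
    text/GUID fields as the original helper does."""
    if is_text:
        # Add single-quotes for string field values
        values = ["'%s'" % v for v in values]
    return "%s IN(%s)" % (field, ', '.join(map(str, values)))

print(build_in_clause("OBJECTID", [1, 2, 3]))
print(build_in_clause("REL_GLOBALID", ["a", "b"], is_text=True))
```

The same IN-clause is what the merger feeds to MakeTableView to pick out freshly appended attachment rows.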

Script 3: Site Accounting

    import arcpy
    arcpy.env.overwriteOutput = True
    mxd = arcpy.mapping.MapDocument("CURRENT")
    df = arcpy.mapping.ListDataFrames(mxd, "Layers")[0]

    # Variables
    Coll = arcpy.GetParameterAsText(0)
    Beg_date = arcpy.GetParameterAsText(1)  # date parameter
    End_date = arcpy.GetParameterAsText(2)  # date parameter
    EF_in = "G:\\OMR\\AMLU\\GIS\\Basefile\\MineSpecific\\ExtraFeatures.shp"
    TOMS_in = "G:\\OMR\\AMLU\\GIS\\Basefile\\MineSpecific\\TOMS.shp"
    Coll_temp = r"in_memory\Coll"
    EF_temp = r"in_memory\EF"
    TOMS_temp = r"in_memory\TOMS"
    symb = "G:\\OMR\\AMLU\\GIS\\Tools\\Scripts\\Layers\\Date.lyr"
    fields = "Date"

    # Zero-pad single-digit months, then strip any time component
    Beg_split = Beg_date.split("/")
    if len(Beg_split[0]) < 2:
        Beg_date = "0" + Beg_date
    End_split = End_date.split("/")
    if len(End_split[0]) < 2:
        End_date = "0" + End_date
    Beg_split = Beg_date.split(" ")
    Beg_date = Beg_split[0]
    End_split = End_date.split(" ")
    End_date = End_split[0]

    query = """Date >= '""" + Beg_date + """' AND Date <= '""" + End_date + """'"""
    arcpy.AddMessage("\n")

    # Collector layer creation
    arcpy.FeatureClassToFeatureClass_conversion(Coll, "in_memory", "Coll")
    arcpy.ConvertTimeZone_management(Coll_temp, "created_date", "UTC",
                                     "Temp_Date", "Pacific_Standard_Time",
                                     "INPUT_NOT_ADJUSTED_FOR_DST",
                                     "OUTPUT_ADJUSTED_FOR_DST")
    arcpy.ConvertTimeField_management(in_table=Coll_temp, input_time_field="Temp_Date",
                                      input_time_format="'Not Used'",
                                      output_time_field="Date",
                                      output_time_type="TEXT",
                                      output_time_format="MM/dd/yyyy;1033;;")
    Layer = "Collector"
    arcpy.MakeFeatureLayer_management(Coll_temp, Layer, query, "", fields)
    newlayer = arcpy.mapping.Layer(Layer)
    arcpy.ApplySymbologyFromLayer_management(newlayer, symb)
    arcpy.mapping.AddLayer(df, newlayer, "TOP")
    arcpy.AddMessage("Collector layer created")

    # TOMS layer creation, driven by edited_date
    # (conversion arguments reconstructed to match the Collector block;
    # the slide truncates these calls)
    arcpy.FeatureClassToFeatureClass_conversion(TOMS_in, "in_memory", "TOMS")
    arcpy.ConvertTimeZone_management(TOMS_temp, "edited_date", "UTC",
                                     "Temp_Date", "Pacific_Standard_Time",
                                     "INPUT_NOT_ADJUSTED_FOR_DST",
                                     "OUTPUT_ADJUSTED_FOR_DST")
    arcpy.ConvertTimeField_management(in_table=TOMS_temp, input_time_field="Temp_Date",
                                      input_time_format="'Not Used'",
                                      output_time_field="Date",
                                      output_time_type="TEXT",
                                      output_time_format="MM/dd/yyyy;1033;;")
    Layer = "TOMS"
    arcpy.MakeFeatureLayer_management(TOMS_temp, Layer, query, "", fields)
    arcpy.AddMessage("TOMS layer created")

    # Extra Features layer creation (same pattern)
    arcpy.FeatureClassToFeatureClass_conversion(EF_in, "in_memory", "EF")
    arcpy.ConvertTimeZone_management(EF_temp, "edited_date", "UTC",
                                     "Temp_Date", "Pacific_Standard_Time",
                                     "INPUT_NOT_ADJUSTED_FOR_DST",
                                     "OUTPUT_ADJUSTED_FOR_DST")
    arcpy.ConvertTimeField_management(in_table=EF_temp, input_time_field="Temp_Date",
                                      input_time_format="'Not Used'",
                                      output_time_field="Date",
                                      output_time_type="TEXT",
                                      output_time_format="MM/dd/yyyy;1033;;")
    Layer = "Extra Features"
    arcpy.MakeFeatureLayer_management(EF_temp, Layer, query, "", fields)
    arcpy.AddMessage("Extra features layer created")

    arcpy.RefreshActiveView()
    arcpy.RefreshTOC()

    # Get counts of edits by day
    arcpy.AddMessage("\n Collector counts \n -------------------------------------------")
    all_values = [r[0] for r in arcpy.da.SearchCursor(Coll_temp, "Date", query)]
    unique_values = sorted(set(all_values))
    count_dict = {}
    for value in unique_values:
        count_dict[value] = all_values.count(value)
        if count_dict[value] != 1:
            s = "s"
        else:
            s = ""
        arcpy.AddMessage(" {0} feature{2} inventoried on {1}".format(count_dict[value], value, s))

    arcpy.AddMessage("\n TOMS counts \n -------------------------------------------")
    all_values = [r[0] for r in arcpy.da.SearchCursor(TOMS_temp, "Date", query)]
    unique_values = sorted(set(all_values))
    count_dict = {}
    for value in unique_values:
        count_dict[value] = all_values.count(value)
        arcpy.AddMessage(" {0} TOMS updated on {1}".format(count_dict[value], value))
    arcpy.AddMessage("\n\n")

The third script creates layers with definition queries by date, gets the count of several layers by date and prints the results in the script window, and takes the date range as user input.
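The date handling above (zero-padding single-digit months, stripping the time component, then tallying features per day) can be checked without arcpy; collections.Counter reproduces the count_dict loop. The sample dates here are invented:

```python
from collections import Counter

def pad_date(d):
    """Prefix a leading zero when the month is a single digit, and
    drop any trailing time component, as the accounting script does."""
    if len(d.split("/")[0]) < 2:
        d = "0" + d
    return d.split(" ")[0]

# Hypothetical Date values as they might come off a SearchCursor
dates = ["3/18/2007 10:00", "3/18/2007 11:30", "12/01/2007"]
padded = [pad_date(d) for d in dates]
counts = Counter(padded)
for day in sorted(counts):
    print("{0} feature(s) inventoried on {1}".format(counts[day], day))
```

The zero-padding matters because the script compares dates as strings in its definition query, and "3/18/2007" would sort incorrectly against "12/01/2007".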

Script 4 QA import os, arcpy arcpy.env.overwriteOutput = True from random import randint c_layer = "c_layer" boundary_layer = "boundary_layer" invent_data = "G:\OMR\AMLU\GIS\Basefiles\Mine Specific\Invent_features.gdb\invent_features" boundary_data = "G:\OMR\AMLU\GIS\Basefiles\Mine Specific\Mine_Boundaries_California.shp" # variables dataFrame = arcpy.mapping.ListDataFrames(mxd, "Layers")[0] mxd = arcpy.mapping.MapDocument("CURRENT") c_data = arcpy.GetParameterAsText(0) #Need Feature Class type, named Collector features total_errors = 0 invent_layer = "invent_layer" #create empty layers for various QA checks boundary_symb = "G:\OMR\AMLU\GIS\Tools\Scripts\Layers\Boundaries.lyr" access_symb = "G:\OMR\AMLU\GIS\Tools\Scripts\Layers\AccessSymb.lyr" # symbology layers for the QA layer outputs symb = "G:\OMR\AMLU\GIS\Tools\Scripts\Layers\Symbology.lyr" arcpy.AddMessage("\nError report:") arcpy.MakeFeatureLayer_management(boundary_data, boundary_layer) arcpy.MakeFeatureLayer_management(invent_data, invent_layer) arcpy.MakeFeatureLayer_management(c_data, c_layer) test_lyr = "Access visual QA" # # # # Create access visual QA layer # # # # # the following are a series of QA checks run on Collector data to call out common diagnosable problems feat_lyr = arcpy.mapping.Layer(test_lyr) arcpy.MakeFeatureLayer_management(c_data, test_lyr, query, "", fields) query = """Feature_Ty <> 'Site'""" fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty HIDDEN NONE;X_Dimensio X_Dimensio HIDDEN NONE;Y_Dimensio Y_Dimensio HIDDEN NONE;Z_Dimensio Z_Dimensio HIDDEN NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ HIDDEN NONE;Condition2 Condition2 HIDDEN NONE;Condition3 Condition3 HIDDEN NONE;Condition4 Condition4 HIDDEN NONE;Access_Rat Access_Rat VISIBLE NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc HIDDEN 
NONE;GPS_Person GPS_Person HIDDEN NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" lyr.visible = False lyr = arcpy.mapping.ListLayers(mxd)[0] #----------------------------------------------------------------- arcpy.mapping.AddLayer(dataFrame, feat_lyr, "TOP") arcpy.ApplySymbologyFromLayer_management(feat_lyr, access_symb) query = """Hazard_Rat IS NULL and Feature_Ty <> 'Site'""" fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio VISIBLE NONE;Y_Dimensio Y_Dimensio VISIBLE NONE;Z_Dimensio Z_Dimensio VISIBLE NONE;Greater_Th Greater_Th VISIBLE NONE;Condition_ Condition_ VISIBLE NONE;Condition2 Condition2 VISIBLE NONE;Condition3 Condition3 VISIBLE NONE;Condition4 Condition4 VISIBLE NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat VISIBLE NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc VISIBLE NONE;GPS_Person GPS_Person HIDDEN NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" # Test variables # # # # Check for null hazards # # # # arcpy.SelectLayerByAttribute_management(c_layer, "NEW_SELECTION", query) error_count = int(arcpy.GetCount_management(c_layer).getOutput(0)) # Test script message = " {0} missing hazard rating{1}" test_lyr = "Missing Hazard" else: s = "" if error_count <> 1: s = "s" total_errors += error_count arcpy.ApplySymbologyFromLayer_management(feat_lyr, symb) if error_count > 0: arcpy.AddMessage(message.format(error_count, s)) # # # # Check for null access # # # # message = " {0} missing 
access rating{1}" test_lyr = "Missing Access" fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio HIDDEN NONE;Y_Dimensio Y_Dimensio HIDDEN NONE;Z_Dimensio Z_Dimensio HIDDEN NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ HIDDEN NONE;Condition2 Condition2 HIDDEN NONE;Condition3 Condition3 HIDDEN NONE;Condition4 Condition4 HIDDEN NONE;Access_Rat Access_Rat VISIBLE NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc HIDDEN NONE;GPS_Person GPS_Person HIDDEN NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" query = """Access_Rat IS NULL and Feature_Ty <> 'Site'""" # # # # Check for null GPS person / note taker # # # # test_lyr = "Missing GPS and Note Taker" message = " {0} missing GPS Person or Note Taker value{1}" fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty HIDDEN NONE;X_Dimensio X_Dimensio HIDDEN NONE;Y_Dimensio Y_Dimensio HIDDEN NONE;Z_Dimensio Z_Dimensio HIDDEN NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ HIDDEN NONE;Condition2 Condition2 HIDDEN NONE;Condition3 Condition3 HIDDEN NONE;Condition4 Condition4 HIDDEN NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc HIDDEN NONE;GPS_Person GPS_Person VISIBLE NONE;Note_Taker Note_Taker VISIBLE NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date 
HIDDEN NONE" query = """GPS_Person IS NULL or Note_Taker IS NULL""" # # # # Check for openings without aspect # # # # test_lyr = "Missing opening aspect" message = " {0} missing aspect value{1} on an opening" fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio HIDDEN NONE;Y_Dimensio Y_Dimensio HIDDEN NONE;Z_Dimensio Z_Dimensio HIDDEN NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ VISIBLE NONE;Condition2 Condition2 VISIBLE NONE;Condition3 Condition3 VISIBLE NONE;Condition4 Condition4 VISIBLE NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect VISIBLE NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc VISIBLE NONE;GPS_Person GPS_Person VISIBLE NONE;Note_Taker Note_Taker VISIBLE NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" query = """Feature_Ty = 'Vertical Opening' AND Aspect IS NULL OR Feature_Ty = 'Horizontal Opening' AND Aspect IS NULL""" # # # # Check for openings without bat rank # # # # message = " {0} missing bat rank{1} on an opening" test_lyr = "Missing bat rank" fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio VISIBLE NONE;Y_Dimensio Y_Dimensio VISIBLE NONE;Z_Dimensio Z_Dimensio VISIBLE NONE;Greater_Th Greater_Th VISIBLE NONE;Condition_ Condition_ VISIBLE NONE;Condition2 Condition2 VISIBLE NONE;Condition3 Condition3 VISIBLE NONE;Condition4 Condition4 VISIBLE NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat VISIBLE NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank VISIBLE NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc VISIBLE NONE;GPS_Person GPS_Person HIDDEN 
NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" query = """Feature_Ty = 'Vertical Opening' AND Bat_Rank IS NULL OR Feature_Ty = 'Horizontal Opening' AND Bat_Rank IS NULL""" message = " {0} waste pile{1} missing a color" test_lyr = "Missing waste color" query = """Feature_Ty = 'Mine Waste' AND Color IS NULL""" # # # # Check for waste piles with no color # # # # fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio HIDDEN NONE;Y_Dimensio Y_Dimensio HIDDEN NONE;Z_Dimensio Z_Dimensio HIDDEN NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ HIDDEN NONE;Condition2 Condition2 HIDDEN NONE;Condition3 Condition3 HIDDEN NONE;Condition4 Condition4 HIDDEN NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color VISIBLE NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc VISIBLE NONE;GPS_Person GPS_Person HIDDEN NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" query = """Y_Dimensio < X_Dimensio AND Feature_Ty <> 'Vertical Opening' AND Feature_Ty <> 'Horizontal Opening'""" # # # # Check for reversed X Y dimensions # # # # fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio VISIBLE NONE;Y_Dimensio Y_Dimensio VISIBLE NONE;Z_Dimensio Z_Dimensio VISIBLE NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ HIDDEN NONE;Condition2 Condition2 HIDDEN NONE;Condition3 Condition3 HIDDEN NONE;Condition4 Condition4 HIDDEN 
NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc HIDDEN NONE;GPS_Person GPS_Person HIDDEN NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE" message = " {0} reversed X Y dimension{1}" test_lyr = "Reversed X Y dimensions" if float(error_count) > 0: # # # # Check for features with empty dimensions # # # # test_lyr = "Empty dimensions" message = " {0} feature{1} with empty dimensions" query = """X_Dimensio = 0 OR Y_Dimensio = 0 OR Z_Dimensio = 0 AND Feature_Ty <> 'Foundation' OR X_Dimensio IS NULL AND Feature_Ty <> 'Site'OR Y_Dimensio IS NULL AND Feature_Ty <> 'Site'OR Z_Dimensio IS NULL AND Feature_Ty <> 'Site'""" if error_count <> 0: # # # # Check for sites without site points # # # # test_lyr = "Sites missing site points" message = " {0} site{1} with missing site points" query = """Feature_Ty = 'Site'""" arcpy.SelectLayerByLocation_management(boundary_layer, "intersect", c_layer) arcpy.SelectLayerByAttribute_management(c_layer, "CLEAR_SELECTION") arcpy.SelectLayerByLocation_management(boundary_layer, "intersect", invent_layer, "", "REMOVE_FROM_SELECTION") arcpy.SelectLayerByLocation_management(boundary_layer, "intersect", c_layer, "", "REMOVE_FROM_SELECTION") error_count = int(arcpy.GetCount_management(boundary_layer).getOutput(0)) arcpy.ApplySymbologyFromLayer_management(feat_lyr, boundary_symb) arcpy.MakeFeatureLayer_management(boundary_layer, test_lyr) message = " {0} feature{1} outside of boundaries" # # # # Check for features outside of boundaries # # # # arcpy.SelectLayerByAttribute_management(boundary_layer, "CLEAR_SELECTION") test_lyr = "Features outside of boundaries" arcpy.SelectLayerByAttribute_management(c_layer, 
"SUBSET_SELECTION", query)

# # # # Check for site points outside of boundaries # # # #
test_lyr = "Site points outside of boundaries"
message = " {0} site point{1} outside of boundaries"
arcpy.SelectLayerByLocation_management(c_layer, "INTERSECT", boundary_layer, "", "", "INVERT")
arcpy.MakeFeatureLayer_management(c_layer, test_lyr)

# # # # Check for features without attachments # # # #
test_lyr = "Features missing attachments"
message = " {0} feature{1} missing attachments"
fields = "Collector2015v1.OBJECTID Collector2015v1.OBJECTID HIDDEN NONE;Collector2015v1.SHAPE Collector2015v1.SHAPE HIDDEN NONE;Collector2015v1.Feat_index Collector2015v1.Feat_index HIDDEN NONE;Collector2015v1.Feature_Ty Collector2015v1.Feature_Ty VISIBLE NONE;Collector2015v1.X_Dimensio Collector2015v1.X_Dimensio HIDDEN NONE;Collector2015v1.Y_Dimensio Collector2015v1.Y_Dimensio HIDDEN NONE;Collector2015v1.Z_Dimensio Collector2015v1.Z_Dimensio HIDDEN NONE;Collector2015v1.Greater_Th Collector2015v1.Greater_Th HIDDEN NONE;Collector2015v1.Condition_ Collector2015v1.Condition_ HIDDEN NONE;Collector2015v1.Condition2 Collector2015v1.Condition2 HIDDEN NONE;Collector2015v1.Condition3 Collector2015v1.Condition3 HIDDEN NONE;Collector2015v1.Condition4 Collector2015v1.Condition4 HIDDEN NONE;Collector2015v1.Access_Rat Collector2015v1.Access_Rat HIDDEN NONE;Collector2015v1.Hazard_Rat Collector2015v1.Hazard_Rat HIDDEN NONE;Collector2015v1.Aspect Collector2015v1.Aspect HIDDEN NONE;Collector2015v1.Bat_Rank Collector2015v1.Bat_Rank HIDDEN NONE;Collector2015v1.Color Collector2015v1.Color HIDDEN NONE;Collector2015v1.Odor Collector2015v1.Odor HIDDEN NONE;Collector2015v1.Feat_Desc Collector2015v1.Feat_Desc HIDDEN NONE;Collector2015v1.GPS_Person Collector2015v1.GPS_Person HIDDEN NONE;Collector2015v1.Note_Taker Collector2015v1.Note_Taker HIDDEN NONE;Collector2015v1.GlobalID Collector2015v1.GlobalID HIDDEN NONE;Collector2015v1.created_user Collector2015v1.created_user VISIBLE NONE;Collector2015v1.last_edited_user Collector2015v1.last_edited_user HIDDEN NONE;Collector2015v1.last_edited_date Collector2015v1.last_edited_date HIDDEN NONE;Collector2015v1.created_date Collector2015v1.created_date HIDDEN NONE;Collector2015v1__ATTACH.ATTACHMENTID Collector2015v1__ATTACH.ATTACHMENTID HIDDEN NONE;Collector2015v1__ATTACH.GLOBALID Collector2015v1__ATTACH.GLOBALID HIDDEN NONE;Collector2015v1__ATTACH.REL_GLOBALID Collector2015v1__ATTACH.REL_GLOBALID HIDDEN NONE;Collector2015v1__ATTACH.CONTENT_TYPE Collector2015v1__ATTACH.CONTENT_TYPE HIDDEN NONE;Collector2015v1__ATTACH.ATT_NAME Collector2015v1__ATTACH.ATT_NAME HIDDEN NONE;Collector2015v1__ATTACH.DATA_SIZE Collector2015v1__ATTACH.DATA_SIZE HIDDEN NONE"
query = """Collector2015v1__ATTACH.ATT_NAME IS NULL AND Feature_Ty <> 'Site'"""
attach_path = c_data + "__ATTACH"
arcpy.AddJoin_management(in_layer_or_view=c_layer, in_field="GlobalID", join_table=attach_path, join_field="REL_GLOBALID", join_type="KEEP_ALL")
arcpy.MakeFeatureLayer_management(c_layer, test_lyr, "", "", fields)

# # # # Check for features too close together # # # #
test_lyr = "Too close features"
message = " {0} feature{1} too close to another feature"
fields = "OBJECTID OBJECTID HIDDEN NONE;SHAPE SHAPE HIDDEN NONE;Feat_index Feat_index HIDDEN NONE;Feature_Ty Feature_Ty VISIBLE NONE;X_Dimensio X_Dimensio VISIBLE NONE;Y_Dimensio Y_Dimensio VISIBLE NONE;Z_Dimensio Z_Dimensio VISIBLE NONE;Greater_Th Greater_Th HIDDEN NONE;Condition_ Condition_ HIDDEN NONE;Condition2 Condition2 HIDDEN NONE;Condition3 Condition3 HIDDEN NONE;Condition4 Condition4 HIDDEN NONE;Access_Rat Access_Rat HIDDEN NONE;Hazard_Rat Hazard_Rat HIDDEN NONE;Aspect Aspect HIDDEN NONE;Bat_Rank Bat_Rank HIDDEN NONE;Color Color HIDDEN NONE;Odor Odor HIDDEN NONE;Feat_Desc Feat_Desc HIDDEN NONE;GPS_Person GPS_Person HIDDEN NONE;Note_Taker Note_Taker HIDDEN NONE;GlobalID GlobalID HIDDEN NONE;created_user created_user VISIBLE NONE;last_edited_user last_edited_user HIDDEN NONE;last_edited_date last_edited_date HIDDEN NONE;created_date created_date HIDDEN NONE"
query = """neartable.NEAR_DIST IS NOT NULL"""
arcpy.GenerateNearTable_analysis(c_layer, c_layer, "neartable", "1.5 Meters", "", "", "CLOSEST")
near_count = int(arcpy.GetCount_management("neartable").getOutput(0))
if near_count > 0:
    arcpy.AddJoin_management(c_layer, "OBJECTID", "neartable", "IN_FID")
    arcpy.RemoveJoin_management(c_layer, "neartable")
else:
    arcpy.AddMessage(message.format("0", "s"))
arcpy.RefreshActiveView()

# Report the total and clean up
arcpy.AddMessage(" ------------------------------------------\n {0} total errors".format(total_errors))
if total_errors > 0:
    list = ["Good luck!", "May the odds be ever in your favor!", "Mazal tov!", "^_^", "Break a leg!", "....Yikes", "You've got this!", "*Fist bump*", "Viel Gluck!", "Knock'em dead!"]
    random = randint(0, (len(list) - 1))
    arcpy.AddMessage("\n{0} \n \n".format(list[random]))
else:
    arcpy.AddMessage("\nNo errors, good work! \n \n")
arcpy.Delete_management(boundary_layer)
arcpy.Delete_management(invent_layer)
arcpy.Delete_management(c_layer)
arcpy.Delete_management("neartable")
arcpy.Delete_management("in_memory")

Creates layers of common errors conditionally, based on whether those errors are present in the data
Gets the count of each error type and reports it in the script window
Limits the visibility of fields to ease readability
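The per-check reporting pattern above (count the selected features, then pluralize the message) can be exercised without arcpy; the `plural_suffix` helper is illustrative, but the message template is the one used by the checks:

```python
def plural_suffix(count):
    # Each message template embeds an optional "s", e.g.
    # " {0} site point{1} outside of boundaries"
    return "" if count == 1 else "s"

message = " {0} site point{1} outside of boundaries"

print(message.format(1, plural_suffix(1)))  #  1 site point outside of boundaries
print(message.format(3, plural_suffix(3)))  #  3 site points outside of boundaries
```

In the tool itself the count would come from `int(arcpy.GetCount_management(test_lyr).getOutput(0))` before the message is passed to `arcpy.AddMessage`.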

Script 5 PExport Generator

# Import system modules
import sys, os, arcpy

def lvl_down(path):
    return os.path.split(path)[0]

try:
    arcpy.CheckOutExtension("Spatial")
    inPointFeatures = arcpy.GetParameterAsText(0)  # Input shapefile/feature class
    Collector_orig = inPointFeatures
    desc = arcpy.Describe(inPointFeatures)
    gdb_path = desc.path
    arcpy.env.workspace = gdb_path

    # Check for null input Date and GPS_Person
    # (NoFeatures is a custom exception class defined earlier, not shown on the slide)
    scursor = arcpy.SearchCursor(inPointFeatures)
    srow = scursor.next()
    while srow:
        if srow.GPS_Person == None:
            arcpy.AddError("GPS Person is missing in input table")
            raise NoFeatures
        elif srow.Feature_Ty == None:
            arcpy.AddError("Feature Type is missing in input table")
            raise NoFeatures
        srow = scursor.next()
        if srow == None:
            arcpy.AddMessage("Data exist")
            break
    del srow
    del scursor

    # Create permanent ID and GUID fields
    arcpy.AddMessage("Creating PExport")
    arcpy.AddField_management(inPointFeatures, "ID", "LONG")
    fieldNameID = "ID"
    expressionID = "getID(!OBJECTID!)"
    codeblockID = """def getID(OBJECTID):
        return int(OBJECTID)"""
    arcpy.CalculateField_management(inPointFeatures, fieldNameID, expressionID, "PYTHON_9.3", codeblockID)
    arcpy.AddField_management(inPointFeatures, "GUID", "TEXT")
    arcpy.CalculateField_management(inPointFeatures, "GUID", '!GlobalID!', "PYTHON_9.3")

    # Convert created_date UTC to GPS_DATE in PST (added by Craig)
    arcpy.ConvertTimeZone_management(inPointFeatures, "created_date", "UTC", "GPS_DATE", "Pacific_Standard_Time", "INPUT_NOT_ADJUSTED_FOR_DST", "OUTPUT_ADJUSTED_FOR_DST")
    arcpy.ConvertTimeField_management(in_table=inPointFeatures, input_time_field="GPS_DATE", input_time_format="'Not Used'", output_time_field="ActualDate", output_time_type="TEXT", output_time_format="MM/dd/yyyy;1033;;")

    # Execute ExtractValuesToPoints to get elevation data, then convert it to feet in GNSS_Heigh
    inRaster = "G:/OMR/AMLU/GIS/Basefiles/Collector/ca_dem10m.tif"
    inFeatures = arcpy.env.workspace + "/PExportTemp"
    arcpy.sa.ExtractValuesToPoints(inPointFeatures, inRaster, inFeatures, "INTERPOLATE", "VALUE_ONLY")
    arcpy.AlterField_management(inFeatures, 'RASTERVALU', 'GNSS_Heigh', 'GNSS_Heigh')
    fieldNameFeet = "GNSS_Heigh"
    expressionFeet = "getFeet(!GNSS_Heigh!)"
    codeblockFeet = """def getFeet(GNSS_Heigh):
        return float(GNSS_Heigh) * 3.28084"""
    arcpy.CalculateField_management(inFeatures, fieldNameFeet, expressionFeet, "PYTHON_9.3", codeblockFeet)

    # Check for features in input
    result1 = arcpy.GetCount_management(inFeatures)
    count1 = int(result1.getOutput(0))
    if count1 > 0:
        # Spatial joins to create the PExport intermediates
        county = "G:/OMR/AMLU/GIS/Basefiles/Collector/base.gdb/Counties83"
        plss = "G:/OMR/AMLU/GIS/Basefiles/Collector/base.gdb/PLS_Fill83"
        join_features = "G:/OMR/AMLU/GIS/Basefiles/Collector/base.gdb/t24kquad_trim83"
        out_feature_class = "G:/OMR/AMLU/GIS/Basefiles/Collector/base.gdb/out"
        out_county_class = "G:/OMR/AMLU/GIS/Basefiles/Collector/base.gdb/outcounty"
        arcpy.Delete_management(out_feature_class)
        arcpy.Delete_management(out_county_class)
        outFeatures = arcpy.env.workspace + "/PExport84"
        outFeatures83 = arcpy.env.workspace + "/PExport"
        arcpy.SpatialJoin_analysis(inFeatures, join_features, out_feature_class)
        arcpy.SpatialJoin_analysis(out_feature_class, plss, out_county_class)
        arcpy.SpatialJoin_analysis(out_county_class, county, outFeatures)

        # Set local variables
        # GIS_ID format: L MM DD HH L YYYY LLL 1
        #   L    - first initial of GPS_Person
        #   MM   - month in 2 digits
        #   DD   - day in 2 digits
        #   HH   - hour in 2 digits (24-hour clock)
        #   L    - ascending letter from A to Z
        #   YYYY - year in 4 digits
        #   LLL  - GPS_Person (first 3 letters)
        #   1    - Object ID
        fieldName1 = "GIS_ID"
        fieldLength1 = 22
        fieldName2 = "DDLAT83"
        fieldPrecision2 = 9
        fieldName3 = "DDLON83"
        fieldPrecision3 = 9
        fieldName4 = "TEALE_X83"
        fieldPrecision4 = 9
        fieldName5 = "TEALE_Y83"
        fieldPrecision5 = 9
        fieldName6 = "TheHour"
        fieldLength6 = 2
        fieldName7 = "TheLetter"
        fieldLength7 = 1

        # Add GIS_ID, DDLAT83, DDLON83, TEALE_X83, TEALE_Y83 fields
        arcpy.AddField_management(outFeatures, fieldName1, "TEXT", "", "", fieldLength1)
        arcpy.AddField_management(outFeatures, fieldName2, "DOUBLE", "", "", "", "", "NULLABLE", "")
        arcpy.AddField_management(outFeatures, fieldName3, "DOUBLE", "", "", "", "", "NULLABLE", "")
        arcpy.AddField_management(outFeatures, fieldName4, "DOUBLE", "", "", "", "", "NULLABLE", "")
        arcpy.AddField_management(outFeatures, fieldName5, "DOUBLE", "", "", "", "", "NULLABLE", "")
        arcpy.AddField_management(outFeatures, fieldName6, "TEXT", "", "", fieldLength6)
        arcpy.AddField_management(outFeatures, fieldName7, "TEXT", "", "", fieldLength7)

        # Calculate the 24-hour hour
        expression6 = "getHour(str(!GPS_DATE!))"
        codeblock6 = """def getHour(GPS_DATE):
        hh = GPS_DATE.split("/")[2].split(" ")[1].split(":")[0]
        am = GPS_DATE.split("/")[2].split(" ")[2][0]
        ampm = 0
        if am == 'P':
            ampm = 12
        hhhh = int(hh) % 12 + ampm
        return str(hhhh)"""
        arcpy.CalculateField_management(outFeatures, fieldName6, expression6, "PYTHON_9.3", codeblock6)

        # Calculate ascending letters from A to Z
        myList = map(chr, range(65, 91))
        cursor = arcpy.UpdateCursor(outFeatures)
        row = cursor.next()
        while row:
            for i in myList:
                row.TheLetter = i
                cursor.updateRow(row)
                row = cursor.next()
                if row == None:
                    break
        del cursor
        del row

        # Execute CalculateField to build GIS_ID
        expression1 = "getClass(!GPS_Person!, str(!GPS_DATE!), !TheHour!, !TheLetter!, str(!ID!))"
        codeblock1 = """def getClass(GPS_Person, GPS_DATE, TheHour, TheLetter, ID):
        mo = GPS_DATE.split("/")[0]
        dd = GPS_DATE.split("/")[1]
        yyyy = GPS_DATE.split("/")[2].split(" ")[0]
        hrs = TheHour
        letter = TheLetter
        nu = '1'
        if len(mo) == 1:
            mo = '0' + mo
        if len(dd) == 1:
            dd = '0' + dd
        if len(hrs) == 1:
            hrs = '0' + hrs
        return GPS_Person[0:1].upper() + mo + dd + hrs + letter + yyyy + GPS_Person[0:3].upper() + nu"""
        arcpy.CalculateField_management(outFeatures, fieldName1, expression1, "PYTHON_9.3", codeblock1)
        inTable = inPointFeatures
        joinFieldA = "OBJECTID"
        outputjoinField = "ID"

        # Convert outFeatures data to NAD83_2011_California_Teale_Albers in outFeatures83 (added 4/15/16 by Craig)
        out_coordinate_system = arcpy.SpatialReference('NAD 1983 (2011) California (Teale) Albers (Meters)')
        arcpy.Project_management(outFeatures, outFeatures83, out_coordinate_system, "WGS_1984_(ITRF08)_To_NAD_1983_2011")

        # Add X Y fields
        arcpy.AddXY_management(outFeatures83)
        arcpy.CalculateField_management(outFeatures83, fieldName4, '!POINT_X!', "PYTHON")
        arcpy.CalculateField_management(outFeatures83, fieldName5, '!POINT_Y!', "PYTHON")

        # Convert X Y to Lat/Lon
        arcpy.CalculateField_management(outFeatures83, fieldName3, "arcpy.PointGeometry(!Shape!.firstPoint, !Shape!.spatialReference).projectAs(arcpy.SpatialReference(6318)).firstPoint.X", "PYTHON_9.3")
        arcpy.CalculateField_management(outFeatures83, fieldName2, "arcpy.PointGeometry(!Shape!.firstPoint, !Shape!.spatialReference).projectAs(arcpy.SpatialReference(6318)).firstPoint.Y", "PYTHON_9.3")

        # Delete fields and the temporary feature classes
        dropFields = ["POINT_X", "POINT_Y", "Join_Count", "TARGET_FID_1", "Join_Count_1", "TARGET_FID_12", "TheHour", "TheLetter", "Join_Count_12", "AREA", "PERIMETER", "DRGINDEX_", "DRGINDEX_I", "OCODE", "USGS100", "USGS250", "CD", "AREA_1", "PERIMETER_1", "AREA_12", "PERIMETER_12", "PLSFILL_", "PLSFILL_ID", "X_COORD", "Y_COORD", "COUNTY_", "COUNTY_ID", "Range_1", "Staff"]
        arcpy.DeleteField_management(outFeatures83, dropFields)
        dropFieldsIn = ["ID", "GUID", "GPS_DATE", "ActualDate"]
        arcpy.DeleteField_management(inPointFeatures, dropFieldsIn)
        arcpy.Delete_management(outFeatures)
        arcpy.Delete_management(arcpy.env.workspace + "/PExportTemp")

        # Get the map document and data frame, then add the result as a symbolized layer
        mxd = arcpy.mapping.MapDocument("CURRENT")
        df = arcpy.mapping.ListDataFrames(mxd, "*")[0]
        # fields: field-visibility string defined earlier (not shown on the slide)
        arcpy.MakeFeatureLayer_management(outFeatures83, "PExport", "", "", fields)
        newlayer = arcpy.mapping.Layer("PExport")  # create a new layer
        symb = r"G:\OMR\AMLU\GIS\Tools\Scripts\Layers\Symbology.lyr"
        arcpy.ApplySymbologyFromLayer_management(newlayer, symb)
        arcpy.mapping.AddLayer(df, newlayer, "TOP")
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()
        arcpy.AddMessage("PExport created")

        # Boolean condition to skip or proceed with photos
        ischecked = arcpy.GetParameterAsText(1)
        if str(ischecked) == 'true':
            arcpy.AddMessage("Exporting photos")
            # Populate the photo attachment table with the GIS_ID field
            inPhotos = inPointFeatures + "__ATTACH"  # e.g. "AMLU_Generic_2015_V5__ATTACH"
            result = arcpy.GetCount_management(inPhotos)
            count = int(result.getOutput(0))
            if count > 0:
                arcpy.AddMessage("photos to export")
                joinTable = outFeatures83
                joinField = "REL_GLOBALID"
                outputjoinField = "GUID"
                arcpy.JoinField_management(inPhotos, joinField, joinTable, outputjoinField, "GIS_ID")
                arcpy.AddMessage("Photo join complete")
                # Get the output photo folder
                fileLocation = lvl_down(lvl_down(gdb_path)) + os.sep + "Photos"
                # Write photos to the output folder, renamed to GIS_ID_Photo#.jpg
                with arcpy.da.SearchCursor(inPhotos, ['DATA', 'ATT_NAME', 'ATTACHMENTID', 'GIS_ID']) as cursor:
                    for item in cursor:
                        attachment = item[0]
                        filename = str(item[3]) + "_" + str(item[1])
                        open(fileLocation + os.sep + filename, 'wb').write(attachment.tobytes())
                        del attachment
                        del filename
                        del item
                dropFieldsPhoto = ["GIS_ID"]
                arcpy.DeleteField_management(inPhotos, dropFieldsPhoto)
                arcpy.AddMessage("Photo export complete")
            else:
                arcpy.AddError("No photos in attachment table {0}".format(inPhotos))
        else:
            arcpy.AddMessage("Photos not exported")
    else:
        arcpy.AddError("{0} has no features.".format(inFeatures))

except NoFeatures:
    # The input has no GPS_Person or Date features
    # print "Input table has no GPS_Person or Date features."
    pass
except Exception, e:
    # If an error occurred, print the line number and error message
    import traceback, sys
    tb = sys.exc_info()[2]
    print e.message

Creates the final dataset ready for import into the database
Exports photos in the attachments table with names based on a field in the feature class
Uses spatial joins for some attributes
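The attachment-export loop above can be tried outside ArcGIS by standing in for the `da.SearchCursor` rows with plain tuples; `export_attachments` and the sample row are illustrative, but the filename scheme (GIS_ID + "_" + ATT_NAME) and the `.tobytes()` write mirror the script:

```python
import os
import tempfile

def export_attachments(rows, out_folder):
    # rows stands in for a cursor over (DATA, ATT_NAME, ATTACHMENTID, GIS_ID);
    # in the real script DATA is the attachment BLOB, written out with .tobytes()
    written = []
    for data, att_name, att_id, gis_id in rows:
        filename = str(gis_id) + "_" + str(att_name)
        out_path = os.path.join(out_folder, filename)
        with open(out_path, "wb") as f:
            f.write(data.tobytes())
        written.append(out_path)
    return written

folder = tempfile.mkdtemp()
rows = [(memoryview(b"\xff\xd8 fake jpeg bytes"), "Photo1.jpg", 1, "C0101A2015CRA1")]
for p in export_attachments(rows, folder):
    print(p)  # ends with C0101A2015CRA1_Photo1.jpg
```

Using a `with` block around the file write (unlike the bare `open(...).write(...)` on the slide) also releases the file handle promptly, which matters when ArcMap is already holding locks.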

Script 6 Excel Exporter

# Import arcpy module
import arcpy, os, subprocess

def lvl_down(path):
    return os.path.split(path)[0]

# Input variables
PExport = arcpy.GetParameterAsText(0)
desc = arcpy.Describe(PExport)
path = desc.path
xls_path = lvl_down(lvl_down(path)) + os.sep + "XLS"
Draw_the_Boundary = arcpy.GetParameterAsText(1)
if Draw_the_Boundary == '#' or not Draw_the_Boundary:
    Draw_the_Boundary = "in_memory\\{9D6CE50C-AA28-4B6B-A40A-C45860EC5058}"
xls_name = arcpy.GetParameterAsText(2)
xls = xls_path + os.sep + xls_name + ".xls"
Output_Feature_Class = r"in_memory\temp"
if arcpy.Exists(Output_Feature_Class):
    arcpy.Delete_management(Output_Feature_Class)

# Clip the points
arcpy.Clip_analysis(PExport, Draw_the_Boundary, Output_Feature_Class, "")

# Export the spreadsheet
arcpy.TableToExcel_conversion(Output_Feature_Class, xls, "NAME", "CODE")

# Find the site GIS_ID
with arcpy.da.SearchCursor(Output_Feature_Class, 'GIS_ID', """"Feature_Ty" = 'Site'""") as cursor:
    for row in cursor:
        arcpy.AddMessage('The site GIS ID is {0}'.format(row[0]))

# Open the result in Excel
os.system("start excel.exe " + xls)

Uses a Feature Set parameter so the user can draw a polygon in the tool for the clip
Creates an Excel spreadsheet of the clipped points
Uses a search cursor and AddMessage to print the site GIS_ID in the script window
Opens the spreadsheet automatically in Excel
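The `lvl_down` helper the exporters share simply strips the last path component, so two applications of it walk from the workspace up to the project root; the sample path below is hypothetical:

```python
import os

def lvl_down(path):
    # Drop the last path component, i.e. move one directory level up
    return os.path.split(path)[0]

gdb = "G:/OMR/AMLU/GIS/Project/Data/base.gdb"
print(lvl_down(gdb))            # G:/OMR/AMLU/GIS/Project/Data
print(lvl_down(lvl_down(gdb)))  # G:/OMR/AMLU/GIS/Project
```

This is how the Excel exporter derives its XLS folder (`lvl_down(lvl_down(path)) + os.sep + "XLS"`) from the feature class's workspace path.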

Challenges

Escaping backslashes in Windows paths!
Inexplicable locks on data in ArcMap preventing deletion
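The backslash problem arises because Python treats sequences like \n and \t in ordinary string literals as escapes, so a Windows path can silently corrupt; raw strings, doubled backslashes, or os.path.join all avoid it (the paths here are made up):

```python
import os

broken = "G:\Tools\new_layers"   # "\n" becomes a newline; "\T" only survives by luck
assert "\n" in broken            # the path is corrupted

# Safer spellings:
raw = r"G:\Tools\new_layers"       # raw string: backslashes kept literally
doubled = "G:\\Tools\\new_layers"  # explicitly escaped backslashes
joined = os.path.join("G:\\", "Tools", "new_layers")  # let the os module build it
assert raw == doubled
```

Raw strings have one wrinkle: they cannot end in a single backslash (`r"G:\Tools\"` is a syntax error), so trailing separators are best added with os.sep or os.path.join.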