Coding in Python and Basics of Tensorflow


1 Coding in Python and Basics of Tensorflow
Dr. Hien NGUYEN; Ilker GURCAN (Teaching Assistant)

2 Coding in Python-Hello World
Commenting and the Print Function:

    # For a single-line comment
    '''
    If you want a comment that spans multiple lines,
    you may use triple quotes.
    '''
    print('You may print out using %s for printing %d lines, or any number of lines' % ('print()', 10))
    print('\n' * 3)  # You may also repeat strings by multiplying them
    print("I don't like ", end="")
    print("new lines")

3 Coding in Python-Hello World
List:

    groceryList = ['Juice', 'Tomatoes', 'Potatoes']
    otherEvents = ['Wash car', 'Pick up kids', 'Cash check']
    toDoList = [otherEvents, groceryList]
    print('What I should do:', toDoList[1][1])  # Will print out "Tomatoes"
    groceryList.append(item)
    groceryList.insert(index, item)
    groceryList.remove(…)
    del otherEvents[2]

And many others such as sort, reverse, etc. There are also built-in functions that apply to container-typed classes, such as len(list1), max(list2), min(listN), etc.

4 Coding in Python-Hello World
Tuples:

    piTuple = (3, 1, 4, 1, 5, 9)  # Unmodifiable (immutable) list
    newList = list(piTuple)       # Convert to list
    newTuple = tuple(newList)     # Convert back to a tuple

Map (dictionary):

    superHeroes = {'Fiddler': 'Isaac Chaney', 'iliTheBlack': 'Ilker GURCAN', 'Tulu Girl': 'Tulun ERGIN'}
    print(superHeroes['iliTheBlack'])  # Will print out "Ilker GURCAN"
    print(superHeroes.keys())
    print(superHeroes.values())
    superHeroes['Tulu Girl'] = 'Michael Jordan'
    del superHeroes['Fiddler']

5 Coding in Python-Hello World
Loops and Conditions:

Looping through an iterable object:

    for x in range(0, 10):
        print(x, ' ', end="")
    for y in groceryList:
        print(y, ' ', end="")
    for x in [1, 2, 4, 7, 9]:
        print(x)

An example of both conditionals and a while loop:

    i = 0
    while i < 20:
        if i % 2 == 0:
            print(i)
        elif i == 9:
            break
        else:
            i += 1
            continue
        i += 1  # This line is skipped whenever the else block runs, because of the continue

6 Coding Python-Methods
Method signature:

    def methodName(method_args=default_args):
        doSomething
        [return return_val]

Calling a function: methodName(args)

There is no notion of call by value or pass by reference as in C/C++. Python passes object references by value (often called "call by object reference"): rebinding a parameter inside a function does not affect the caller, but mutating a mutable argument (e.g. a list) is visible to the caller. Check this link for more info.
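A minimal sketch of this argument-passing behavior (the function and variable names below are illustrative, not from the slides):

    def rebind_and_mutate(number, items):
        number = number + 1      # Rebinds the local name only; the caller's int is unchanged
        items.append('added')    # Mutates the shared list object; the caller sees this change

    n = 10
    lst = ['original']
    rebind_and_mutate(n, lst)
    print(n)    # 10 -> the immutable int was not affected
    print(lst)  # ['original', 'added'] -> the mutable list was modified in place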

7 Coding Python-Classes
    class Animal:
        __name = ""   # Double underscore means that the field is private
        __height = 0
        __weight = 0
        __sound = ""

        def __init__(self, name, height, weight, sound):  # Constructor
            self.__name = name
            self.__height = height
            self.__weight = weight
            self.__sound = sound

        def setName(self, name):
            self.__name = name

        def getName(self):
            return self.__name

        def getType(self):
            print("Animal")  # For the sake of polymorphism

        def getSound(self):
            return self.__sound

        def toString(self):
            return "{} is {} cm tall and {} kilograms and says {}".format(
                self.__name, self.__height, self.__weight, self.__sound)

8 Coding Python-Inheritance
    class Dog(Animal):  # Inheritance
        __owner = ""

        def __init__(self, name, height, weight, sound, owner):
            self.__owner = owner
            super(Dog, self).__init__(name, height, weight, sound)

        def setOwner(self, owner):
            self.__owner = owner

        def getType(self):  # Overridden
            print("Dog")  # Remember it for polymorphism

        def toString(self):  # Overridden
            parStr = super(Dog, self).toString()
            return (parStr + ". His owner is {}").format(self.__owner)

        def multipleSounds(self, howMany=None):  # New method
            if howMany is None:
                print(self.getSound())
            else:
                print(self.getSound() * howMany)

9 Coding Python-Polymorphism
    class AnimalTesting:
        def getType(self, animal):
            animal.getType()

Create two Animals:

    cat = Animal('Whiskers', 33, 10, 'Meow')
    print(cat.toString())
    dog = Dog("Newton", 110, 50, "Hoowww", "iliTheBlack")
    print(dog.toString())
    dog.multipleSounds(4)

Now where polymorphism comes into play:

    testAnimal = AnimalTesting()
    testAnimal.getType(cat)  # Works regardless of the concrete type and prints out "Animal"
    testAnimal.getType(dog)  # Works regardless of the concrete type and prints out "Dog"

10 Tensorflow
An open-source software library for numerical computation using data flow graphs.
Works on both CPU and GPU.
Multi-platform support: desktop, server, or mobile device.
The initial purpose was to conduct research on machine learning and neural networks; it is now applicable to many other domains. You could even build a graph that solves Dijkstra's shortest-path problem.
An alternative to libraries such as Torch.
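A minimal sketch of the graph-and-session model described above (a hypothetical example, not from the slides; it uses the TensorFlow 1.x API that the rest of the deck assumes):

    import tensorflow as tf

    # Build the data flow graph: nothing is computed yet, we only describe operations
    a = tf.constant(3.0, name='a')
    b = tf.constant(4.0, name='b')
    total = tf.add(a, b, name='total')

    # Execute the graph inside a session
    with tf.Session() as sess:
        print(sess.run(total))  # 7.0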

11 Tensorflow-Data Structures
Constants: tf.zeros(…), tf.ones(…), and tf.constant(…)

    const_val = tf.constant(9, dtype=tf.float32, shape=[2, 2])  # [[9, 9], [9, 9]]

Placeholders: placeholders are used to feed graphs with the input data. As the name suggests, they are placeholders for actual tensors.

    import numpy as np
    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=(1024, 1024))
    y = tf.matmul(x, x)
    with tf.Session() as sess:
        rand_array = np.random.rand(1024, 1024)
        sess.run(y, feed_dict={x: rand_array})

12 Tensorflow-Data Structures
Variables
To train a model you should use variables (weights, biases, counters, etc.). They hold and update the parameters of your model, and they can be saved to and restored from disk. There are some important concepts specific to variables (a small sketch follows this list):
Creation
Device placement
Initialization
Saving/Restoring
Sharing them across separate model executions
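A minimal sketch of the variable life cycle: creation, initialization, saving, and restoring. The variable name and the checkpoint path '/tmp/model.ckpt' are illustrative assumptions, not taken from the slides:

    import tensorflow as tf

    # Creation: a variable holding a 2x3 weight matrix
    weights = tf.Variable(tf.random_normal([2, 3]), name='weights')

    saver = tf.train.Saver()                       # Handles saving/restoring variables
    init_op = tf.global_variables_initializer()

    with tf.Session() as sess:
        sess.run(init_op)                          # Initialization must run before the variable is used
        saver.save(sess, '/tmp/model.ckpt')        # Save to disk

    with tf.Session() as sess:
        saver.restore(sess, '/tmp/model.ckpt')     # Restore from disk; no initializer needed
        print(sess.run(weights))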

13 Tensorflow-Data Structures
Variable creation: tf.Variable(aTensor, name="weights")
You may use either built-in functions to create the initial tensor:
    aTensor = tf.random_normal(…) / tf.zeros(…) / tf.range(…), etc.
or a Python array:
    aTensor = tf.constant(aPythonArray)
Calling tf.Variable(…) adds several ops to the graph (see the sketch below):
A variable op holding the variable's value.
An initializer op that sets the variable to its initial value; it is in fact a tf.assign operation.
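A small sketch of inspecting the ops that tf.Variable adds to the graph (the printed op names in the comments are indicative; exact names may differ):

    import tensorflow as tf

    biases = tf.Variable(tf.zeros([10]), name='biases')

    print(biases.op.name)           # 'biases' -> the variable op holding the value
    print(biases.initializer.name)  # e.g. 'biases/Assign' -> the initializer, an assign op
    print(biases.initial_value)     # The tensor the initializer assigns (here tf.zeros([10]))

    with tf.Session() as sess:
        sess.run(biases.initializer)  # Run just this variable's initializer
        print(sess.run(biases))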

14 Tensorflow-Data Structures
Device placement of variables:

    with tf.device("/cpu:0"):  # You may use any enumerated device
        v = tf.Variable(…)

Note, however, that parameter-update operations and the optimizer must run on the same device as the one on which these variables were created.

Initialization: variable initializers must be run before everything else. Since initializing every variable by hand is tedious, TensorFlow has a global initializer used as follows:

    init_op = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init_op)

15 Graph Building Process
Tensorflow programs are composed of several parts. In fact, any meaningful neural network has the following three components (a small end-to-end sketch follows this list):
Inference: building the model, i.e. defining all operations carried out by the graph, including weights, biases, activation functions, and net (pre-activation) functions.
Loss: how we measure the error between the actual (target) output and the one produced by the network. The goal is to minimize this loss function using an optimizer. There are various built-in cost functions in TensorFlow: softmax loss, cross-entropy loss, MSE, etc.
Train: mainly composed of (but not restricted to) the following operations: defining summary writers, creating an optimizer (Gradient Descent, SGD, Adam, etc.), and associating the loss with the optimizer, returning the train op to be run in the session.
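A minimal, hypothetical inference/loss/train sketch for a one-layer linear model (the shapes, learning rate, and random data are illustrative assumptions, not from the slides):

    import numpy as np
    import tensorflow as tf

    # Inference: a single linear layer y = x * W + b
    x = tf.placeholder(tf.float32, shape=(None, 3))
    y_ = tf.placeholder(tf.float32, shape=(None, 1))   # Target output
    W = tf.Variable(tf.random_normal([3, 1]), name='weights')
    b = tf.Variable(tf.zeros([1]), name='bias')
    y = tf.matmul(x, W) + b

    # Loss: mean squared error between prediction and target
    loss = tf.reduce_mean(tf.square(y - y_))

    # Train: a gradient-descent step that minimizes the loss
    train_op = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        data_x = np.random.rand(32, 3).astype(np.float32)
        data_y = np.random.rand(32, 1).astype(np.float32)
        for step in range(100):
            _, loss_val = sess.run([train_op, loss], feed_dict={x: data_x, y_: data_y})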

16 Visualizing Learning Tensorboard
A tool to visualize the output of your graph. You may attach multiple summary collectors to a graph. You may visualize the outcome of a particular layer by monitoring its variables:

    def variable_summaries(var):
        """Attach a lot of summaries to a Tensor (for TensorBoard visualization)."""
        with tf.name_scope('summaries'):
            mean = tf.reduce_mean(var)
            tf.summary.scalar('mean', mean)
            with tf.name_scope('stddev'):
                stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
            tf.summary.scalar('stddev', stddev)
            tf.summary.scalar('max', tf.reduce_max(var))
            tf.summary.scalar('min', tf.reduce_min(var))
            tf.summary.histogram('histogram', var)

17 Visualizing Learning
You may also attach a summary writer to your loss function, so that you can visualize how your cost function behaves.

    with tf.name_scope('cross_entropy'):
        # The raw formulation of cross-entropy,
        #
        #   tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(tf.nn.softmax(y)),
        #                                 reduction_indices=[1]))
        #
        # can be numerically unstable.
        #
        # So here we use tf.nn.softmax_cross_entropy_with_logits on the
        # raw outputs of the nn_layer above, and then average across
        # the batch.
        diff = tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y)
        with tf.name_scope('total'):
            cross_entropy = tf.reduce_mean(diff)
    tf.summary.scalar('cross_entropy', cross_entropy)

18 Visualizing Learning
Writing into a log file which is read and parsed by TensorBoard:

    summary = tf.summary.merge_all()
    with tf.Session() as sess:
        summary_writer = tf.summary.FileWriter(FLAGS.log_dir, sess.graph)
        …
        summary_str = sess.run(summary, feed_dict=feed_dict)
        summary_writer.add_summary(summary_str, step)
        summary_writer.flush()

After training is complete, run the following command to visualize the results in a web browser (make sure there is no space on either side of the equals sign):

    tensorboard --logdir=path/to/logFolder

19 Useful Links
Python Tutorial – language basics
Numpy Tutorial – all TensorFlow array operations are inspired by NumPy
Python Collections – lists, queues, maps, etc.
Installing Tensorflow on Ubuntu – you may also find installation instructions for other platforms at the link

