Connect two or more Azure Virtual Networks using one VPN Gateway
Peering is a feature that allows you to connect two or more virtual networks so that they act as one larger network. In this post we will see how we can connect two Azure Virtual Networks using peering and access the whole network using a single VPN Gateway. We can connect Virtual Networks whether or not they are in the same subscription.
I have created a diagram to help understand the topology.
- We have a Virtual Network with a Site-to-Site VPN to on-premises. It can also have a Point-to-Site connection configured. This is VNET A.
- We have another Virtual Network in the same subscription that we want to connect to it. This is VNET B.
- We can also have a third Virtual Network in a different subscription. This is VNET C.
In short, we need the following peerings with these specific settings:
- At the VNETA Peering VNETA to VNETB with “Allow Gateway transit”
- At the VNETA Peering VNETA to VNETC with “Allow Gateway transit”
- At the VNETB Peering VNETB to VNETA with “Use Remote Gateway”
- At the VNETB Peering VNETB to VNETC
- At the VNETC Peering VNETC to VNETA with “Use Remote Gateway”
- At the VNETC Peering VNETC to VNETB
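Before clicking through the portal, it can help to sanity-check the peering matrix above. Here is a small illustrative Python sketch (not an Azure API call; the flag names are shorthand for the portal's “Allow gateway transit” and “Use remote gateways” checkboxes):

```python
# Hub-and-spoke peering matrix from the list above, as plain data.
# The flag names are shorthand for the portal checkboxes, not Azure API fields.
HUB = "VNETA"  # the network that owns the VPN Gateway

peerings = {
    ("VNETA", "VNETB"): {"allow_gateway_transit": True},
    ("VNETA", "VNETC"): {"allow_gateway_transit": True},
    ("VNETB", "VNETA"): {"use_remote_gateways": True},
    ("VNETB", "VNETC"): {},
    ("VNETC", "VNETA"): {"use_remote_gateways": True},
    ("VNETC", "VNETB"): {},
}

def check(peerings, hub):
    """Every peering must exist in both directions, and the gateway flags
    must pair up: transit on the hub side, remote gateway on the spoke side."""
    for (src, dst), flags in peerings.items():
        assert (dst, src) in peerings, f"missing reverse peering {dst}->{src}"
        if src == hub:
            assert flags.get("allow_gateway_transit"), f"{src}->{dst} needs gateway transit"
        if dst == hub:
            assert flags.get("use_remote_gateways"), f"{src}->{dst} should use the remote gateway"
    return True

print(check(peerings, HUB))  # True
```

If a peering is missing in one direction, or a spoke forgets “Use remote gateways”, the check fails immediately, which is exactly the kind of mistake that is hard to spot in the portal.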
In order to connect all those networks and also access them through the VPN connection, there are four requirements:
- The account that will be used to create the peering must have the “Network Contributor” Role.
- The address spaces must be different and must not overlap.
- All Virtual Networks, except the one that has the VPN connection, must NOT have a VPN Gateway deployed.
- Of course, on the local VPN device (router) we need to add the address spaces of all the Virtual Networks that we need to access.
Let's lab it:
- HQ 192.168.0.0/16 –> The on-premises network
- VNET A 10.1.0.0/16 –> The Virtual Network that has the VPN Gateway (in my lab it is named “devvn”)
- VNET B 10.229.128.0/24 –> The virtual network in the same subscription as the Gateway network (in my lab it is named “Network prtg-rsg-vnet”)
- VNET C 172.16.1.0/24 –> The virtual network in a different subscription from the Gateway (in my lab it is named “provsevnet”)
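The no-overlap requirement from the list above is easy to verify with Python's standard ipaddress module; here is a quick check of the lab's address spaces:

```python
import ipaddress
from itertools import combinations

# The lab address spaces listed above
spaces = {
    "HQ":    "192.168.0.0/16",
    "VNETA": "10.1.0.0/16",
    "VNETB": "10.229.128.0/24",
    "VNETC": "172.16.1.0/24",
}

nets = {name: ipaddress.ip_network(cidr) for name, cidr in spaces.items()}
overlaps = [(a, b) for a, b in combinations(nets, 2)
            if nets[a].overlaps(nets[b])]
print(overlaps)  # [] -> no overlapping ranges, so the networks can be peered
```

Running the same check before adding a new VNet to the topology will catch an overlapping range before Azure rejects the peering.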
The on-premises network is connected with a Site-to-Site (IPsec) VPN to VNETA.
Now we need to connect VNETA and VNETB using VNet peering. In order to have a peering connection, we need to create one connection from VNETA to VNETB and one from VNETB to VNETA.
Open the VNETA Virtual Network, go to the Peerings settings and press +Add.
Select VNETB and check “Allow Gateway transit” to allow the peer virtual network to use your virtual network gateway.
Then open VNETB, go to the Peerings settings and click +Add.
Select the VNETA Virtual Network and check “Use Remote Gateway” to use the peer's virtual network gateway. This way VNETB will use VNETA's Gateway.
Now we can reach the VNETB network from our on-premises network.
Here is a multi-ping screenshot:
- From 10.229.128.5 (VNETB) to 192.168.0.4 (on-premises) & the opposite
- From 10.1.2.4 (VNETA) to 10.229.128.5 (VNETB) & to 192.168.0.4 (on-premises)
The next step is to create a cross-subscription peering between VNETA and VNETC.
Open VNETA and create a peering by selecting VNETC from the other subscription, and check “Allow gateway transit”.
Then go to VNETC and create a peering with VNETA, and check “Use remote gateway”.
With the two above connections we have connectivity between the on-premises network and the VNETC.
The final step is to enable connectivity between VNETB & VNETC. To accomplish this, just create one peering from VNETB to VNETC and one from VNETC to VNETB.
In order to have client VPN connectivity to the whole network, create a Point-to-Site VPN at VNETA. You can follow this guide: Azure Start Point | Point-to-Site VPN
The post Connect two or more Azure Virtual Networks using one VPN Gateway appeared first on Apostolidis IT Corner.
New UI for the Azure VM Creation Wizard
Microsoft has revamped the UI for creating a new Azure VM. The information is now much more compact, with way less scrolling, and it's easier to move through the dozens and dozens of options.
It's not only about the UI. You will see many more options available during VM creation, such as adding data disks!
Also, there is no longer a need to go through all the options; you can “Review & Create” the VM at any step once you have completed all the required fields, or just accept the defaults.
Azure Backup | Enable backup alert notifications
Azure Backup generates alerts for all backup events, such as unsuccessful backups. A new option is backup alert notifications, so Azure Backup can notify you by email when an alert is generated.
To enable backup alert notifications, navigate to the “Backup Alerts” section of the Recovery Services vault and click “Configure notifications”.
There, switch Email notification to On to enable the alerts. Enter one or more recipients separated by semicolons (;). Then choose Per Alert or Hourly Digest: Per Alert fires an email for every alert instantly, while Hourly Digest means the notification agent checks for alerts every hour and sends one email with the active alerts.
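As an aside, the difference between the two delivery modes can be sketched in a few lines of Python (purely illustrative; this is not how Azure Backup is implemented):

```python
# Purely illustrative: contrast "Per Alert" and "Hourly Digest" delivery.
recipients = "admin@contoso.com;ops@contoso.com".split(";")  # semicolon-separated

active_alerts = ["Backup failed: vm1", "Backup failed: vm2", "Backup delayed: vm3"]

def per_alert(alerts):
    """One email per alert, sent as soon as each alert fires."""
    return [[a] for a in alerts]

def hourly_digest(alerts):
    """One email per hour bundling all currently active alerts."""
    return [alerts] if alerts else []

print(len(per_alert(active_alerts)))      # 3 emails
print(len(hourly_digest(active_alerts)))  # 1 email
```

In short: three active alerts mean three immediate emails in Per Alert mode, but a single bundled email per hour in Hourly Digest mode.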
Finally, choose the severity of the alerts you want to be notified about and press Save.
Azure Portal | Virtual Machines bulk actions
Azure Portal is a great GUI tool for administering all your Azure resources, and it keeps evolving. Here is a very useful tip: did you know that you can manage Virtual Machines in bulk from the Azure Portal Virtual Machines section? We have virtual machine bulk actions!
Not only can we Assign Tags, Start, Restart, Stop and Delete Virtual Machines in bulk, but we can also configure Change Tracking, Inventory and Update Management!
Filter the Virtual Machines you need and just click “Change Tracking” to get a report of all changes that happen inside the VM, such as changes to services for Windows, daemons for Linux, applications and files.
Use “Inventory” to get a complete inventory of all the applications installed on the VMs, enabling consistent control and compliance of these virtual machines.
Enable “Update Management” to manage the updates of the selected Virtual Machines: create update policies and control the installation of updates.
Alice envisions the future, Athens August 27th – August 30th, 2018
This year I am honored to be part of such an inspiring event. Microsoft brought 160 girls, 60 teachers and top-notch speakers to a 4-day workshop with access to the latest tech, a unique Artificial Intelligence bootcamp: Alice envisions the future.
Thank you, girls, for an amazing and inspirational time; it was a privilege to watch you complete all the challenges!
The Greek Microsoft MSPs & MVPs, together with Kateřina Zahradníčková:
Me and George Markou, the two MVPs, at Hilton Athens, the third day of the event:
The girls in action, solving all the challenges:
Event Link: https://www.microsoft.com/en-mt/ai4girls
Application Security Groups to simplify your Azure VMs network security
Application Security Groups help manage the security of Azure Virtual Machines by grouping them according to the applications that run on them. It is a feature that allows an application-centric use of Network Security Groups.
An example is always the best way to understand a feature. So let's say that in a subnet we have some web servers and some database servers. The access rules of the subnet's Network Security Group that allow HTTP, HTTPS & database access to those servers will look something like this:
Using only the Network Security Groups functionality, we need to add the IP addresses of the servers to the access lists. There are two major difficulties here:
- For every rule we need to add all the IPs of the servers that will be included.
- If an IP address changes (e.g. by adding or removing a server), then all the related rules must change.
Use Application Security Groups
Now, let's see how we can bypass this complexity by using Application Security Groups combined with Network Security Groups.
Create two Application Security Groups, one for the web servers and one for the database servers.
In the Azure Portal, search for Application Security Groups.
Provide a name and a Resource Group.
Create one more named Database Servers; the Resource Group will then contain those two Application Security Groups:
Then go to each Virtual Machine and attach the relevant ASG.
Click the Virtual Machine, go to the Networking settings blade, and press “Configure the application security groups”.
Select the relevant ASG and press Save:
Do the same for all your servers. Finally, open the Network Security Group. Open the HTTPS rule; in my example it is the “https2WebServers” rule. Change the Destination to “Application Security Group” and, for Destination application security group, select Web Servers.
In the same way, change the database access rule: for Source add the “Database Servers” ASG and for Destination the “Web Servers” ASG. Now the NSG will look like this:
From now on, when removing a VM from the web servers farm or the database servers cluster, there is no need to change anything in the NSG. When adding a new VM, the only thing we need to do is attach it to the relevant Application Security Group.
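The maintenance win can be modeled abstractly. In this illustrative Python sketch (all names, IPs and ports are made up for the example), rules reference ASG names instead of IP lists, so scaling the farm only touches group membership:

```python
# Illustrative model only: NSG rules point at ASG names, and membership
# maps resolve names to addresses. All names, IPs and ports are invented.
asgs = {
    "WebServers": {"10.0.1.4", "10.0.1.5"},
    "DatabaseServers": {"10.0.2.4"},
}

rules = [
    {"name": "https2WebServers", "port": 443, "dest_asg": "WebServers"},
    {"name": "dbAccess", "port": 1433,
     "src_asg": "DatabaseServers", "dest_asg": "WebServers"},
]

def destinations(rule):
    """Resolve a rule's destination ASG to its current member addresses."""
    return asgs[rule["dest_asg"]]

# Scale out: attach the new VM to the ASG -- the rules stay untouched.
asgs["WebServers"].add("10.0.1.6")
print(sorted(destinations(rules[0])))  # the new VM is already covered
```

The point of the sketch: adding the new address to the group is the only change; every rule that references the group picks it up automatically, which is exactly the behavior ASGs give you in Azure.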
A Virtual Machine can be attached to more than one Application Security Group. This helps in cases of multi-application servers.
There are only two requirements:
- All network interfaces used in an ASG must be within the same VNet
- If ASGs are used in the source and destination, they must be within the same VNet
Secure MySQL and PostgreSQL using Service Endpoints
In a previous post, Secure the Azure SQL Database inside a VNET using service endpoints, we saw how we can use Azure Virtual Network Service Endpoints to secure an Azure SQL database so that it is accessible only from the internal network.
Today, Microsoft Azure announced the general availability of Service Endpoints for MySQL and PostgreSQL. This makes it possible to cut off all public access to MySQL & PostgreSQL and allow access only from our internal network. Of course, a specific subnet or subnets can be defined. Also, there is no extra charge for using Service Endpoints.
You can read more on the Microsoft Azure Blog: Announcing VNet service endpoints general availability for MySQL and PostgreSQL
This post is reposted from the Microsoft Azure Blog : What is Artificial Intelligence? <azure.microsoft.com/blog/what-is-artificial-intelligence/>
Aug 9th 2018, 12:00, by Theo van Kraay
It has been said that Artificial Intelligence will define the next generation of software solutions. If you are even remotely involved with technology, you will almost certainly have heard the term with increasing regularity over the last few years. It is likely that you will also have heard different definitions for Artificial Intelligence offered, such as:
*“The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.”* – Encyclopedia Britannica
*“Intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans.”* – Wikipedia
How useful are these definitions? What exactly are “tasks commonly associated with intelligent beings”? For many people, such definitions can seem too broad or nebulous. After all, there are many tasks that we can associate with human beings! What exactly do we mean by “intelligence” in the context of machines, and how is this different from the tasks that many traditional computer systems are able to perform, some of which may already seem to have some level of *intelligence* in their sophistication? What exactly makes the *Artificial Intelligence* systems of today different from sophisticated software systems of the past?
It could be argued that any attempt to try to define “Artificial Intelligence” is somewhat futile, since we would first have to properly define “intelligence”, a word which conjures a wide variety of connotations. Nonetheless, this article attempts to offer a more accessible definition for what passes as Artificial Intelligence in the current vernacular, as well as some commentary on the nature of today’s AI systems, and why they might be more aptly referred to as “intelligent” than previous incarnations.
Firstly, it is interesting and important to note that the technical difference between what used to be referred to as Artificial Intelligence over 20 years ago and traditional computer systems is close to zero. Prior attempts to create intelligent systems, known as *expert systems* at the time, involved the complex implementation of exhaustive rules that were intended to approximate *intelligent behavior*. For all intents and purposes, these systems did not differ from traditional computers in any drastic way other than having many thousands more lines of code. The problem with trying to replicate human intelligence in this way was that it requires far too many rules and ignores something very fundamental to the way *intelligent beings* make *decisions*, which is very different from the way traditional computers process information.
Let me illustrate with a simple example. Suppose I walk into your office and I say the words “Good Weekend?” Your immediate response is likely to be something like “yes” or “fine thanks”. This may seem like very trivial behavior, but in this simple action you will have immediately demonstrated a behavior that a traditional computer system is completely incapable of. In responding to my question, you have effectively dealt with ambiguity by making a prediction about the correct way to respond. It is not certain that by saying “Good Weekend” I actually intended to ask you whether you had a good weekend. Here are just a few possible *intents* behind that utterance:
- Did you have a good weekend?
- Weekends are good (generally).
- I had a good weekend.
- It was a good football game at the weekend, wasn’t it?
- Will the coming weekend be a good weekend for you?
The most likely intended meaning may seem obvious, but suppose that when you respond with “yes”, I had responded with “No, I mean it was a good football game at the weekend, wasn’t it?”. It would have been a surprise, but without even thinking, you will absorb that information into a mental model, correlate the fact that there was an important game last weekend with the fact that I said “Good Weekend?” and adjust the probability of the expected response for next time accordingly so that you can respond correctly next time you are asked the same question. Granted, those aren’t the thoughts that will pass through your head! You happen to have a neural network (aka “your brain”) that will absorb this information automatically and *learn* to respond differently next time.
The key point is that even when you do respond next time, you will still be making a prediction about the correct way in which to respond. As before, you won’t be certain, but if your prediction *fails* again, you will gather new data which leads to my definition of Artificial Intelligence:
“Artificial Intelligence is the ability of a computer system to deal with ambiguity, by making predictions using previously gathered *data*, and learning from errors in those predictions in order to generate newer, more accurate predictions about how to behave in the future”.
This is a somewhat appropriate definition of Artificial Intelligence because it is exactly what AI systems today are doing, and more importantly, it reflects an important characteristic of human beings which separates us from traditional computer systems: human beings are prediction machines. We deal with ambiguity all day long, from very trivial scenarios such as the above, to more convoluted scenarios that involve *playing the odds* on a larger scale. This is in one sense the essence of *reasoning*. We very rarely know whether the way we respond to different scenarios is absolutely correct, but we make reasonable predictions based on past experience.
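To make the definition concrete, here is a toy predict-and-learn loop in Python (purely illustrative; the class is invented for this post): it predicts the historically most frequent response and absorbs each observed outcome as new data.

```python
from collections import Counter

class Responder:
    """Toy illustration of the definition above: predict from previously
    gathered data, then learn from the observed outcome."""
    def __init__(self, default="yes"):
        self.default = default
        self.counts = Counter()  # previously gathered data

    def predict(self):
        # the historically most frequent response, or a default guess
        return self.counts.most_common(1)[0][0] if self.counts else self.default

    def learn(self, actual):
        self.counts[actual] += 1  # absorb the observed response

r = Responder()
for seen in ["yes", "yes", "no"]:
    r.learn(seen)
print(r.predict())  # yes

# A surprising answer keeps arriving; the prediction eventually flips.
for seen in ["Go England!!"] * 3:
    r.learn(seen)
print(r.predict())  # Go England!!
```

The loop never *knows* the right answer; it only makes its best prediction from past data and updates that data on every failure, which is the dynamic the definition describes.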
Just for fun, let's illustrate the earlier example with some code in R! First, let's start with some data that represents information in your mind about when a particular person has said “good weekend?” to you.
In this example, we are saying that *GoodWeekendResponse* is our *score label* (i.e. it denotes the appropriate response that we want to predict). For modelling purposes, there have to be at least two possible values, in this case “yes” and “no”. For brevity, the response in most cases is “yes”.
We can fit the data to a logistic regression model:
library(VGAM)
greetings <- read.csv('c:/AI/greetings.csv', header=TRUE)
fit <- vglm(GoodWeekendResponse~., family=multinomial, data=greetings)
Now what happens if we try to make a prediction with that model, where the expected response is different from what we have previously recorded? In this case, I am expecting the response to be “Go England!”. Below is some more code to add the prediction. For illustration we just hardcode the new input data; the console output is shown below the code:
response <- data.frame(FootballGamePlayed="Yes", WorldCup="Yes", EnglandPlaying="Yes", GoodWeekendResponse="Go England!!")
greetings <- rbind(greetings, response)
fit <- vglm(GoodWeekendResponse~., family=multinomial, data=greetings)
prediction <- predict(fit, response, type="response")
prediction
index <- which.max(prediction)
df <- colnames(prediction)
df[index]

            No Yes Go England!!
1 3.901506e-09 0.5          0.5
> index <- which.max(prediction)
> df <- colnames(prediction)
> df[index]
"Yes"
The initial prediction “yes” was wrong, but note that in addition to predicting against the new data, we also incorporated the actual response back into our existing model. Also note, that the new response value “Go England!” has been *learnt*, with a probability of 50 percent based on current data. If we run the same piece of code again, the probability that “Go England!” is the right response based on prior data increases, so this time our model *chooses* to respond with “Go England!”, because it has finally learnt that this is most likely the correct response!
            No       Yes Go England!!
1 3.478377e-09 0.3333333    0.6666667
> index <- which.max(prediction)
> df <- colnames(prediction)
> df[index]
"Go England!!"
Do we have Artificial Intelligence here? Well, clearly there are different *levels* of intelligence, just as there are with human beings. There is, of course, a good deal of nuance that may be missing here, but nonetheless this very simple program will be able to react, with limited accuracy, to data coming in related to one very specific topic, as well as learn from its mistakes and make adjustments based on predictions, without the need to develop exhaustive rules to account for different responses that are expected for different combinations of data. This is the same principle that underpins many AI systems today, which, like human beings, are mostly sophisticated prediction machines. The more sophisticated the machine, the more it is able to make accurate predictions based on a complex array of data used to *train* various models, and the most sophisticated AI systems of all are able to continually learn from faulty assertions in order to improve the accuracy of their predictions, thus exhibiting something approximating human *intelligence*.

Machine learning
You may be wondering, based on this definition, what the difference is between *machine learning* and *Artificial Intelligence*? After all, isn't this exactly what machine learning algorithms do, make predictions based on data using statistical models? This very much depends on the definition of *machine learning*, but ultimately most machine learning algorithms are *trained* on static data sets to produce predictive models, so machine learning algorithms only facilitate part of the dynamic in the definition of AI offered above. Additionally, machine learning algorithms, much like the contrived example above, typically focus on specific scenarios, rather than working together to create the ability to deal with *ambiguity* as part of an *intelligent system*. In many ways, machine learning is to AI what neurons are to the brain. A building block of intelligence that can perform a discrete task, but that may need to be part of a composite *system* of predictive models in order to really exhibit the ability to deal with ambiguity across an array of behaviors that might approximate to *intelligent behavior*.

Practical applications
There are a number of practical advantages in building AI systems, but as discussed and illustrated above, many of these advantages pivot around “time to market”. AI systems enable the embedding of complex decision making without the need to build exhaustive rules, which traditionally can be very time consuming to procure, engineer and maintain. Developing systems that can “learn” and “build their own rules” can significantly accelerate organizational growth.
Microsoft’s Azure cloud platform offers an array of discrete and granular services in the AI and Machine Learning domain <docs.microsoft.com/en-us/azure/#pivot=products&panel=ai> that allow AI developers and data engineers to avoid re-inventing wheels and to consume re-usable APIs. These APIs allow AI developers to build systems which display the type of *intelligent behavior* discussed above.
If you want to dive in and learn how to start building intelligence into your solutions with the Microsoft AI platform, including pre-trained AI services like Cognitive Services and the Bot Framework, as well as deep learning tools like Azure Machine Learning, Visual Studio Code Tools for AI, and Cognitive Toolkit, visit AI School <aischool.microsoft.com/learning-paths>.
Monitor & Alert for your Azure VM
Let's see how easy it is to monitor and create an alert in order to be notified when your VMs restart, start, stop, experience high CPU or memory usage, and much more.
First navigate to the Azure Portal https://portal.azure.com, and then click the Monitor button.
You will be navigated to the Monitor blade. At the center of the screen you will see three main buttons, each of which starts a wizard.
Click “Create Alert” under Explore monitoring essentials, the first of the three buttons.
The create rule wizard will start. First you need to Select target.
Select the subscription, set the Filter resource type to Virtual machines and select the VM from the Resource list.
Once you select the target VM, you will see a preview of the selection and the available signals.
After the alert target, select the criteria.
At the Configure signal logic blade, select the signal from the list. I have selected Restart Virtual Machine.
Once you select the signal, you can choose the severity level and you will see a preview of the condition.
After that, give the alert a name and a description. Also select the resource group where the alert will be saved and whether you want the alert to be enabled upon creation.
The next step is to create an action group. The action group is the list of accounts that get notified when the alert is triggered. The notification can be an email, SMS, push notification or voice call. You can add many action groups and many actions in each group.
Now the alert is ready. Once the alert is triggered, you will be notified. In this example I added an email action, and once the VM restarted I received the following email:
More Microsoft Azure guides at Apostolidis IT Corner
Azure Storage | Static Web Site
Microsoft Azure announced the ability to host static websites directly on Blob Storage, at the cost of Blob Storage! What does this mean? For 1 GB of space and 100,000 views, the cost is about €0.05 per month!
You can calculate the cost with the Azure Pricing Calculator at https://azure.microsoft.com/en-us/pricing/calculator/
What do we need? Just a Storage Account V2.
Once the Storage Account is created, first enable Static website in the Storage Account's Settings. As soon as we press Save, a virtual directory named $web is created. Click it to enter the blob and upload our content. Also note the Primary endpoint, because it is the URL of our site.
To upload content to the $web blob we can use the Storage Explorer.
And we are ready. Browse to the Static website URL; in my example it is https://proximagr.z6.web.core.windows.net/
Of course we can use our own domain. First create a CNAME record that points to the endpoint, and then go to Custom Domain and enter our CNAME.
And the result: