CodeCloudy

Azure | .Net | JQuery | Javascript | Umbraco

Umbraco Life Savers

Error:

The model item passed into the dictionary is of type ‘Umbraco.Web.Models.PartialViewMacroModel’, but this dictionary requires a model item of type ‘CWSStart.Web.Models.LoginViewModel’.

Solution:

http://stackoverflow.com/questions/28708282/this-dictionary-requires-a-model-item-of-type-umbraco-web-models-partialviewmac

The issue was that another member of my team had created a Banner.cshtml file in /Views/MacroPartials, and that partial view was being loaded instead of mine.

My case:

I had a view with the same name in the AuthSurface folder and in the Views folder.

 


IIS

Services that make up the IIS 7 Web server include:

| Service | Display Name | Description | Executable | Default Startup |
| --- | --- | --- | --- | --- |
| AppHostSVC | Application Host Helper Service | Stores configuration history and application pool mapping in history subdirectories at set intervals. | svchost.exe | Automatic |
| FTPSVC | FTP Publishing Service | Enables IIS 7 Web servers to be File Transfer Protocol (FTP) servers. | svchost.exe | Manual |
| IISADMIN | IIS Admin Service | Provides IIS 6.0 metabase configuration compatibility. | inetinfo.exe | Automatic |
| MSFTPSVC | FTP Publishing Service 6 | Enables IIS 6 Web servers to be File Transfer Protocol (FTP) servers. | inetinfo.exe | Manual |
| W3SVC | World Wide Web Publishing Service | Provides Web connectivity and administration through IIS Manager. | svchost.exe | Automatic |
| WAS | Windows Process Activation Service | Provides process activation, resource, and health management services for message-activated applications. | svchost.exe | Manual |
| WMSVC | Web Management Service | Enables remote management of a Web server. | wmsvc.exe | Manual |

To troubleshoot and manage services you can use the Services snap-in for the Microsoft Management Console. The MMC Services snap-in has features for:

  • Monitoring the status of services.
  • Starting, stopping, pausing and resuming, or disabling services.
  • Viewing service properties and dependencies.
  • Setting up recovery actions to deal with service failures.
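
If you prefer to script these checks, the same can be done from C# with the ServiceController class; a minimal sketch, using the W3SVC service from the table above:

using System.ServiceProcess; // add a reference to System.ServiceProcess

class ServiceCheck
{
    static void Main()
    {
        var service = new ServiceController("W3SVC");
        System.Console.WriteLine(service.Status); // e.g. Running

        // Start the service if it is stopped and wait until it is up
        if (service.Status == ServiceControllerStatus.Stopped)
        {
            service.Start();
            service.WaitForStatus(ServiceControllerStatus.Running);
        }
    }
}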

https://blogs.iis.net/tomwoolums/the-services-behind-internet-information-services-7-0


Javascript/JQuery/JSON

  1. Get the size of a JSON object

  var myObject = { 'name': 'Kasun', 'address': 'colombo', 'age': '29' };
  var count = Object.keys(myObject).length; // counts the top-level keys
  console.log(count); // 3

2. Passing a boolean C# variable to a JavaScript variable and setting it to true

In Web Forms:

myjavascript( <%= MyBooleanVariableInCSharp.ToString().ToLower() %> );

In Razor you may also want to try:

isFollowing: '@(Model.IsFollowing)' === '@true'

and an even better way is to use:

isFollowing: @Json.Encode(Model.IsFollowing)

	

The following exception was thrown: Access to the path denied. Exception details: System.UnauthorizedAccessException: Access to the path

According to the File.Delete method documentation…

An UnauthorizedAccessException means one of 4 things:

  1. The caller does not have the required permission.
  2. The file is an executable file that is in use.
  3. The path is a directory.
  4. The path specifies a read-only file (see the sketch below).
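
For the fourth cause, clearing the read-only attribute before deleting is a common fix; a minimal sketch (the file path is hypothetical):

using System.IO;

var path = @"C:\temp\report.txt"; // hypothetical path
var attributes = File.GetAttributes(path);
if ((attributes & FileAttributes.ReadOnly) != 0)
{
    // Clear the read-only flag so File.Delete does not throw
    File.SetAttributes(path, attributes & ~FileAttributes.ReadOnly);
}
File.Delete(path);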

Other answers:

Try setting the access permissions to “Full control” for the .NET user that reads/saves the files. You need to find the identity the website’s application pool runs under (by default this is the Application Pool Identity) and grant that identity the correct permissions.

 


Azure Scheduler 102 – Creating a Job

What is a Job Collection?

Simple Definition: a job collection is a group of jobs that runs in a particular region.

A job collection contains a group of jobs and maintains settings, quotas, and throttles that are shared by the jobs within it. A job collection is created by a subscription owner and groups jobs together based on usage or application boundaries. It is constrained to one region. It also allows quotas to be enforced to constrain the usage of all jobs in that collection; the quotas include MaxJobs and MaxRecurrence.

What is a Job?

Simple Definition: a task to be executed on a given schedule.

A job defines a single recurrent action, with simple or complex strategies for execution. Actions may include HTTP requests or storage queue requests.

 

How to create a Scheduler job?


 

Step 1: create a job collection


Note: this will create a standard job collection. If you need to change it to the free plan (which allows a maximum of 5 jobs), use the Scale configuration tab.

 

Step 2: Now you can choose the action type

  • Invoking a Web Service over HTTP/s

    OR

  • Post a message to a Windows Azure Storage Queue

 

 

Selecting the action type:


Invoking a Web Service over HTTP or HTTPS lets you choose the HTTP method.


 

Post a message to a Windows Azure Storage Queue shows inputs for entering the storage and queue access details.
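
For context, the queue a job posts to is an ordinary Azure storage queue. A sketch of creating one and posting a message with the classic WindowsAzure.Storage SDK (the connection string and queue name are placeholders):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

var account = CloudStorageAccount.Parse("<storage-connection-string>"); // placeholder
var queue = account.CreateCloudQueueClient().GetQueueReference("scheduler-jobs"); // hypothetical name
queue.CreateIfNotExists();
// The Scheduler job will add messages like this one on the configured schedule
queue.AddMessage(new CloudQueueMessage("Job payload"));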


 

You can use existing Azure Resources as well.


 

Step 3: Selecting a schedule

You can run it as a one-time job by

  • Running it immediately by selecting “Now”

Or

  • Running it at a specific time


 

OR you can create a recurring job


 

The job status can be monitored via the portal


The queue message will be created in the queue on the specified schedule.

Following is a job with a message viewed using storage explorer.


 

Creating a scheduler job may fail!

Creating the scheduler job may fail if the queue name is not provided properly.

NOTE: “Queue names must be 3-63 characters in length and may contain lower-case alphanumeric characters and dashes.”
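
A quick way to pre-validate the name before creating the job; a sketch (the regex also enforces the additional Azure rules that names start and end with a letter or digit and contain no consecutive dashes):

using System.Text.RegularExpressions;

// 3-63 characters: lower-case letters, digits and dashes; must start and
// end with a letter or digit, and must not contain consecutive dashes.
static bool IsValidQueueName(string name) =>
    Regex.IsMatch(name, @"^(?!.*--)[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$");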

< Back to Introduction

 


Azure Scheduler 101 – Introduction

Purpose: Run jobs on simple or complex recurring schedules

Windows Azure Scheduler allows you to invoke actions – such as calling HTTP/S endpoints or posting a message to a storage queue – on any schedule. With Scheduler, you create jobs in the cloud that reliably call services inside and outside of Windows Azure. You choose whether to run those jobs right away, on a recurring schedule, or at some point in the future.

 

Where can we use Scheduler?

  • SaaS apps can use Scheduler to invoke a web service that performs data management on a set schedule
  • Internal Uses (1st party):

    Process long-running requests – HTTP/S requests will time out when a response is not received within a certain amount of time. For complex requests, such as a series of SQL queries against a large database, posting a message to a storage queue allows you to complete the work without building additional asynchronous logic into your web service.

    • Windows Azure Mobile Services powers its scheduled-scripts feature with the Scheduler. Skype and Xbox Video also use the Scheduler to schedule their tasks.
    • Another Windows Azure service uses the Scheduler to regularly invoke a web service that performs diagnostic log cleanup and data aggregation.
  • Enable a service to be invoked when offline – Typically, a web service needs to be online at the time the Scheduler invokes the endpoint. With messages posted to storage queues, however, the service can be offline when the Scheduler sends the message and field the request when it later comes online.
  • Recurring application actions: As an example, a service may periodically get data from twitter and gather the data into a regular feed.
  • Daily maintenance: Web applications have needs such as daily pruning of logs, performing backups, and other maintenance tasks. An administrator may choose to back up the database at 1 AM every day for the next 9 months, for example.

 

You will have a new section in the Azure portal


 

If not, you can enable it using preview services


 

Portal Access High level view

Job collections: lists the available job collections; a job collection simply groups jobs.


Jobs: this section shows all the jobs, which we can filter by job collection or status.


 

Next, let’s create a job > Next Post


Simplest way of compressing files using .NET 4.5

 

We will try to create a sample project to demonstrate compression using .net 4.5

  1. Create a sample Console project
  2. Right click references > Add References

    Search and add the DLL “System.IO.Compression.FileSystem”

     

  3. Now let’s add a sample folder called “test” and add 2 sample text files to it.

  4. Now select the 2 sample text files and change the property “Copy to output directory” to “Copy if newer”.

    This creates the test folder and copies both test files into it in the Debug folder, which is the working directory for the program.

     

  5. Now you can use the following code in the Main program to compress the “test” folder:

     

    Syntax (C#):

    public static void CreateFromDirectory(
        string sourceDirectoryName,
        string destinationArchiveFileName
    )

    sourceDirectoryName: The path to the directory to be archived, specified as a relative or absolute path. A relative path is interpreted as relative to the current working directory.

    destinationArchiveFileName: The path of the archive to be created, specified as a relative or absolute path. A relative path is interpreted as relative to the current working directory.

     

     

     

    System.IO.Compression.ZipFile.CreateFromDirectory("test", "testzip.zip");
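
    Putting it together, a minimal Main method might look like this (a sketch; the old archive is deleted first so the sample can be re-run):

    using System.IO;
    using System.IO.Compression; // requires a reference to System.IO.Compression.FileSystem

    class Program
    {
        static void Main()
        {
            // CreateFromDirectory throws if the archive already exists,
            // so remove the output of any previous run first.
            if (File.Exists("testzip.zip"))
                File.Delete("testzip.zip");

            ZipFile.CreateFromDirectory("test", "testzip.zip");
        }
    }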

     

    Output in Debug folder:

     

    Applying this in Windows Azure:

    Storing files in the local file system won’t work in an Azure environment; we need to store them in a centralized location where all servers can access them.

    You can upload your files into a blob container, compress them using a worker role, and store the result in another blob container.

     

    Moreover, there are plenty of other libraries we can use for compression. Each has its pros and cons.

     

    Download Code Sample from HERE


Asp.net Web API 102 – Scaffolding

In the previous blog we discussed the basics of Web API and tried to understand the default code. Now we will create our own custom Web API using a case study.

Case study:

We will define our own requirements:

  • We have a Public Product Catalog
  • Any organization or an individual can publish their products here
  • We will add more requirements when we move forward…

Step 1: Create a Model “Product”


(Note: I have used full namespaces so you can better understand where each type comes from.)

I have added some data annotations to the properties.

  • ID will be the Key property.
  • Name property is marked as required.
  • And the maximum length of the Description will be 500 characters.
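
A sketch of the Product model as described above (full namespaces kept, per the note):

public class Product
{
    [System.ComponentModel.DataAnnotations.Key]
    public int ID { get; set; }

    [System.ComponentModel.DataAnnotations.Required]
    public string Name { get; set; }

    [System.ComponentModel.DataAnnotations.MaxLength(500)]
    public string Description { get; set; }
}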

Now let’s use Scaffolding to create the Controller using the Model “Product”.

Step 2: Select Scaffolding for Web API 2 with Entity Framework

Right-click the Controllers folder > Add > select “New Scaffolded Item”


Choose the following option in the Web API category:


Step 3: Populate Controller Details for Scaffolding

  • Give the controller a name.
  • You have the option to create controllers with or without asynchronous controller actions.
  • Select the model class “Product”.
  • Create a new data context to work with the database.


Error 1:

When scaffolding the controller from the model, the following error may appear:

“There was an error generating ‘WebAPI1.Models.DBContext’. Try rebuilding your project.”

Rebuilding the project solves this error.

Error 2:

If the Product class doesn’t have a key property, the following error may appear when scaffolding, for example if the ID property doesn’t have the Key data annotation.


Note: Make sure you rebuild the project after you add the Key property. Also rebuild after any change to the model, before scaffolding.

Now you have created your custom API controller, “ProductController”.

In the next blog we will look at the differences between actions with and without asynchronous behavior.



Asp.net Web API 101 – Basics

We will discuss some points and try to answer “Why Web API?”

  • Earlier we had the following web services:

    • SOAP services – support only XML

    • WCF services – support many data formats; protocols: HTTP, TCP, UDP, and custom transports

      • For a comparison between SOAP & WCF: Here

      • For a comparison between WCF & Web Api: Here

  • Now apps run on PCs, mobiles, tablets, notebooks, electronic devices, etc. Not all of these devices speak SOAP, but they do speak HTTP. When you have more clients, you need to scale. Web API minimizes unnecessary configuration and keeps things simple.
  • It is simply a framework that allows you to build HTTP services. Services that are activated by simple GET requests, or by passing plain XML over POST, and respond with non-SOAP content such as plain XML, a JSON string, or any other content that can be used by the consumer.
  • WCF was initially created to support SOAP + WS-* over a variety of transports, but not all platforms and devices support SOAP, so there was a need for non-SOAP services. WCF 3.5 added WebHttpBinding – a new binding for creating non-SOAP services over HTTP, better known as RESTful services. Although WCF’s REST support improved over time, it came with some pain. So the main goal of Web API is to stop looking at HTTP through the eyes of WCF and just use it as a transport protocol for passing requests. Web API aims to properly use URIs, HTTP headers, and the body to create HTTP services accessible to any device or platform.

To Start a Web Api Project using Visual Studio 2013

  • Create a New ASP.NET Web Application


  • Select Web Api template


The project creates a sample Web API controller named “ValuesController”. It inherits from the ApiController class, which defines properties and methods for API controllers.

Since Web API is an HTTP service, each controller will by default have actions for the HTTP methods:

  • GET
  • POST
  • PUT
  • DELETE

Exercise 1:

http://localhost:16107/api/values

This URL will call the default Method “Get”

    // GET api/values
    public IEnumerable<string> Get()
    {
        return new string[] { "value1", "value2" };
    }

This will return a string array. Example: we can use this to retrieve a product catalog.

Exercise 2:

http://localhost:16107/api/values/001

This URL will call the default Method “Get”, now passing a parameter id.

    // GET api/values/5
    public string Get(int id)
    {
        return "value";
    }

Example: we can use this to retrieve product by passing a product id.

Exercise 3: POST simple Json string using fiddler

We can use Fiddler to send POST requests.




FromBody attribute:

This forces Web API to read a simple type from the request body. However, only one argument of an action can be decorated with this attribute; you get an exception when you try to decorate more than one. Without the FromBody attribute it returns the error “HTTP/1.1 405 Method Not Allowed”. More
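
For reference, the POST action generated by the default template has this shape (a sketch of the ValuesController action):

// POST api/values
public void Post([FromBody] string value)
{
    // Only one parameter per action can be marked [FromBody]
}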

Exercise 4: POST Json Object using fiddler

Post Body = { "ProductName": "Apple", "Price": "22" }


Exercise 5: POST Json Object as a Model using fiddler

Post Body = { "ProductName": "Apple", "Price": "22" }
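
A sketch of what the receiving action might look like; the Product type here is hypothetical, mirroring the posted JSON:

public class Product
{
    public string ProductName { get; set; }
    public string Price { get; set; }
}

// POST api/values
public void Post(Product product)
{
    // Complex types are read from the request body by default,
    // so no [FromBody] attribute is needed here.
}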



Exercise 6: POST Form data using fiddler

Change the content type

  • Content-type: application/x-www-form-urlencoded

Change Request body

  • Name='Apple'&Price='22'


Exercise 7: PUT (update) Json string using fiddler



Exercise 8: DELETE using fiddler

Note: a request body is not applicable for DELETE.



Summary

Now we have a basic idea of Web API: how to start a project, how to understand the sample code created with a new Web API project, and how to call the HTTP methods of an API controller.

In the next blog we will see how we can work with custom methods on an API controller.

References:

http://www.w3schools.com/tags/ref_httpmethods.asp

http://blogs.microsoft.co.il/idof/2012/03/05/wcf-or-aspnet-web-apis-my-two-cents-on-the-subject/


Understanding Windows Azure Tables compared with Relational SQL Tables (on-premise/SQL Azure)

Table of Contents

  • Background
  • A comparison of Relational SQL Table and Azure Storage Table
  • Azure Storage Table Concept
  • Understanding Entity
  • Understanding Properties by example
  • Partitions
  • Key things to remember

Background

Windows Azure Storage is a storage option provided by Windows Azure. Using it brings all the benefits of Windows Azure, such as high availability, scalability, and security. Azure tables were specially designed to store non-relational data that needs some structure but must be highly scalable. All values in Azure tables are stored as key-value pairs, which makes it a highly scalable storage option.

When someone starts to learn about Windows Azure table storage, the first question that comes to mind is usually “Why do we need this when we have sophisticated relational SQL Server tables?”

Therefore, the best way to understand Windows Azure tables is to start from a relational SQL table and learn the differences.

A comparison of Relational SQL Table and Azure Storage Table

| Relational SQL table | Azure Storage table |
| --- | --- |
| Can have relationships between tables | No relationships between tables; follows NoSQL concepts |
| Has rows of values | Rows are considered entities |
| Strictly defined columns that enforce data integrity | No concept of columns; values are stored as key-value pairs |
| Each row has a fixed number of columns | Each entity can have zero or more properties |
| Each value has a data type that corresponds to the column definition | Each value is independent and can have its own data type |
| No built-in system columns | 3 system properties for each entity (PartitionKey, RowKey, Timestamp) |
| Merging data from multiple sources with different formats may require programmatic conversion | Can save data in different formats. Example: Source 1: 01/01/2001; Source 2: “2001” (only the year, as a String) |
| Full ability to query via T-SQL | Limited query options |
| Indexes can be applied to any column | Cannot; indexing is only available built-in for PartitionKey and RowKey |
| Geographically redundant price (as of 2014-Mar-09): $9.99 for up to 1 GB per month | Cheaper: $0.095 per GB per month |
| (when comparing with on-premise SQL only) | More availability |
| SQL Azure max capacity (as of 2014-Mar-09): 150 GB | 200 TB |

Azure Storage Table Concept

In an Azure storage table, a collection of properties (key-value pairs) forms an entity. Entities are meaningfully grouped into partitions using PartitionKeys. Ultimately, the collection of entities grouped into partitions forms the Azure storage table.

Understanding Entity

An entity has 3 system properties:

  1. PartitionKey – the means of partitioning entities for load balancing, with the rule of thumb “data that needs to be queried together must be kept together”.
  2. RowKey – a key that uniquely identifies an entity within a partition. The RowKey together with the PartitionKey makes an entity unique in a given Azure table.
  3. Timestamp – when you first create an entity, the date and time of creation are recorded in this property. If you later change the entity, the value is updated with the last-updated date and time.

These system properties are indexed built-in, and it is recommended to use them as the identifying properties for efficiency. Non-system properties cannot be indexed.
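
In code, the classic storage SDK (WindowsAzure.Storage) exposes these system properties through the TableEntity base class; a sketch with a hypothetical entity:

using Microsoft.WindowsAzure.Storage.Table;

public class PersonEntity : TableEntity // PartitionKey, RowKey and Timestamp are inherited
{
    public PersonEntity() { } // parameterless constructor required for deserialization

    public PersonEntity(string lastName, string firstName)
    {
        PartitionKey = lastName; // groups entities for load balancing
        RowKey = firstName;      // must be unique within the partition
    }

    public string Sport { get; set; } // a custom, non-indexed property
}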

The PartitionKey together with the RowKey forms a unique entity:


Understanding Properties by example

Example 1: No fixed Schema for storing values

In a relational SQL table, every value entered under the column “Age” is bound to the limitations defined in the column definition/design, such as data type, max length, unique key, etc.

Now let’s try to store those values in an Azure storage table. The values are no longer bound to a particular column (there is no concept of columns).

Every value is stored as a key-value pair. Unlike in relational SQL tables, the values can also be of different data types.

This design enables some new scenarios that were not possible before.

For example, if we store date values in a relational SQL table, we have to store them in a standard format. But in an Azure table, we can store the date as a string for some entities if needed.

Example 2:

There is a requirement to create a simple consolidated search engine for products referred from two different ecommerce stores. Store 1 stores product expiry dates using the DateTime data type (1900-01-01 00:00:00). Store 2, on the other hand, stores only the year and month, as strings (“1900/June”). If we did this using relational SQL, we would have to assume a default date for Store 2 (possibly logically incorrect for the product type) and programmatically convert each date, either into a separate column or when querying. But we can push products from both stores into Azure tables without any of this effort (Store 1 values stored as DateTime, Store 2 values stored as String).

Moreover, any particular entity can have zero or more properties.

Example 3

In the above example,

  • The first entity has zero properties.
  • The third entity has 3 properties (“First”, “Last”, “Birthdate”).
  • The second entity has 4 properties (“First”, “Last”, “Birthdate”, “Sport”); the property “Sport” belongs only to the second entity and does not apply to other entities unless they explicitly define a property with the same name.
  • PartitionKey together with RowKey forms a unique entity in an Azure storage table. Although the first and second entities both have the RowKey “001”, they have different PartitionKeys, so they are unique within the table.

Properties support the following data types:

  • Binary
  • Bool
  • DateTime
  • Double
  • GUID
  • Int32
  • Int64
  • String

Partitions

Azure storage tables are partitioned by the system property called “PartitionKey”.

Entities partitioned with the same PartitionKey belong to the same load-balancing group. In other words, everything within the same partition lives on the same server (the same storage machine), so it can be queried/accessed faster.

So, as a rule of thumb, when we design Azure storage tables we need to remember that “data that needs to be queried together must be kept together in the same partition”.

Since we cannot index non-system properties, the key idea of Azure tables is to design them so that we make use of partitioning for load balancing. The art of choosing how to partition entities depends on the requirements, the transactions, and mainly on the kinds of queries we are going to run against the table.

Following are some examples of how developers have used Azure tables and how they have partitioned them for different requirements:

In this part of the table, we can see the entities have been partitioned using their document name.

The different versions of the same document are stored using different RowKeys.

In this case, all the versions of a particular document will be stored together. So, retrieving the versions of the same document will be very fast.
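
A sketch of such a single-partition query with the classic WindowsAzure.Storage SDK (the table name, partition value and connection string are hypothetical):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

var account = CloudStorageAccount.Parse("<storage-connection-string>"); // placeholder
var table = account.CreateCloudTableClient().GetTableReference("documents");

// All versions of one document live in the same partition,
// so this query is served from a single partition server.
var query = new TableQuery<DynamicTableEntity>().Where(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "spec.docx"));

foreach (var version in table.ExecuteQuery(query))
{
    // version.RowKey holds the document version
}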

As you can see, the PartitionKey and RowKey don’t have to look like GUIDs or integers; any string value can serve as a key. In the following example, Name is used as the RowKey.

Following is a good example that shows how independently values are stored for each entity as properties.

Key things to remember

In summary, when we design Azure storage tables with the concept of partitions, the following are some key concerns:

  • First of all, know what you cannot do with Azure storage tables. They are not a replacement for relational SQL tables. Depending on the case, they can stand alone or work together with relational SQL tables to form a system; the latter case is the more common one.
  • Know the transactions; it is important to know which entities are going to be updated together.
  • Know your queries; it is important to know which entities are going to be queried/accessed together.
  • Partition the data based on the concerns above.
  • Sometimes, if we design our tables to serve some key queries, they may not be able to cater to other queries, and we will have to look for workarounds. In relational SQL we have really good guides and best practices; in contrast, there is less direct guidance on designing such NoSQL-based tables. So it’s more of an art!