Read a folder on a local machine using an Azure Function (C#)

Is it possible to read a file located on a local machine path, C:\data, with an Azure Function triggered by an HTTP request?

You can expose a local file system to Azure using the on-premises data gateway.
However, this is supported from Logic Apps but not, as far as I know, from Functions. You could still use a Logic App as a bridge to your Function using the Azure Functions connector.
You are of course free to use your own personal computer however you like, but be aware that the on-premises data gateway exposes machines on your own network directly to the internet, which in a business context is often considered a significant security hazard. Definitely do not do this in a business context without clearing it with IT security personnel first.

I would say no. The resource that you want to read data from needs to be accessible from the web. Put the files in the cloud so that the function can access them.
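For completeness: once the files are in cloud storage, an HTTP-triggered function can read them directly. A minimal sketch, assuming a hypothetical "data" blob container with a "sample.txt" blob and the standard Storage binding extensions installed:

    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class ReadCloudFile
    {
        // HTTP-triggered function that reads a blob instead of C:\data.
        // "data/sample.txt" is a placeholder container/blob path.
        [FunctionName("ReadCloudFile")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
            [Blob("data/sample.txt", FileAccess.Read)] Stream file)
        {
            using (var reader = new StreamReader(file))
            {
                return new OkObjectResult(await reader.ReadToEndAsync());
            }
        }
    }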

Related

Can Microsoft On-Premises Data Gateways be accessed from a non-Power client [duplicate]

I have set up an Azure On-Premise Data Gateway by following these instructions here. I also tested accessing through Logic Apps following these instructions.
However, I don't want to use Logic Apps. Are there any client libraries where I can directly access the gateway instead of only being able to talk through the Logic Apps workflow schema?
All I'm trying to do is stream files from file paths on-prem using C#.
Unfortunately, there is no .NET SDK (or any other SDK) available as of now that would let you access the on-premises data gateway directly.
At present, only the Power BI, Microsoft Flow, Logic Apps, and PowerApps services can use the gateway to securely transfer data between on-premises and the cloud.
There is also a list of the data sources available through the on-premises data gateway; you can find it here.
If you are trying to move an on-premises SQL Server database to Azure SQL Database, then Azure Data Factory (ADF) would be helpful.
So, finally, you would need to continue with Logic Apps or one of the other services that currently support the on-premises data gateway.
The gateway is built on top of Azure Relay, which you can use directly. Hybrid connections are also available in App Service.
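To sketch what using the Relay directly looks like: a minimal Hybrid Connections client using the Microsoft.Azure.Relay package. The namespace, connection name, and SAS key below are placeholders, and a matching listener process must be running on the on-prem machine to serve the file:

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.Azure.Relay;

    class RelayClientDemo
    {
        static async Task Main()
        {
            // Placeholder namespace, hybrid connection name, and SAS credentials.
            var tokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(
                "RootManageSharedAccessKey", "<sas-key>");
            var client = new HybridConnectionClient(
                new Uri("sb://mynamespace.servicebus.windows.net/myhc"), tokenProvider);

            // Opens a bidirectional stream to the on-prem listener, which can
            // read the requested file from local disk and stream it back.
            HybridConnectionStream stream = await client.CreateConnectionAsync();
            using (var reader = new StreamReader(stream))
            {
                Console.WriteLine(await reader.ReadToEndAsync());
            }
        }
    }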

Accessing Azure File Storage from WebJob using SMB protocol

I have a C# console application running as a WebJob in Azure PaaS. Since it is a legacy system that uses a local UNC path to put the generated PDFs, I am exploring ways to do this with Azure Storage. Following this, I have created a storage account, then a file share, and finally a directory inside the file share. I can access the directory from a Windows machine by entering the login credentials, so I know the storage is all set and working.
Now I want to replace the UNC path in my C# code with the UNC(?) path on Azure PaaS, but I am wondering if that would work and, if yes, how I should handle the credentials. Since Microsoft says that File Share supports SMB 3.0, I reckon I should be able to use it just the way I use any on-premises drive. I do not want to use the REST APIs to do the file operations, as defined here and in the video here, because that would involve code changes, which in my case would be a huge exercise. Since the file share supports the SMB protocol, I was expecting to find examples where it is called from a WebJob. Can somebody point me to the right resource or guide me on how I can accomplish this piece of functionality?
Here's your problem -
From the App Service sandbox Wiki -
Restricted Outgoing Ports
Regardless of address, applications cannot connect to anywhere using ports 445, 137, 138, and 139. In other words, even if connecting to a non-private IP address or the address of a virtual network, connections to ports 445, 137, 138, and 139 are not permitted.
That's largely SMB traffic.
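If you want to see the block first-hand, a quick probe run from inside the sandbox (for example via the Kudu console) will fail; the storage account name is a placeholder:

    using System;
    using System.Net.Sockets;

    class PortProbe
    {
        static void Main()
        {
            try
            {
                using (var c = new TcpClient())
                {
                    // Port 445 is SMB; this connect is blocked in the App Service sandbox.
                    c.Connect("mystorageaccount.file.core.windows.net", 445);
                    Console.WriteLine("Connected (won't happen in the sandbox).");
                }
            }
            catch (SocketException ex)
            {
                Console.WriteLine("Blocked: " + ex.Message);
            }
        }
    }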
Your options are limited. I would try to publish on Cloud Services instead (a worker role): still PaaS, but with a vintage feel to it, and no outbound port restrictions.
Service Fabric with the Guest Executable programming model could also be an option, although it's probably a little too involved for a simple console app. Pick Windows nodes for the full .NET Framework.
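On a worker role or VM (where port 445 is open), mounting the share is a one-liner, so the legacy UNC code can stay as-is. A rough sketch, with placeholder account name, key, and share:

    using System.Diagnostics;
    using System.IO;

    class MountAzureShare
    {
        static void Main()
        {
            // Placeholder storage account, key, and share name.
            var account = "mystorageaccount";
            var key = "<storage-account-key>";
            var args = $@"use Z: \\{account}.file.core.windows.net\myshare /u:AZURE\{account} {key}";

            // Mounts the file share over SMB 3.0; works on a worker role or VM,
            // but not inside the App Service sandbox (port 445 is blocked there).
            var p = Process.Start(new ProcessStartInfo("net", args) { UseShellExecute = false });
            p.WaitForExit();

            // Legacy drive-letter/UNC code then works unchanged:
            File.WriteAllText(@"Z:\generated.txt", "written over SMB");
        }
    }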

What components are necessary to have a functional data tier on Azure?

I want to set up a test case of my data tier on Azure. For my scenario this means
putting a SQL Server database into Azure
Storing my custom .dll data access code
Creating a tcp listener to take XML requests, call the custom dll code, and return the resulting XML.
What is a way to accomplish this within Azure's architecture?
My current understanding is that I need to do the following for each step:
Create a VM that hosts an Azure SQL database
Make sure .Net is on the VM and load my .dll
Create a worker role on the VM.
So I'm thinking 1 VM, 1 database, and 1 worker role. I have very little confidence that this covers my needs and I'm not sure what I might be missing.
It shouldn't matter, but our current client is a WPF application.
You have a few options on how to implement this.
For the database, you could either start a VM from the gallery with SQL Server pre-installed on it and restore your database onto it, or, even better, create an Azure SQL Database and create your database on that service. The difference: Azure SQL Database is not a VM; it is a service. Chances are your database will work as-is. But if it doesn't (for some reason), then you can fall back on a VM with SQL Server on it.
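The "as-is" part is mostly about your ADO.NET code: only the connection string changes when you point at Azure SQL Database. A minimal smoke test, with placeholder server, database, and credentials:

    using System.Data.SqlClient;

    class AzureSqlSmokeTest
    {
        static void Main()
        {
            // Placeholder server/database/credentials; note the tcp: prefix and Encrypt=True.
            var cs = "Server=tcp:myserver.database.windows.net,1433;" +
                     "Database=mydb;User ID=myuser@myserver;Password=<password>;Encrypt=True;";
            using (var conn = new SqlConnection(cs))
            {
                conn.Open(); // the rest of your data access code is unchanged
            }
        }
    }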
Regarding your DLL and website, you could spin up a regular Windows VM, and deploy your DLL and website on it; if you are doing a proof of concept, there may not be a need to have a third machine involved. With that said, if your objective is to also learn about cloud services (web roles for example), then yes, you could also deploy a web role separately, which you would need to configure to connect to your DLL through some sort of web service call. You can deploy your website manually by creating a package from Visual Studio, or push directly from within Visual Studio (both would require you to create a new kind of project - a Web Role project - and add your website to it).
If you deploy 2 VMs (a cloud service and a VM for your DLL), then you will also need to configure a Network (it's a specific service within Azure) so that your website can communicate to your service (where your DLL is installed).
Last but not least, you will need to create a storage account, in which your VM disks will be located. This storage account is also another service that is part of the Azure offering. Your disks will be stored as blobs.
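For the TCP listener piece of the question, the worker role's Run() method would essentially wrap a loop like the following sketch; the port is arbitrary, and DataAccess is a hypothetical stand-in for your custom DLL:

    using System.IO;
    using System.Net;
    using System.Net.Sockets;

    static class DataAccess
    {
        // Hypothetical stand-in for the custom data-access DLL from the question.
        public static string Process(string requestXml) => "<response/>";
    }

    class XmlTcpService
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 8080); // arbitrary port
            listener.Start();
            while (true)
            {
                using (var client = listener.AcceptTcpClient())
                using (var reader = new StreamReader(client.GetStream()))
                using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
                {
                    // Framing assumed here for simplicity: one XML document per line.
                    string requestXml = reader.ReadLine();
                    writer.WriteLine(DataAccess.Process(requestXml));
                }
            }
        }
    }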

Solution for a no-server multi-user application/database?

I am at a dead end and I could really use some help.
I intern for a huge company. My project involves creating an application to automate/simplify the work of a retiring employee.
The problem here lies in the strict company policies. I am a developer stuck at the business end of the company. Therefore, IT gives me nothing:
I don't have a server (neither web nor database)
I can't create a server, because no PC will be left running and we can't keep them logged in due to single sign-on with company cards.
I can't install anything on the PCs in the network.
I can access a shared file server that is backed up every day.
The libraries involved have to be free
A central database has to be accessed by a dozen users (at once)
The database will receive new data every day and will grow accordingly
The users will both read and write from/to the database
Preferably C#.NET or WPF solution
The application needs to open files stored on the shared drive. (Only once: the important information will be extracted and stored in the database; the file will then be removed.)
My initial idea was to use Silverlight (which runs standalone) in combination with SQLite. I ran a test, and Silverlight files stored on the shared drive work. (Silverlight is installed on every PC on the network.) This is my preferred front end. However (correct me if I'm wrong), when I tried SQLite-net I needed to add sqlite3.dll to my windows\system32 folder, but on the network PCs I don't have access to the Windows folder, so this cannot be done.
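Edit: I have since read that Windows resolves native DLLs from the application's own directory before falling back to system32, so placing sqlite3.dll next to the .exe on the share might sidestep the permissions problem. A minimal SQLite-net sketch of what I have in mind, with a hypothetical share path:

    using SQLite; // SQLite-net; sqlite3.dll shipped in the same folder as the .exe

    class SharedDbDemo
    {
        class ExtractedRecord
        {
            [PrimaryKey, AutoIncrement]
            public int Id { get; set; }
            public string Payload { get; set; }
        }

        static void Main()
        {
            // Hypothetical path on the backed-up share.
            using (var db = new SQLiteConnection(@"\\fileserver\teamshare\app.db"))
            {
                db.CreateTable<ExtractedRecord>();
                db.Insert(new ExtractedRecord { Payload = "data extracted from a dropped file" });
            }
        }
    }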
I also read that SQLite, or files in general, can become corrupt when accessed by multiple users at once, so maybe locking would be an idea.
Which solutions are there to my problem?
I worked for a company for several years writing software for police departments to manage traffic collision reports. Police stations usually have little-to-no IT support, so we faced many similar limitations. The company actually did pretty well using Microsoft Access databases, with the setup looking something like this:
The shared drive had an Access database file (.mdb or .accdb) which was the actual "database".
Client computers (at the officers' desks) had Access applications with local "utility" tables for temporary storage, UI defined in Forms, and logic defined in Modules. Each of the client machines was connected to the repository on the shared drive by using linked tables. Local client configuration was stored either in a config table in the Access application or in a text file on the machine.
It's not the cleanest solution, but it would allow you to create and maintain a unified solution using files that don't need to be installed and don't require any funny permissions, as long as everyone has read/write access to the shared drive.
Create a website. Today you can host ASP.NET web apps in a standalone .exe. By doing so you can make sure that the shared files are only accessed by one process. You can also limit the access to SQLite.
It also means that you do not have to distribute anything. Simply start your application and tell your users which URL and port they have to browse to.
As for permissions, only the account running your web host requires access to the shared files etc.
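A minimal self-host sketch along these lines, using OWIN/Katana self-hosted Web API (the Microsoft.AspNet.WebApi.OwinSelfHost package); the port is arbitrary:

    using System;
    using System.Web.Http;
    using Microsoft.Owin.Hosting;
    using Owin;

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            var config = new HttpConfiguration();
            config.Routes.MapHttpRoute("default", "api/{controller}");
            app.UseWebApi(config);
        }
    }

    public class StatusController : ApiController
    {
        // GET /api/status, served by the single process that owns the shared files.
        public string Get() => "online";
    }

    class Program
    {
        static void Main()
        {
            // Arbitrary port; listening on '+' may need a one-time URL reservation
            // (netsh http add urlacl), which could matter on locked-down PCs.
            using (WebApp.Start<Startup>("http://+:9000/"))
            {
                Console.WriteLine("Listening on :9000; press Enter to quit.");
                Console.ReadLine();
            }
        }
    }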
You should take a look at ScimoreDB. It's an embedded database that supports multi-process read/write access. If needed it can also act as a client/server database; even as a distributed database with multiple nodes.
It's free to use and deploy. It has support for C++ and .NET. Its only disadvantage is that it works only on Windows.

Obtaining Files From A Local Web Server or Network Share - Which Is Better

Here's the scenario.
I have to access a web service on the local LAN to obtain a list of files which I then must retrieve from the machine running the web service. The question has arisen whether to use a mapped drive or just retrieve the files via HTTP from the web service (or web server if the service is self-hosting).
All machines are running Windows XP or later.
I am leaning towards the web server approach, because it has the fewest unknowns regarding the permissions needed to access the files.
So basically the question is which is the better approach - web server or network share?
I would go the webservice route because it reduces the number of variables in the equation. Based on your current setup you already need a web service in order to get a list of files to download. At this point you know access to the web service isn't a problem so putting the files there removes a lot of unknowns.
If you put the files onto another machine, you run the risk of hitting at least the following problems that do not exist with the web service (since you already know you have access):
Permission issues
Firewall issues
I would think it depends on various factors you haven't mentioned: will lots of clients be trying to access these files at a given time? Will the app be distributed across multiple servers in the future? Might you need to implement a caching system in the future?
If the answer is no to all of these, then you should probably pick what's easiest.
I would lean towards plain old HTTP. Doing it via the web service would probably involve marshalling the file as an array, for example, which makes it larger. A file share means needing to worry about permissions.
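For comparison, the plain-HTTP client side stays tiny; the URLs below are placeholders:

    using System.Net;

    class FileFetcher
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                // The web service supplies the list; plain HTTP GETs fetch the bytes.
                var list = client.DownloadString("http://server/files/list");
                foreach (var name in list.Split('\n'))
                {
                    var trimmed = name.Trim();
                    if (trimmed.Length > 0)
                        client.DownloadFile("http://server/files/" + trimmed, trimmed);
                }
            }
        }
    }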
