Is there a code to get all the image editing software the user has installed?
I would like to know how to list all installed applications by going through the registry. Is there any way to then filter out only the apps that can edit images, like Paint, Photoshop, etc.?
Thanks!
Here is an example for any generic file type:
How to get recommended programs associated with file extension in C#
If you look for jpg, png, etc. you'll get image editors.
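For the image case specifically, a minimal sketch of that registry lookup (assuming .NET with the Microsoft.Win32 namespace; the exact subkeys vary a little between Windows versions, so treat this as best-effort):

    using System;
    using Microsoft.Win32;

    class ImageEditorLookup
    {
        static void Main()
        {
            // Programs registered for an extension typically appear under
            // HKEY_CLASSES_ROOT\.<ext>\OpenWithProgids and ...\OpenWithList.
            foreach (var ext in new[] { ".jpg", ".png", ".bmp" })
            {
                Console.WriteLine($"Applications registered for {ext}:");

                using (var progids = Registry.ClassesRoot.OpenSubKey(ext + @"\OpenWithProgids"))
                {
                    if (progids != null)
                        foreach (var progId in progids.GetValueNames())
                            Console.WriteLine("  " + progId);
                }

                using (var openWith = Registry.ClassesRoot.OpenSubKey(ext + @"\OpenWithList"))
                {
                    if (openWith != null)
                        foreach (var exe in openWith.GetSubKeyNames())
                            Console.WriteLine("  " + exe);
                }
            }
        }
    }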
No.
There is no expectation that every application developer places metadata tags for their application in the registry (or anywhere else)... and, even if for some reason a handful of application developers did, there is no way to guarantee consistency. Not to mention that application developers don't always use the common words you'd expect for their applications... Not every image editing application has the word "Photo" in it (Picasa from Google, for example).
The best you can hope for is to build a set of keywords to look for, add a list of well-known applications that don't conform to the keyword conventions you're expecting ("Paint", "Photo", "Image", etc.), and work with that... either that or create a large database yourself to check against. Also, as other answers/comments have indicated, looking at which applications are registered for specific extensions is helpful.
Nothing guaranteed though.
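A hedged sketch of the keyword approach described above, enumerating the standard Uninstall registry keys and filtering DisplayName values (not every installer writes DisplayName, and the keyword/whitelist values here are just examples):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.Win32;

    class InstalledImageEditorGuess
    {
        // Keywords plus a hand-maintained list of well-known editors
        // whose names contain none of the keywords.
        static readonly string[] Keywords = { "paint", "photo", "image", "draw" };
        static readonly string[] KnownEditors = { "picasa", "gimp", "inkscape" };

        static void Main()
        {
            var names = new List<string>();
            foreach (var root in new[] { Registry.LocalMachine, Registry.CurrentUser })
            {
                using (var uninstall = root.OpenSubKey(
                    @"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"))
                {
                    if (uninstall == null) continue;
                    foreach (var subKeyName in uninstall.GetSubKeyNames())
                    {
                        using (var app = uninstall.OpenSubKey(subKeyName))
                        {
                            if (app?.GetValue("DisplayName") is string name)
                                names.Add(name);
                        }
                    }
                }
            }

            var candidates = names.Distinct().Where(n =>
                Keywords.Concat(KnownEditors).Any(k => n.ToLowerInvariant().Contains(k)));

            foreach (var candidate in candidates)
                Console.WriteLine(candidate);
        }
    }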
I'm looking for the best way to achieve the following workflow.
Ask a series of questions
Capture the responses
Use the captured answers to either hide or show certain parts of a Word document.
Save the complete word document to a location (TBD).
I'm not a developer, so will need to source one who could pick this up, but before I do I wanted to know the best approach to this workflow.
Appreciate any feedback you can offer.
Cheers
There are several libraries for interacting with Word (.docx) documents in C#, such as NPOI and DocX, and it is not conceptually complicated to programmatically populate a document based on user input and some decision tree, then save it somewhere locally or expose it for download via a web interface. But keep in mind that's only part of the solution -- apps have to be hosted, secured, monitored, etc., and that's where the "hard" part is likely to be.
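For illustration, a minimal sketch of the populate-and-save step using the DocX library (the older Novacode namespace; newer Xceed releases use Xceed.Words.NET, and the section names and answers here are hypothetical):

    using System.Collections.Generic;
    using Novacode; // DocX library

    class QuestionnaireToWord
    {
        static void Main()
        {
            // Hypothetical answers captured from the questionnaire.
            var answers = new Dictionary<string, bool>
            {
                { "IncludeWarrantySection", true },
                { "IncludeExportClause", false }
            };

            using (DocX doc = DocX.Create(@"C:\temp\output.docx"))
            {
                doc.InsertParagraph("Agreement").FontSize(18).Bold();

                // Show or hide parts of the document based on the answers.
                if (answers["IncludeWarrantySection"])
                    doc.InsertParagraph("Warranty: standard warranty wording goes here.");

                if (answers["IncludeExportClause"])
                    doc.InsertParagraph("Export clause: export wording goes here.");

                doc.Save(); // save to whatever location is eventually decided
            }
        }
    }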
If you are looking to accomplish this within an enterprise environment that uses Microsoft Office 365, you may not need a developer at all. Microsoft Flow / Microsoft Power Automate allows you to produce complex workflows such as the one you described. There's a very similar one listed here:
https://flow.microsoft.com/en-us/galleries/public/templates/3c651e28cded46aab2ba40a2c3116f30/create-word-and-pdf-documents-from-microsoft-forms/
I am looking for a solution (APIs, etc.) for handling a similar experience as cafepress.com. I need to be able to upload images (preferably multiple at a time) and be able to map my uploaded images to various product images (clean stock images of shirts, mugs, etc.). I also want to give the user some very basic controls over the images they upload such as cropping, resizing, levels, etc. Any suggested libraries or APIs would be greatly appreciated. I am looking for .NET solutions (if server-side). I am not looking for how to tie this all together but rather some suggested libraries or tools to build out some of this functionality.
Note: If this is not the place for this type of question, please move accordingly or suggest an alternate site.
Frankly, I don't think you are going to find exactly what you are looking for unless you use a full-on CMS of some form; those problem domains are too far apart. Instead, you should probably look at them as individual pieces.
As far as upload controls go, there are probably close to 100, some free, some not so free. Personally, I already have a Telerik subscription, so it was a no-brainer for me, but Rad Upload works well and supports multiple uploads. Free implementations are available.
The cropping and post-processing tasks (at least the ones you have listed) can easily be handled using standard System.Drawing.* calls, or, if it gets more advanced, a ton of free libraries exist, like the age-old ImageMagick, and there are a number of commercial libraries available as well. Chances are, though, the built-in libraries will be more than sufficient.
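For example, the resize and crop operations can be done with nothing but the built-in calls (a minimal sketch using System.Drawing):

    using System.Drawing;
    using System.Drawing.Drawing2D;

    static class ImageOps
    {
        // Resize an image to the given dimensions.
        public static Bitmap Resize(Image source, int width, int height)
        {
            var result = new Bitmap(width, height);
            using (var g = Graphics.FromImage(result))
            {
                g.InterpolationMode = InterpolationMode.HighQualityBicubic;
                g.DrawImage(source, 0, 0, width, height);
            }
            return result;
        }

        // Crop an image to the given rectangle.
        public static Bitmap Crop(Bitmap source, Rectangle area)
        {
            return source.Clone(area, source.PixelFormat);
        }
    }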
Finally, the mapping to products should be handled by your business layer (i.e., code you write), as it is going to be specific to your app.
However, if you are looking for a storefront or CMS with a multiple-image upload control, that is a very different question, with many options as well, both free and commercial.
Use OpenCV. You can find it here: http://opencv.willowgarage.com/wiki/
You can use this library directly from C++.
If you need to use it in C#, use this wrapper: www.emgu.com/wiki/index.php/Main_Page
I was wondering if there is another way to spell check a Windows app instead of what I've been using: "Microsoft.Office.Interop.Word". I can't buy a spell checking add-on. I also cannot use open source, and I would like the spell check to be dynamic. Any suggestions?
EDIT:
I have seen several similar questions; the problem is they all suggest using open source applications (which I would love) or Microsoft Word.
I am currently using Word to spell check, and it slows my current application down and causes several glitches in my application. Word is not a clean solution, so I'm really wanting to find some other way. Is my only other option to recreate my app as a WPF app so I can take advantage of the SpellCheck class?
If I were you I would download the data from the English Wiktionary and parse it to obtain a list of all English words (for instance). Then you could rather easily write at least a primitive spell-checker yourself. In fact, I use a parsed version of the English Wiktionary in my own mathematical application AlgoSim. If you'd like, I could send you the data file.
Update
I have now published a parsed word list at english.zip (942 kB, 383735 entries, zip). The data originates from the English Wiktionary, and as such, is licensed under the Creative Commons Attribution/Share-Alike License.
To obtain a list like this, you can either download all articles on Wiktionary as a huge XML file containing all the Wiki- and HTML-formatted articles, which is then more or less trivial to parse, or you can run a bot on the site. I got help obtaining a parsed file from a user at Wiktionary (I seem to have forgotten his name, though...), and this file (english.txt in english.zip) is a further processed version of the file I got.
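Once you have a word list like this, a primitive checker is little more than a HashSet lookup; a minimal sketch, assuming english.txt contains one word per line:

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;

    class SimpleSpellChecker
    {
        private readonly HashSet<string> _words;

        public SimpleSpellChecker(string wordListPath)
        {
            // Load the parsed word list (one word per line) into a set.
            _words = new HashSet<string>(
                File.ReadLines(wordListPath).Select(w => w.Trim()),
                StringComparer.OrdinalIgnoreCase);
        }

        public bool IsKnown(string word) => _words.Contains(word);

        // Returns the words in a text that are not in the dictionary.
        public IEnumerable<string> FindMisspellings(string text)
        {
            var separators = new[] { ' ', '\t', '\r', '\n', '.', ',', ';', ':', '!', '?' };
            return text.Split(separators, StringSplitOptions.RemoveEmptyEntries)
                       .Where(token => !IsKnown(token));
        }
    }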
http://msdn.microsoft.com/en-us/library/system.windows.controls.spellcheck.aspx
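That is the WPF SpellCheck class; enabling it on a TextBox is only a couple of lines (a minimal sketch, assuming a WPF app):

    using System.Windows.Controls;
    using System.Windows.Markup;

    static class SpellCheckSetup
    {
        // Turn on the built-in WPF spell checker for a TextBox.
        public static void Enable(TextBox textBox)
        {
            textBox.Language = XmlLanguage.GetLanguage("en-US"); // dictionary to use
            textBox.SpellCheck.IsEnabled = true;                 // squiggles + suggestions
        }
    }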
I use Aspell-win32; it's old, but it's open source and works as well as or better than the Word spell check. I came here looking for a built-in solution.
I have thought of three approaches to creating and maintaining resources in .NET projects for WinForms using Visual Studio 2008. (I am sure there are more than three ways.) I need to decide on one before starting to implement internationalization for our product.
Have individual sets of resource files (resx) for each Windows form or piece of UI (a custom control) in each .NET project. These are auto-generated by Visual Studio when the Localizable property is set to true in the form or control properties.
Have one resource file per .NET project. This is added manually and updated manually with the resource strings and messages.
Have one resource manager project that has resources for all the components of a set of .NET projects.
Personally, I do not like the first approach, as it creates numerous resource files. The only advantage we get from this approach is that we do not need to set the text of UI elements manually.
I like the second and third approaches, as they are easy to maintain and there is only one set of resources to handle, so there is no duplication of strings and messages. It is easier for the translators as well.
What are your thoughts? Please share.
I have tended to use VS to create the project and provide the default set of resources, but then maintain any additional resources outside of Visual Studio via the SDK tools winres.exe, resgen.exe and al.exe.
You can maintain the resources in a fairly simple folder structure of one folder per culture and just have a batch file or two to build the resources into satellite assemblies. This gives you the advantage of keeping the VS solution to the core product and all localisation can be done after the fact.
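However the satellite assemblies are built, the application consumes them through the normal ResourceManager probing, so the core product code does not change; a minimal sketch (the base name "MyApp.Strings" is hypothetical):

    using System.Globalization;
    using System.Reflection;
    using System.Resources;

    static class Strings
    {
        // Satellite assemblies built with resgen.exe/al.exe sit in culture
        // subfolders next to the exe (e.g. .\fr\MyApp.resources.dll).
        static readonly ResourceManager Rm =
            new ResourceManager("MyApp.Strings", Assembly.GetExecutingAssembly());

        public static string Get(string key)
        {
            // Falls back to the neutral resources when the current UI culture
            // has no satellite assembly.
            return Rm.GetString(key, CultureInfo.CurrentUICulture);
        }
    }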
If internationalizing your app means more than translating pieces of text, you should go with the first one. You can create satellite assemblies to deploy different cultures. This way you are not localizing just text but also images, control layout, etc. This is the way Microsoft recommends doing it, and they have good reasons for taking this approach.
I've found that the simplest approach to internationalization is to maintain a list of all the different pieces of text in your application (labels, buttons, form captions, etc.) in a spreadsheet or tab-delimited file of some sort, and then send this file to the translators, who add all the translations in one additional column per language.
You then call a simple method in the Load event of each form (which are all maintained in English) that iterates through all the controls on the form recursively and changes their Text properties to the translated values for whichever language you're translating the app into. The language can either be determined programmatically (I forget where in the .NET namespace this is indicated), or you can show a simple language selection dialog when the application first starts (the advantage of the second method is that your app can be translated into whatever language the user wishes, without having to set the language for all of Windows; this is especially useful for kiosk applications).
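A hedged sketch of that recursive pass, assuming the spreadsheet column for the chosen language has already been loaded into a dictionary keyed by the English text (CultureInfo.CurrentUICulture is one place .NET reports the UI language, if you go the programmatic route):

    using System.Collections.Generic;
    using System.Windows.Forms;

    static class Translator
    {
        // englishToTranslated: one column of the translators' spreadsheet,
        // keyed by the original English text.
        public static void Translate(Control root, IDictionary<string, string> englishToTranslated)
        {
            if (englishToTranslated.TryGetValue(root.Text, out var translated))
                root.Text = translated;

            foreach (Control child in root.Controls)
                Translate(child, englishToTranslated);
        }
    }

    // Usage, e.g. in each form's Load event:
    //   Translator.Translate(this, translations);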
In my opinion, creating and maintaining all the different internationalized versions of every form is a major pain, although it is useful when the translated text values are significantly different in size from the English versions.
Personally, I prefer the first approach because the context (in your case, the forms) is very important for a translator to do his job perfectly. In your second and third approach, the context is gone because it is just a list of strings.
Yes, the first approach can be a pain to maintain, but at least your application will be translated correctly.
Personally, I love to add a single .Resources project.
Next I enable Microsoft MAT ( https://developer.microsoft.com/en-us/windows/develop/multilingual-app-toolkit ) and manage all my translations via MAT.
This way you can also recycle translations from other solutions, which saves you time ;-)
Resource files seem great for localization of labels and messages, but are they perfect?
For example:
Is there a better solution if there is a huge amount of resources? Like 100,000 strings in a .resx file? (Theoretically, I do not actually have this problem)
Is this a good method for storing the other types of data, such as images, icons, audio files, regular files, etc.?
Is it a best practice to store your .resx files in a stand-alone project for easier updates/compiling?
Are there any other issues that you have run into when using .resx files?
1. Is there a better solution if there is a huge amount of resources? Like 100,000 strings in a .resx file? (Theoretically, I do not actually have this problem)
I've used Alfresco as an alternative content repository on Java projects. RESX files, from a maintenance standpoint (because of encoding issues, I guess), can really stink.
2. Is this a good method for storing the other types of data, such as images, icons, audio files, regular files, etc.?
I've seen it work with images... but that's it (not sure about other media/files).
3. Is it a best practice to store your .resx files in a stand-alone project for easier updates/compiling?
I don't, but you can edit a resx file on a live site and the edit will go through, I believe. Certainly that's the way it works in development (except for the global resx, I think).
4. Are there any other issues that you have run into when using .resx files?
Besides being really annoying to maintain, and the fact that Visual Studio doesn't provide the neatest tools for working with them... no.
I recently used a .resx file with 5 million strings (of normal length, like this sentence), compiled into DLLs of about 1 GB in size. It still works fine in an Azure web project.
The load time is hard to pin down, maybe a few seconds or so; since it can warm up in stages, I never noticed it.
We have been using resource files on a relatively large .NET Windows Forms application (over 500 various forms, approximately 20 resource strings per form) and we've had no performance issues regarding resources from .resx files.
We have used Babylon.NET as a tool for managing translations (has a free version just for translators).
You did not specify whether your project will be a web or desktop application. One feature that resource files offer for desktop applications is the ability to also localize control positions and sizes, which IMHO is not possible using other tools (unless you are using something like the DevExpress layout control, which has automatic sizing).
I have never seen any problems with resx resources; they are cached perfectly. We have used them in WinForms, ASP.NET MVC, WPF, etc.
One thing you should do is use the Microsoft MAT (Multilingual App Toolkit) extension for Visual Studio.
You can control your translations, export them to send to translators (excluding locked translations, for example), import them again, verify them or comment on them, and recycle existing translations (saving you a lot of time!), and it works with the industry-standard XLIFF (.xlf) format!
If you sign up for the Azure API, you can even translate resources automatically (you get a few thousand words of free monthly credit on Azure).
See: https://multilingualapptoolkit.uservoice.com/knowledgebase/articles/1167898-microsoft-translator-moves-to-the-azure-portal
You can even see how much work has already been done in a project.
Oh and it comes with a handy editor which your translators can also use!
To get started:
install the MAT Visual Studio extension
Go to your project in Visual Studio
Click Properties --> open AssemblyInfo.cs
Add this attribute: [assembly: System.Resources.NeutralResourcesLanguage("en")]
Select your project in Solution Explorer and go in Visual Studio to [Tools] --> [Multilingual App Toolkit] --> [Enable selection]
This will add a new folder "MultilingualResources" to your project
Right-click your project --> [Multilingual App Toolkit] --> [Add translation languages…] --> select the language you want to translate into (e.g. Dutch).
In the "MultilingualResources" folder you will see a new file "....nl.xlf", double click it, it will open with the Multilingual Editor. (if not right mouse click and change the default "Open With" to the multilingual editor)
Now you only add strings to your default Resources.resx file (the language should be the same as the "NeutralResourcesLanguage" you added in AssemblyInfo.cs).
For the translations you DON'T add strings to the ...nl.resx files; you work with the .xlf files located in the MultilingualResources folder.
(after you have done lots of translations, a rebuild might be needed so that the translated .xlf files update the translated .resx files)
Where to get it:
feedback: https://multilingualapptoolkit.uservoice.com/
visual studio extension: https://marketplace.visualstudio.com/items?itemName=MultilingualAppToolkit.MultilingualAppToolkit-18308
knowledge base: https://multilingualapptoolkit.uservoice.com/knowledgebase
github of Cameron (Microsoft) who manages this project: https://github.com/TheMATDude
I have had two problems with resource files, both about performance of the translators (people), rather than the speed of string lookup.
The sales staff at the overseas office that did the translations could not cope with editing XML or learning any new tool.
So they just used Excel to edit the translations. Therefore, we might as well have stored the translated strings as a CSV file, avoiding having to copy the translated strings into the resource files.
A new build needs to be done in order to see the effect of any translations.
Once again if the translated strings were stored as a CSV file, we could have cached them in the ASP.NET cache. Then any changes to the translations would show up on the next page load.
So we could have used a custom implementation of the resource provider and kept to the standard ASP.NET resource lookup system, or just ignored the standard resource lookup system if it does not help in your case – it depends on how your pages are written.
You may find at some point that you wish to be able to override strings for a single customer, if so you will need a multi-stage lookup system. Otherwise, you have to merge the customer’s custom strings with the translated string each time you ship a new version of the system.
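For what it's worth, a minimal sketch of the CSV-plus-cache idea described above (classic ASP.NET; the file path and the simple "key,translation" format are hypothetical). The CacheDependency is what makes an edited CSV show up on the next page load without a rebuild:

    using System.Collections.Generic;
    using System.IO;
    using System.Web;
    using System.Web.Caching;

    static class CsvTranslations
    {
        // Loads "key,translation" pairs and caches them with a dependency on
        // the file, so translators' edits appear on the next page load.
        public static IDictionary<string, string> Get(string virtualPath)
        {
            var cached = HttpRuntime.Cache[virtualPath] as IDictionary<string, string>;
            if (cached != null) return cached;

            string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
            var map = new Dictionary<string, string>();
            foreach (var line in File.ReadAllLines(physicalPath))
            {
                var parts = line.Split(new[] { ',' }, 2);
                if (parts.Length == 2) map[parts[0]] = parts[1];
            }

            HttpRuntime.Cache.Insert(virtualPath, map, new CacheDependency(physicalPath));
            return map;
        }
    }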
For point #4:
I have been using .resx files for all strings on our site that must be localized into many languages and haven't had any major issues with them.
The one thing that you need to think about is if you want this text to be searchable. For some of the sites I work on there are some localized resources that need to be searchable so I must keep them in the database. However, when I have the choice I prefer the .resx file for similar reasons mentioned above.
I will simply add that you should look for a custom implementation (or do your own) of the resource provider (a provider model like the membership provider) to store your resources in a database. That's what we did for our CMS, and it's very useful.
When we first looked for an example back then we found Creating a Data Driven ASP.NET Localization Resource Provider and Editor.
Here is my take on resource files:
I would assume that if there is a LARGE number of strings, using a database might be the best method, as it allows for searching and sorting of the data. It would probably not be too difficult to account for multiple languages in a resource table, and the speed should be fast.
I would think that this is a good method for storing static resources, or things that might be changed by a client. As for dynamic resources, it might be better to use a database, either alone or in conjunction with the file system. I think in the newer versions of SQL Server there is a type that is a hybrid of using a database and the file system.
I read in another question (I don't know which) that keeping resource files in an external project is a good practice, because you don't have to recompile the entire project when resources change; you just recompile the resource project. This would also allow (fairly) easy edits to be made by clients, who would only need the "source code" for the resource project, and not your other real code (API code, etc.).
I have not used resource files enough to make any claims about their reliability, extensibility, or any potential issues that you might have when working with them.
I've been using resource files in a .NET Razor Pages app after dumping our previous proxy server, which used a custom regular-expression language to replace strings as they passed through the proxy.
We dumped the proxy method as it was more suited to large strings (paragraphs) and pretty awkward for all the dynamic fragments and stuff we had.
We've had no problems at all, and it's faster so far than the proxy server. I store all the target pages, comments, names, the English (en) text, and all other available languages in a DB; it's trivial to add a new column for a new language.
We have about 5k entries in multiple resx files so far.
I then use a builder process to create all the resx files and place them in the correct local and global folders any time something is updated.
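For reference, a builder like that can emit the .resx files with ResXResourceWriter (a minimal sketch; how the entries come out of the DB is up to you):

    using System.Collections.Generic;
    using System.Resources; // ResXResourceWriter lives in System.Windows.Forms.dll

    static class ResxBuilder
    {
        // Writes one .resx file from key/value rows pulled out of the DB.
        public static void Write(string path, IEnumerable<KeyValuePair<string, string>> entries)
        {
            using (var writer = new ResXResourceWriter(path))
            {
                foreach (var entry in entries)
                    writer.AddResource(entry.Key, entry.Value);
                writer.Generate(); // also happens implicitly on Dispose
            }
        }
    }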
It's dead easy to build a simple interface for translators to search for pages, languages, comments, names, etc. and update them. We chose not to auto-rebuild the resx files on a change, but you could if you trust your translators ;)
We also allow translators to add new fragments/text to translate, but as yet we've not had any bright ideas on how to include them automatically, so we have to manually substitute the string in the source file and recompile.
For editing resx files I've used Zeta Resource Editor:
https://www.zeta-resource-editor.com/index.html
It can open all your languages in one go and highlights differences in placeholders as well as missing translations. You can edit all the languages in one row and save all the files in one go. We don't use it now, as everything is in the DB, but I recommend it.